Digital Content

Information that is published or distributed in a digital form, including text, data, sound recordings, photographs and images, motion pictures, and software.

Anyone can track you with $1,000 of online ads

For around $1,000, anyone can buy online ads that could allow them to track which apps you use, where you spend money, and your location, new research suggests. Privacy concerns have long swirled around how much information online advertising networks collect about people’s browsing, buying, and social media habits—typically to sell you something. But could someone use mobile advertising to learn where you go for coffee? Could a burglar establish a sham company and send ads to your phone to learn when you leave the house? Could a suspicious employer see if you’re using shopping apps on work time? The answer is yes, at least in theory.

Rep Pallone: Google, Facebook, Twitter Content Treatment Not 'Neutral'

House Commerce Committee Ranking Member Frank Pallone (D-NJ) says top edge providers' online content management policies are not "neutral," a charge that comes as the network neutrality debate continues to rage. “With a goal of ad clicks or driving page views, these companies’ policies are not neutral; they actively shape content on the web," said Ranking Member Pallone. That came in a request for a meeting with representatives of those edge giants about how they police content on their sites, as social media's role in fake news and Russian election meddling grows as a focus of Hill attention. It is also a response to reports of vague, confusing, and inconsistently applied content guidelines.

ISPs have usually been the target of net neutrality criticism, but edge providers are increasingly at least at the edges of the conversation on Capitol Hill. Pallone, along with the committee Democrats he leads, is clearly trying to include tech companies in conversations about their role in net neutrality and the First Amendment going forward.

Social media site favored by 'alt-right' drops Google lawsuit

Gab, the social media site favored by some on the far right as a “free speech” alternative to Twitter and Facebook, plans to drop its lawsuit against Google for banning its app from the Google Play Store. Gab said that it had been in “productive back-channel” discussions with Google since it filed the lawsuit against the internet search giant in September.

“We were encouraged to resubmit our app before the Android store, as opposed to going forth with continued expensive litigation, of which would have cost the company a great fortune in both time and resources,” the company said. “Google has instead offered Gab an opportunity to resubmit our application for an appeal to be reviewed for placement on their Google Play Store, which we are in the process of doing as we speak.”

Facebook moving non-promoted posts out of news feed in trial

Facebook is testing a major change that would shift non-promoted posts out of its news feed, a move that could be catastrophic for publishers relying on the social network for their audience. A new system being trialled in six countries including Slovakia, Serbia and Sri Lanka sees almost all non-promoted posts shifted over to a secondary feed, leaving the main feed focused entirely on original content from friends, and advertisements.

The change has seen users’ engagement with Facebook pages drop precipitously, by 60% to 80%. If replicated more broadly, such a change would destroy many smaller publishers, as well as larger ones with an outsized reliance on social media referrals for visitors.

Tightening Political Ad Disclosure Rules May Not Curb 'Fake News,' Interactive Advertising Bureau Says

The Interactive Advertising Bureau will testify that it supports efforts to strengthen disclosure requirements for online ads that expressly advocate for particular candidates. But the group will also warn lawmakers that tightening those rules won't necessarily affect the spread of "fake news" online. "Enhancing the existing framework by clarifying the responsibility of publishers, platforms, and advertisers in making available these disclosures to the public would create greater legal certainty across the industry and provide valuable information," IAB CEO and President Randall Rothenberg plans to tell Congress in a prepared statement. "But the 'fake news' and 'fake ads' at the center of the current storm did not engage in such overt candidate support. So they were not, and based on current Supreme Court jurisprudence will not, be regulated under the Federal Election Campaign Act."

Rothenberg will testify Oct. 24 before the House Oversight subcommittee on information technology, which is slated to hold a hearing about online political ads. David Chavern, CEO of the News Media Alliance, will also testify Tuesday, as will representatives from the Center for Competitive Politics and the Brennan Center for Justice, among others.

Can Alphabet’s Jigsaw Solve Google’s Most Vexing Problems?

With Alphabet’s engineering resources, Jigsaw translates its research into internet tools that combat hate speech, detect fake news, and defend against cyberattacks. Jigsaw CEO Jared Cohen’s eight-day visit to Pakistan in December provided firsthand insights into the methods extremists now use to recruit new members online, which Jigsaw aims to counter by using targeted advertising against terrorist propaganda. Although Cohen’s mission sounds philanthropic, Jigsaw operates as a business, no different from any of Alphabet’s moonshots. Yet Cohen says there’s no pressure on the group to generate a profit. For now, its value to the enterprise is the ancillary benefit of protecting Google’s myriad other businesses—Android, Gmail, YouTube—from the world’s worst digital threats. And if, in the process, Jigsaw can help address some of the most acute unintended consequences of digital communication, all the better.

“I don’t think it’s fair to ask the government to solve all these problems—they don’t have the resources,” says Alphabet executive chairman Eric Schmidt. “The tech industry has a responsibility to get this right.”

Homegrown ‘fake news’ is a bigger problem than Russian propaganda. Here’s a way to make falsehoods more costly for politicians.

[Commentary] State-sponsored propaganda like the recently unmasked @TEN_GOP Twitter account is of very real concern for our democracy. But we should not allow the debate over Russian interference to crowd out concerns about homegrown misinformation, which was vastly more prevalent during and after the 2016 election. The problem isn’t that we’re only willing to listen to sources that share our political viewpoint; it’s that we’re too vulnerable as human beings to misinformation of all sorts. Given the limitations of human knowledge and judgment, it is not clear how to best protect people from believing false claims.

Brendan Nyhan is a professor of government at Dartmouth College.

Yusaku Horiuchi is a professor of government at Dartmouth College.

How Facebook’s Master Algorithm Powers the Social Network

[Commentary] Artificial intelligence permeates everything at Facebook, the social network’s head of applied machine learning says—and humans are bound to understand Facebook less than ever. The algorithm behind Facebook’s News Feed, a “modular layered cake,” extracts meaning from every post and photo.

DC Court Allows Live Streaming

In a first for the US Court of Appeals for the DC Circuit, oral argument in a major abortion case, Garza v. Hargan, will be live streamed Oct. 20 after Fix the Court, which advocates for greater access to federal courts, made the request. Chief Judge Merrick Garland issued the decision in a one-sentence letter to Fix the Court executive director Gabe Roth, saying simply: "Thank you for your letter of today's date, requesting that the court provide a live audio feed of arguments in Garza v. Hargan, 17-5236, tomorrow."

How Fiction Becomes Fact on Social Media

At a time when political misinformation is in ready supply, and in demand, “Facebook, Google, and Twitter function as a distribution mechanism, a platform for circulating false information and helping find receptive audiences,” said Brendan Nyhan, a professor of government at Dartmouth College. For starters, said Colleen Seifert, a professor of psychology at the University of Michigan, “People have a benevolent view of Facebook, for instance, as a curator, but in fact it does have a motive of its own. What it’s actually doing is keeping your eyes on the site. It’s curating news and information that will keep you watching.” That kind of curating acts as a fertile host for falsehoods by simultaneously engaging two predigital social-science standbys: the urban myth as “meme,” or viral idea; and individual biases, the automatic, subconscious presumptions that color belief.

Stopping to drill down and determine the true source of a foul-smelling story can be tricky, even for the motivated skeptic, and mentally it’s hard work. Ideological leanings and viewing choices are conscious, downstream factors that come into play only after automatic cognitive biases have already had their way, abetted by the algorithms and social nature of digital interactions.