
Adobe Opens Wired, New Yorker Tablet Platform to Publishers with Analytics


Back when I was editor of PC World, creating a magazine that newsstand buyers and subscribers loved was one of the great pleasures of my job. It was also something of a dark art. There were plenty of signs we'd succeeded: issues flying off the stands, renewal rates staying healthy, high scores in the pricey reader surveys we conducted. But in the end, connecting the dots of reader satisfaction was difficult, and agonizingly slow.

(The closest I got to instant gratification, incidentally, was when I traveled on an airplane and happened to sit next to someone who was reading PC World. Rather than introducing myself, I'd peek out of one corner of my eye and see which articles my neighbor lingered on, and which ones he or she skimmed right past.)

On the Web, things are different. Analytics services such as Omniture let editors and other media types see what's getting read, what's getting ignored, and how consumers navigate through everything a site has to offer. They let you make decisions in real time, rather than waiting for months.

At Adobe's MAX conference in Los Angeles this week, the publishing software company is announcing its Digital Publishing Suite, the fully commercialized version of the system that Wired, The New Yorker, and other publications have been using to create iPad versions of their magazines. It's rolling out in pre-release form for publishers who'd like to try it out; the final version is due in the second quarter of 2011.

The Digital Publishing Suite aims to be a comprehensive solution for turning traditional magazines created in InDesign into digital publications that can be distributed to devices of all sorts. But the one aspect that intrigues me most is this: it includes Omniture analytics for digital magazines. (Adobe acquired Omniture a year ago for $1.8 billion.)

If you use the suite to produce tablet versions of your magazine, you can use the analytics service to get a bevy of information about how they're being read, all aggregated and anonymized to avoid privacy issues. You can see whether readers are opening the issues they've bought. You know which stories they're jumping to, and whether they tap through every page of an article or abandon it after the first one. You can confirm whether they're watching ambitious multimedia elements such as embedded video. And you can tell whether they're reading front-to-back, back-to-front, or hopping around randomly.

It sounds like a goldmine of useful information that publishers could use to make publications that serve their readers better. And much of it might help with a magazine's traditional, dead-tree version, too: I suspect there'd be a correlation between covers that prompt tablet subscribers to open an issue quickly and ones that are newsstand winners.

Analytics can't tell you everything you need to know about your readers' relationship with your content. Seeing that a lot of people chose to read a particular article, for instance, says nothing about whether they liked what they got once they finished. So traditional research such as surveys and focus groups still has its place, and free-form feedback such as reader comments on online versions of stories can be very useful. But I know that if I were editing a magazine with digital editions produced with Adobe's suite, I'd be hungry for the new clues about reader behavior that these analytics could provide.
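To make the kind of signal described above more concrete (whether readers tap through every page of a story or abandon it after the first one), here is a minimal Python sketch of how a publisher might aggregate anonymized page-view events into a per-article completion rate. The event fields, article names, and function are illustrative assumptions; the article does not describe Adobe's or Omniture's actual data schema or API.

```python
from collections import defaultdict

# Hypothetical, anonymized page-view events: (reader_id, article_id, page_index).
# Field names and values are placeholders, not Omniture's real schema.
events = [
    ("r1", "cover-story", 0), ("r1", "cover-story", 1), ("r1", "cover-story", 2),
    ("r2", "cover-story", 0),
    ("r2", "back-page", 0), ("r2", "back-page", 1),
]

# Total page count per article (in practice this would come from the issue's layout).
page_counts = {"cover-story": 3, "back-page": 2}

def completion_rates(events, page_counts):
    """Share of readers who tapped through every page of each article."""
    pages_seen = defaultdict(lambda: defaultdict(set))
    for reader, article, page in events:
        pages_seen[article][reader].add(page)
    rates = {}
    for article, readers in pages_seen.items():
        finished = sum(1 for pages in readers.values()
                       if len(pages) == page_counts[article])
        rates[article] = finished / len(readers)
    return rates

print(completion_rates(events, page_counts))  # {'cover-story': 0.5, 'back-page': 1.0}
```

The same event stream could just as easily be rolled up into the other metrics the suite reportedly surfaces, such as issue-open rates or reading order.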


YouTube recommendations for alt-right videos have dropped dramatically, study shows


Google has made "major changes" to its recommendation system on YouTube that have reduced the number of "alt-right" videos recommended to users, according to a study led by Nicolas Suzor, an associate professor at Queensland University of Technology.

During the first two weeks of February, alt-right videos appeared in YouTube's "Up Next" recommendations sidebar 7.8 percent of the time (roughly one in 13). From Feb. 15 onward, that number dropped to 0.4 percent (roughly one in 250).

Suzor's study took random samples of 3.6 million videos, and used 81 channels listed in a recent study by Rebecca Lewis as a starting point. That list includes voices like Richard Spencer, an American white supremacist, but also more mainstream voices like Joe Rogan, who does not self-identify as alt-right but often plays host to more extreme voices on his podcast (including alt-right figures such as Alex Jones).

The drop appears significant, but it's difficult to say precisely how it occurred. We don't know whether YouTube is targeting alt-right videos specifically or whether the drop-off is part of broader changes to YouTube's recommendation system.

YouTube has long spoken about making changes to recommendations. As recently as two weeks ago, YouTube was criticized for allowing the flat-Earth movement to flourish on its platform.

In response, YouTube has attempted to curtail what it refers to as false information. YouTube says freedom of speech is central to its core tenets, even when people express controversial beliefs, but it has been working to reduce the spread of misinformation on its platform.

YouTube has also been investing in surfacing credible voices. "In the last year alone," said one recent YouTube blog post, "we've made hundreds of changes to improve the quality of recommendations for users on YouTube."

In that same blog post, YouTube said it was planning to reduce "recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the Earth is flat, or making blatantly false claims about historic events like 9/11."

YouTube said the change would be gradual and would apply to less than 1 percent of the content uploaded to the platform.

In a statement sent to CNET, a YouTube spokesperson said the platform wasn't targeting alt-right videos: "We announced in January that we are reducing recommendations of borderline content or videos that could misinform users in harmful ways. We have not had a chance to thoroughly review this study, however, our recommendation systems are not designed to favor or demote specific misinformation based on specific political perspectives."

Speaking to CNET, Suzor said he wants more transparency from YouTube about how and why certain videos are recommended. "It's not good enough that we have to guess about how well these systems are working," he said, "and our research can only observe from the outside. YouTube has done a lot to improve transparency about its terms of service enforcement on a high level over the last year, but they still need to do more to help people understand how their algorithms are operating."
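As a rough illustration of the study's headline arithmetic (7.8 percent is roughly one in 13; 0.4 percent is roughly one in 250), here is a minimal Python sketch of how such a rate could be computed from sampled "Up Next" recommendations. The channel names, sample sizes, and data layout are placeholders, not the study's actual dataset or methodology.

```python
# Stand-ins for the 81 channels on the study's watch list; names are placeholders.
flagged_channels = {"channel_a", "channel_b"}

def recommendation_rate(sampled_channels, flagged):
    """Fraction of sampled 'Up Next' slots that point at a flagged channel."""
    hits = sum(1 for channel in sampled_channels if channel in flagged)
    return hits / len(sampled_channels)

# Toy samples sized to reproduce the article's before/after rates.
before = ["channel_a"] * 78 + ["other"] * 922   # 7.8 percent of 1,000 slots
after = ["channel_b"] * 4 + ["other"] * 996     # 0.4 percent of 1,000 slots

print(f"before: 1 in {1 / recommendation_rate(before, flagged_channels):.0f}")  # 1 in 13
print(f"after:  1 in {1 / recommendation_rate(after, flagged_channels):.0f}")   # 1 in 250
```

The real measurement would also have to decide which channel each recommended video belongs to and how videos were sampled, which is exactly the kind of methodological detail Suzor says outside researchers can only guess at.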
