Robyn Caplan

Current Projects

My dissertation, tentatively titled Public Interest Frameworks for Computational Systems: Media Policy in the Algorithmic Era, is currently underway and is expected to be completed in Spring 2019. This work examines how platform companies make and alter their content policies in response to content concerns, such as disinformation flows, and the challenges of establishing schemas to verify the credibility of content uploaded by organizations and individuals around the world. My work provides an understanding of the networked and distributed context of policy-making in the global information era, particularly as it applies to United States-based platforms operating in many regions of the world. I have found in this work that, though platform companies have an outsized role in setting standards that ripple through the media industry, those standards can be influenced by organizations, institutions, and even users.

Papers from this work have already been published in First Monday and Big Data & Society.

Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy

My research on emerging content standards revealed divides in how content rules were being applied by platforms. This was particularly true in cases where revenue-sharing agreements were being used as a lever to enact content standards, such as “advertiser-friendliness” in the case of YouTube.

In order to develop a more nuanced perspective on how financial incentives shape user experience, I embarked on a new project as part of my summer internship at Microsoft Research. Tarleton Gillespie and I conducted a content analysis of YouTube videos, interviews with YouTubers, and a discourse analysis of company policies. We focused on how content policies impact marginalized communities, including LGBTQ-identified individuals and members affiliated with Black Lives Matter, whose content is frequently targeted by YouTube as violating its advertiser standards for discussing “controversial” topics. We found that YouTube’s content policy varies depending on a user’s or an organization’s institutional ties to the company, including the formal category they’ve been placed in by the company, their audience size, the extent to which they’ve benefitted from offline resources, and whether they have a direct relationship with a YouTube employee. This work presents a vision of social media companies that are less ‘platform’ and more ‘podium.’

It has been accepted for publication in Social Media + Society.