“Turker Computer” Process Notes


Turker computer photographed by Amias MacLeod (Parkland, Florida, United States)

Today I finally got around to documenting an ongoing project (officially started in 2013) of the computers owned by Mechanical Turk workers. I’ve used “Turkers,” as they call themselves, in previous projects and written about that experience for the Parsons Journal of Information Mapping. The central question surrounding crowdsourced labor platforms like Mechanical Turk is the exploitation of very low-paid workers who, as independent contractors, receive no health coverage, retirement, or other benefits. These concerns are well articulated in “The Ladies Vanish” by Shawn Wen in The New Inquiry. This post is intended to discuss the methodology used in this project, rather than its conceptual and artistic motivations (which you can read a little more about here), and hopefully to address some of the questions that Wen’s article raises.

One of the points Wen makes in her article is that there is often a false argument that Turkers are mostly from countries like India, and that the low wages paid are actually substantial in those places. Wen cites a 2010 study by NYU professor Panos Ipeirotis showing that the number of American workers on Mechanical Turk is rising significantly and that Americans represent over 50% of the workforce on the site. My project, which started as a few experiments in late 2012, seems to confirm this. I was surprised to see mostly American-looking homes in the later images, compared to a more diverse range in the earlier results. In an attempt to get images from more places and users, I tried releasing the jobs late at night, hoping to reach workers on the other side of the globe while Americans and Europeans were asleep. I had no success, which I take as further evidence for Ipeirotis’s findings.

The question of rate is an important one; unlike large companies whose revenues are tied to this distributed work, like most artists I spend more money each year to make my work than I earn from my practice. While I would be thrilled to pay workers $30 per image, the results of work on Mechanical Turk are always of very mixed quality (especially for odd tasks) and, in order to get enough usable images, I would have had to spend hundreds or thousands of dollars. In the end, my strategy was one of balance:

  1. Pay a fee far higher than most jobs on the site (workers were paid $1 per image)
  2. Ensure the process was as easy and quick as possible
  3. Hopefully make the process a lot more fun than tagging images or the other usual jobs on the site

As with most jobs on Mechanical Turk, speed in turning around a job is the key to making a decent wage. For me as the requester, this meant developing an interface for image upload that was easy to use, required minimal steps, and could be completed outside the Mechanical Turk system on one’s phone, which likely has a built-in camera.


The image upload form, designed using the JotForm service and meant to be as clear and simple as possible.

The form returned a unique ID, which was then input into the Mechanical Turk system to confirm the image’s submission. Amazon gives the average wage paid for the batch – overall the project paid approximately $5/hour. Removing some of the extreme outliers (people who, I hope, left their computers to do other things, then came back with their cameras to finish the job), the average time spent on the assignment was 7 minutes, putting the hourly wage at $8.57. Not great, but above the minimum wage in the United States.
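For readers who want to check the arithmetic, the figures above work out as follows (a quick sketch using only the numbers stated in this post, not project code):

```python
# Back-of-the-envelope check of the wage figures above.
fee_per_image = 1.00   # dollars paid per submitted image
avg_minutes = 7        # average time per assignment, outliers removed

hourly_wage = fee_per_image / avg_minutes * 60
print(f"${hourly_wage:.2f}/hour")  # prints $8.57/hour
```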

Part of the point of projects like this is to raise questions about how we use technology and, in this particular case, how class plays a role in the technology economy. This is often lost in tech reporting, which is overly focused on the newest gadgets. It is also lost in histories of technology, which are written about the social, economic, and scientific climates a given technology was born out of but too often highlight genius over context. We know all about the birth of the ENIAC and mainframe computers, but little about the staff that supported the scientific work.

In short, I welcome a conversation.

First Experiment With “Mechanical Turk”

The completed tasks, not 20 minutes after publishing them!  Click on the image for full-size.

An initial experiment (my first) with Amazon’s Mechanical Turk, a service that allows workers around the world to complete simple online tasks.  While most of the tasks are weird surveys or spamming attempts that pay between $0.05 and $0.25 per job, I decided that paying a reasonable fee would be fairer and make the task more likely to be completed.  I asked participants to visit this site and determine the color they saw when their eyes were closed while looking at a bright white screen.  Having prepaid for 100 people to complete my task, I assumed it would take a week or two.

I took a shower and came back to my computer to find… 100 responses!  In less than 20 minutes the whole project was finished, to my astonishment.

The first 100 responses – not all what I would call “accurate”

A few thoughts on the process:

  1. The results were almost entirely high quality – about 93% took more than 90 seconds to complete the task and returned usable values.  The others apparently did not read the instructions very carefully and returned color names like “pink”, “black”, and “rainbow”.  While interesting, I wanted completely objective answers.  Mechanical Turk allows you to reject jobs that don’t meet your criteria, so I did; a few sent back emails saying they were upset that I had rejected their one minute of work, but I think that’s likely par for the course with this system.
  2. It is clear that not all the values are “good” – I can’t imagine a scenario in which someone sees bright blue.  In the future, I’ll likely vet the data first.
  3. Mechanical Turk will return your results as a CSV file, which is very useful.  It also includes lots of great “extra” data, including time spent finishing the task and the exact time the task was completed.
  4. I am very much thinking of setting out another batch to be completed.  While $100 isn’t cheap ($110 actually, since Amazon charges a fee), the data came back fast and was high quality.  I have been considering a sliding scale in which the price goes down over time, so that those who answer early are paid more and those who wait are paid less.
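As a rough sketch of the last two ideas (summarizing the results CSV, and a sliding fee schedule), the following assumes the `WorkTimeInSeconds` column that Mechanical Turk includes in its batch export; the outlier cutoff and the fee parameters are hypothetical values for illustration, not ones I actually used:

```python
import csv
import io

def average_work_seconds(results_csv, outlier_cutoff=1800):
    """Average WorkTimeInSeconds across a batch, dropping extreme
    outliers (workers who apparently walked away mid-task)."""
    times = [int(row["WorkTimeInSeconds"])
             for row in csv.DictReader(io.StringIO(results_csv))]
    kept = [t for t in times if t <= outlier_cutoff]
    return sum(kept) / len(kept)

def sliding_fee(response_index, start_fee=1.00, floor_fee=0.25, step=0.01):
    """A sliding scale: each successive response earns `step` less,
    never dropping below `floor_fee`."""
    return max(floor_fee, start_fee - step * response_index)

# Example: three fast workers and one who left the tab open for an hour.
batch = "WorkTimeInSeconds\n300\n420\n540\n3600\n"
print(average_work_seconds(batch))        # 420.0
print(sliding_fee(0), sliding_fee(100))   # 1.0 0.25
```

One appeal of a floor fee is that late responders still earn something reasonable rather than the task silently devaluing to nothing.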

If you are feeling egalitarian and want to help the project without getting paid, you can head over to this page and email me your results.