
MIT is doing some really interesting research into crowdsourcing platforms like mturk. Check it out: http://groups.csail.mit.edu/uid/research.shtml#crowdcomp

They are tackling tasks such as extremely difficult OCR, collaborative editing, and proofreading.

I've used mturk at work to automate transcription of short recordings and found that it works pretty well. The trick is to qualify your workers by requiring them to pass some kind of test. You can also accept only workers whose approval rating is above some minimum. Then, critically, as others here have suggested, have each task done multiple times for cross-checking. And make sure your instructions are clear.


