Soylent: A Word Processor with a Crowd Inside

2010 | Michael S. Bernstein, Greg Little, Robert C. Miller, Bjorn Hartmann, Mark S. Ackerman, David R. Karger, David Crowell, and Katrina Panovich
Soylent is a word processing interface that integrates crowd-sourced human contributions to assist with complex writing tasks. The system enables writers to call on Mechanical Turk workers to shorten, proofread, and edit parts of their documents. To improve the reliability of crowd-sourced editing, the paper introduces the Find-Fix-Verify pattern, which splits tasks into identification, generation, and verification stages. Evaluation studies demonstrate the feasibility of crowdsourced editing and investigate reliability, cost, wait time, and work time for edits.

Soylent includes three main components: Shortn, a text shortening service that reduces text length without changing its meaning; Crowdproof, a human-powered spelling and grammar checker that catches errors automated checkers miss; and The Human Macro, an interface for offloading arbitrary word processing tasks. The system is designed to embed paid crowd workers in an interactive user interface to support complex cognition and manipulation tasks on demand.

The paper discusses related work in crowdsourcing systems and artificial intelligence for word processing. It also explores challenges in programming with crowd workers, including high variance in effort and errors introduced by Turkers. The Find-Fix-Verify pattern is proposed as a method for programming crowds to reliably complete open-ended tasks that directly edit the user's data.

The evaluation of Shortn shows that it can shorten text to 78-90% of its original length, with a median work time of 118 seconds. Crowdproof catches and fixes errors effectively, identifying 67% of errors. The Human Macro lets users request arbitrary tasks; workers understood the intended task 88% of the time and executed it accurately 70.8% of the time.

The paper concludes that Soylent represents a new kind of interactive user interface: one that provides direct access to a crowd of workers for assistance with tasks requiring human attention and common sense. Because crowds behave differently than computer systems, such interfaces require new programming patterns for interface software; Find-Fix-Verify is presented as the key contribution addressing this need. Future work includes new crowd-driven features for word processing, optimizing crowd-programmed algorithms, and integrating on-demand crowd work into other authoring interfaces.
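To make the Find-Fix-Verify workflow concrete, here is a minimal, hypothetical Python sketch of the pattern. The callback name `ask_crowd`, the worker counts, and the 20% agreement threshold for Find are illustrative assumptions rather than Soylent's actual API or parameters; the sketch only shows how the three stages separate identifying patches, generating candidate rewrites, and voting to filter out poor candidates.

```python
from collections import Counter

def find_fix_verify(paragraph, ask_crowd,
                    find_workers=10, fix_workers=5, verify_workers=5):
    """Hypothetical sketch of the Find-Fix-Verify pattern.

    `ask_crowd(stage, prompt)` stands in for posting a task to a crowd
    platform (e.g. Mechanical Turk) and collecting one worker's answer.
    """
    # --- Find: independent workers mark phrases (patches) that need work. ---
    patches = [ask_crowd("find", f"Identify a phrase to improve: {paragraph}")
               for _ in range(find_workers)]
    # Keep only patches that a meaningful fraction of Find workers agreed on,
    # filtering out idiosyncratic suggestions (exact-match agreement here;
    # the real system merges overlapping spans).
    counts = Counter(patches)
    agreed = [p for p, n in counts.items() if n >= max(2, 0.2 * find_workers)]

    results = {}
    for patch in agreed:
        # --- Fix: a separate group of workers proposes rewrites of each patch. ---
        candidates = [ask_crowd("fix", f"Rewrite this phrase: {patch}")
                      for _ in range(fix_workers)]

        # --- Verify: a third group votes out rewrites that change the meaning
        # or introduce errors; majority approval keeps a candidate. ---
        approved = []
        for cand in set(candidates):
            votes = sum(
                ask_crowd("verify",
                          f"Is '{cand}' an acceptable rewrite of '{patch}'? yes/no") == "yes"
                for _ in range(verify_workers))
            if votes > verify_workers / 2:
                approved.append(cand)
        results[patch] = approved
    return results
```

Separating the stages in this way keeps any single worker from both proposing and approving an edit, which is the core reliability idea behind the pattern.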