Many of us assume that the free editing and sorting of online content we rely on every day is carried out by AI algorithms rather than by human beings. In fact, that is often not the case: human workers remain cheaper, faster, and more reliable than AI for the myriad tasks whose right answer turns on ineffable contextual criteria still too subtle for algorithms to decode. The output of this "ghost work" is then fed into machine learning pipelines to produce algorithms trained on large data sets containing thousands of correctly coded observations.

Yet the platforms that broker this labor treat ghost workers as consumers of their services rather than as employees. This classification distorts the basic logic of the employment relationship, leaving the worker-as-consumer in the worst of both worlds, holding the legal rights of neither group. Workers are thus consigned to regulatory limbo, with little or no protection, control, or guarantee of a return on their investment of time and effort. The platforms that facilitate ghost work have orchestrated a three-way virtual relationship that absolves all other parties of responsibility while placing nearly all the risks and opportunity costs squarely on the shoulders of the workers. As a result, we believe that such a work arrangement, as it stands, does not uphold basic standards of Rawlsian justice as fairness.