Negative constraints in programming prompts
The goal is to craft a prompt containing a negative constraint that causes the AI model to fail: the prompt tells the model not to do something, and the model does that thing anyway. The worker should keep revising the prompt until the model violates the constraint, then grade the model's response and write an ideal response to the same prompt.
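The worker's loop can be sketched in code. This is a minimal illustration, not part of the task itself: `query_model` is a hypothetical stand-in for whatever model API the worker uses, and the constraint check here is a simple substring test.

```python
def violates_negative_constraint(response: str, forbidden: str) -> bool:
    # The prompt forbade `forbidden`; the attempt succeeds for the
    # worker if the forbidden construct appears in the response anyway.
    return forbidden.lower() in response.lower()

# Example of a negative-constraint programming prompt.
prompt = (
    "Write a Python function that reverses a string. "
    "Do not use slicing anywhere in your answer."
)

# response = query_model(prompt)  # hypothetical API call
# Hard-coded example of a response that breaks the constraint:
response = "def reverse(s):\n    return s[::-1]"

if violates_negative_constraint(response, "[::-1]"):
    # The model failed: grade this response, then write the
    # ideal response (e.g. a loop-based reversal with no slicing).
    print("constraint violated")
```

In practice the check is rarely a substring match; the worker reads the full response and judges whether the negative constraint was broken.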