A "Humean degree of freedom" appears in a cognitive system whenever some quantity / label / concept depends on the choice of utility function, or more generalized preferences. For example, the notion of "important impact on the world" depends on which variables, when they change, impact something the system cares about, so if you tell an AI "Tell me about any important impacts of this action", you're asking it to do a calculation that depends on your preferences, which might have high complexity and be difficult to identify to the AI.