Yes. As far as I can tell, the current message of effective altruism focuses too strongly on "being effective" at its core. This creates an anchoring bias that can prevent people from exploring high-leverage opportunities.
Some of the greatest things in the world come from random exploration of opportunities that may or may not have had anything to do with an end goal. For example, Kurt Gödel, of Incompleteness Theorems fame, spent serious effort constructing a formal ontological proof of the existence of God. This was totally batshit, and I would not expect the current EA framework to reliably produce people like Gödel (brilliant people off on a crazy personal side quest they S1 feel strongly about). In fact, I would expect the current EA framework to condition people away from Gödel ("this is nuts, we can't evaluate it, and it doesn't even seem conjecturally plausible") or something like that.
It is possible I am straw-manning the current EA message; this is only my (possibly bad) mental simulation of it, built from various chance exposures to fringe elements of the community, who may not be representative of the whole.
Nonetheless, the core point stands: until I see the EA movement reliably generating the "crazy but awesome" things humans are known to be able to create, or at least not conditioning people away from them (if the core thought process is "EA-focused", it will naturally constrain your thought-space to hew closely to the EA concept), I would suggest that the EA community's message is potentially discouraging natural human creativity, among many other things.