For local models, someone was annoyed enough to make a modified sampler that backtracks when the model outputs a "GPT-ism" and regenerates with the output token probabilities tweaked to avoid it: https://github.com/sam-paech/antislop-sampler/
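The core idea can be sketched in a few lines (this is a hedged toy illustration, not the actual antislop-sampler code; `toy_model`, `generate`, and the penalty value are made up for the example): sample greedily, and when the tail of the output matches a banned phrase, rewind to where the phrase began, down-weight its first token at that position, and resample.

```python
def sample_token(logits, bias):
    # pick the highest-scoring token after applying accumulated penalties
    scored = {t: s + bias.get(t, 0.0) for t, s in logits.items()}
    return max(scored, key=scored.get)

def generate(model, max_len, banned_phrases):
    out, bias_at = [], {}              # bias_at[i] = per-token penalties at position i
    while len(out) < max_len:
        logits = model(out)            # "model": maps a prefix to token scores
        out.append(sample_token(logits, bias_at.get(len(out), {})))
        for phrase in banned_phrases:
            if out[-len(phrase):] == list(phrase):
                start = len(out) - len(phrase)
                # backtrack and penalize the phrase's first token at that spot
                pen = bias_at.setdefault(start, {})
                pen[phrase[0]] = pen.get(phrase[0], 0.0) - 10.0
                del out[start:]
                break
    return out

def toy_model(prefix):
    # stand-in for an LLM: prefers the cliché "delve into" over "look at"
    last = prefix[-1] if prefix else None
    if last == "delve":
        return {"into": 2.0, "at": 1.0}
    if last == "look":
        return {"at": 2.0, "into": 1.0}
    return {"delve": 2.0, "look": 1.0}
```

With `("delve", "into")` banned, the toy first emits "delve into", backtracks, and settles on "look at" instead. The real sampler works on actual model logits and a much larger slop list, but the backtrack-and-penalize loop is the same shape.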
It only spits out what is put into it.
Though I am quite confident that a 5-year-old's brain is far superior to any AI.