I would imagine this dependency on LLM coding makes people less willing to adopt new technologies, so training data for those technologies would no longer be as plentiful. Doesn't this eventually lead to a long-term degradation of LLMs?
Anyone else noticing this?
I don't think of this as degradation, more that LLMs will always lag behind the human-generated content. Degradation would be something different: if less and less training data is being generated, and the LLM has something wrong baked in, that could lead to a decline in its usefulness over time.