I don't think incompetence is at play here. They want to have their cake and eat it too, and they largely succeed. They want the full text to be readable by bots and indexed by Google and other search engines, but they also want most interested readers to pay for it. Hiding the text behind JS (which can be disabled) means bots can read and index it in full, while most interested users will still pay: either because they aren't technical enough, because they value their time more than the few dollars it would cost to fiddle with JS, or simply out of decency.

As for the ton of JS errors, that's cost-cutting. Corporations push to ship faster to stay competitive, and if hardly anyone sees those errors, they don't matter enough to get fixed.
Zbyszek Matuszewski