With the amount of content on the internet growing by the second, it is inevitable that search engines need to make constant adjustments to how they find the most relevant results. As the biggest player in search engine land, the changes at Google tend to be the most closely watched. The most recent update, being trialled first in the U.S. before being rolled out to the rest of the world, has caused more consternation than most: it is specifically aimed at rooting out poor quality sites so as to give users a better experience and return better quality, more relevant results.
Some sites regarded as "content farms" have been hit particularly hard, so inevitably there has been a certain amount of debate among these sites about how to retain their hard-earned traffic. Ezine Articles is considering putting limitations on certain topics, e.g. "get your ex back", "reverse phone lookup", etc. Over at Squidoo, meanwhile, they are clamping down on duplicate content: whereas previously you could get away with copying and pasting entire articles or even Amazon customer reviews, it now seems that if you do so you are likely to have your lens locked.
This raises the whole issue of whether duplicate content is bad or not. When I first started looking into these issues a couple of years ago, the general consensus seemed to be that duplicate content was a bad thing. Then Google quashed that by announcing that it is actually fine, provided you do not duplicate content within your own site. Now it seems that even a degree of duplicate content is viewed as a bad thing, although to be honest I am still not completely clear on where the line is drawn.
Of course, search engines have to change with the times and constantly evolve. If that means some of the poorer (particularly automated) content falls to the bottom, then this can only be a good thing. It will, however, make attempts at making money from internet marketing that much more difficult.