For so long, it's been so easy to fake, misuse, misinterpret and distort "facts" on the internet. It's no surprise we are where we are. Inevitably, things had to change... this time around, in the form of the new regulations and changes of May.
Then there's the dilemma: regulation can often be a double-edged sword, but a complete lack of it is rarely better.
I'll admit that all of the recent data-related regulations and changes have made my life as a marketer a little more complicated. Not because of noncompliance (I've always valued good ethics and have never had to worry about that part), but because of the many little changes that affect us all: revised policies, loads of new homework on topics like GDPR, and new limitations on marketing software, such as Facebook's ad platform. Worst of all, though, is a growing lack of trust in any content that is shared (and especially promoted) online.
Policies are not that hard to update. It's not that hard to ask for permission to continue sending updates to your email list. Regaining lost trust is quite a bit trickier.
In this sense, I think some of the new regulations will actually help. They reinforce good practices by penalizing bad ones, which has so far seldom been the case. This in turn paves the way for more newcomers to adopt strictly ethical practices, and for those who refuse to be forced out.
But why is it a "double-edged sword," and what can be done about it? Well, people have always taken advantage of automated processes to cause trouble for competitors they don't like. It's not uncommon to hear about Amazon sellers posting fake (bad) reviews on their competitors' products so those competitors get penalized unfairly, sometimes even failing as a business.
On Google, "black hat" marketers are known to climb the rankings by pointing spammy links at their competitors' websites, again causing unfair penalties.
That being said, though, I don't believe the potential for this kind of abuse should stop us from regulating altogether. It just means we should be very mindful and careful about how we do it. If a system is built to issue automatic penalties to a business, that system can be abused by competitors. But if those penalties can be appealed and ultimately reviewed fairly by an actual person, the problem fades.
It may not be an ideal compromise, but I find it generally wiser to focus on adapting to, rather than fighting, what is likely an inevitable and long-term change. Be reliable, factually correct and ethical - that's really the best way to avoid further regulation and headaches in the future.