As technology continues to evolve at a rapid pace, its technical and corporate implications evolve with it. As many already know, the growth of data alone is driving exponential expansion in many markets; the optical networking space, for one, is experiencing massive growth as throughput demands continue to rise.

Our own industry, IT Hardware Support, is experiencing monumental growth as well, driven by rising uptime requirements and the strain that a data-driven world places on the hardware itself.

Those examples, however, are the easier side of IT: the quantifiable aspects of IT management that can be planned for and overcome with straightforward financial planning and mathematics. But what of the other side of IT, the side that falls into a grey area of law and corporate responsibility?

A perfect example is the new technology referred to as Deep Fakes: a way to manipulate video imagery to do everything from swapping faces in and out of footage to altering voices. The result is a video that can be presented as evidence of an act that never actually took place.

The first Deep Fakes, which appeared on the internet within the last 24 months, were applied to the adult entertainment industry to create explicit scenes featuring famous actors. The result was, of course, devastating for those involved: they were not in the videos at all, yet were objectified in ways that were disturbing to say the least.

Now, as the technology becomes far more widely available, we are witnessing the next evolutionary stage, one that can and will have severe consequences for our nation and the world: the introduction of Deep Fakes into the political arena.

In fact, just this week a video of Nancy Pelosi surfaced in which her voice and speech pattern had been manipulated to make her appear to be under the influence of drugs or alcohol. Like any video with the potential to go viral, it spread quickly: Facebook posts of the video were viewed more than 1.4 million times, shared more than 30,000 times, and drew more than 20,000 comments.

So what does this have to do with IT and the enterprise? The challenge IT now faces is twofold. First, IT must be able to manage such files as they appear on corporate servers and hardware in order to mitigate the risk of legal action. Second, entirely new corporate responsibility measures must ensure that videos or other documents that are not legitimate business assets are not shared as if they came from the company itself.
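To make the first half of that challenge concrete, here is a minimal sketch of the kind of sweep an IT team might run against a file share. Everything in it is illustrative: the mount point, the file extensions, and the blocklist of hashes (which in practice would come from legal, compliance, or a threat-intelligence feed) are all hypothetical placeholders.

```python
# Minimal sketch: sweep a file share for video files and flag any whose
# SHA-256 hash appears on a blocklist of known manipulated media.
import hashlib
from pathlib import Path

VIDEO_EXTENSIONS = {".mp4", ".mov", ".avi", ".mkv", ".webm"}

# Hypothetical blocklist of SHA-256 hashes for media flagged by legal.
KNOWN_MANIPULATED = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large videos don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_share(root: Path) -> list:
    """Return every video file on the share that matches the blocklist."""
    flagged = []
    for path in root.rglob("*"):
        if path.is_file() and path.suffix.lower() in VIDEO_EXTENSIONS:
            if sha256_of(path) in KNOWN_MANIPULATED:
                flagged.append(path)
    return flagged

if __name__ == "__main__":
    for hit in scan_share(Path("/mnt/corporate-share")):  # hypothetical mount
        print(f"FLAGGED for review: {hit}")
```

Note the obvious limitation: hash matching only catches exact copies, and a re-encoded or trimmed video produces a different hash. That is part of why the endpoint and access controls discussed below matter as well.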

To mitigate the risk of hosting such content, IT teams will have to expand their security and endpoint management measures to patrol and police data, something that is already being done in many cases. Monitoring of social media access on company devices will also need to increase.
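As a sketch of what that monitoring might look like, the following fragment tallies social media requests per device from a web-proxy access log. The log format (timestamp, device, domain as CSV) and the domain list are assumptions for illustration; real proxies each emit their own formats and usually ship with reporting that would make a hand-rolled script unnecessary.

```python
# Minimal sketch: flag social media access in a hypothetical CSV proxy log
# with columns (timestamp, device, domain).
import csv
from collections import Counter

SOCIAL_MEDIA_DOMAINS = {"facebook.com", "twitter.com", "instagram.com"}

def is_social(domain: str) -> bool:
    """Match the domain itself or any subdomain of it."""
    return any(domain == d or domain.endswith("." + d)
               for d in SOCIAL_MEDIA_DOMAINS)

def summarize(log_path: str) -> Counter:
    """Count social media requests per device from the access log."""
    hits = Counter()
    with open(log_path, newline="") as handle:
        for _timestamp, device, domain in csv.reader(handle):
            if is_social(domain.lower()):
                hits[device] += 1
    return hits

if __name__ == "__main__":
    for device, count in summarize("proxy_access.csv").most_common():
        print(f"{device}: {count} social media requests")
```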

There is even more to contemplate, however. As devices and mobility, along with the proliferation of personal and work-related social media activity, continue to blur the line between the individual and the employer, the question becomes: who is responsible for opinions, shares, and the like, and how do they affect a company's brand?

For instance, we have all seen recent news of individuals being fired after images on social media associated them with particular groups. In a world where corporate responsibility correlates directly with public sentiment and shareholder value, this new technology can have dire consequences.

Imagine how competitors could leverage Deep Fakes for financial gain. And before arguments about ethics and business rules are raised, one need look no further than the daily news to see a host of nefarious activity coming from the corporate world.

Technology has changed our world for the better. Now it is simply a matter of managing access, control, and corporate responsibility, and, most importantly, of educating ourselves and doing our research long before we pass judgement. Just because something looks real doesn't mean it is.