State of Remote Access Security
Nearly three-quarters of the U.S. workforce will be mobile workers by 2023, IDC predicts. With so many employees on the road at least part of the time, remote access to business-critical applications is essential to the way we work, but it is also hard to secure.
In a recent study conducted by IDC for Akamai, more than half of respondents (56%) ranked security breaches as their greatest challenge around application access, even as they acknowledged that providing that level of access is an absolute necessity. Indeed, more than half the companies surveyed said every aspect of securing access is difficult, from managing mobile employees to onboarding and off-boarding contractors and partners.
Unsurprisingly, companies that have experienced the most costly data breaches in the past are the ones most convinced that malicious activity can be thwarted once discovered, presumably because of that experience.
Yet whether companies have endured a past data breach or not, they face a difficult balancing act. On one hand, they need to enable employees and third parties to remain productive no matter where they are. On the other, they need to keep enterprise applications secure, whether those applications live behind the firewall or in the cloud.
Companies are acutely aware that providing remote access can create a security risk. The majority of respondents say that securing access to corporate applications is a high priority. They also claim to be on top of monitoring the security risks that access creates: the majority say they track threats continuously and believe they can respond to malicious activity quickly enough to fend off data breaches.
The vast majority of respondents — more than 80% — say their leading reason for providing remote access is to ensure that their employees can use corporate resources securely anytime and anywhere. Security is also the most important issue respondents consider in providing remote access to third-party vendors and contractors — a critical consideration as outsourcing continues to grow. More than two-thirds of respondents call providing secure third-party access moderately difficult, and more than 15% call it difficult or extremely difficult.
Companies could eliminate both the risk and the inconvenience of providing remote access by simply choosing not to offer it. But in a hypercompetitive, data-driven business market, they’ve clearly ruled out that option. Every company surveyed allows at least some part of its workforce to access corporate applications remotely, and 75% expect to extend that privilege to even more employees in the next year or two.
Instead, some respondents seem to have accepted a certain level of loss as inevitable if they want to enjoy the benefits of remote access. On average, they expect to lose $6.5 million as a result of unauthorized remote access.
The more employees and/or third-party vendors companies have, the more they anticipate losing. At the same time, though, companies that have suffered larger losses in the past due to unauthorized access are more likely to believe they can thwart malicious activity in the future once it's discovered. The survey did not ask whether those who have suffered losses subsequently increased their security spending, but it stands to reason that a company that faces a data breach would be more likely to invest in tools and procedures meant to prevent a repeat, even as it worries those investments might not be sufficient.
Most companies use a piecemeal approach to protecting applications on-premises or in the cloud, which is more likely to let an unauthorized user in or unsecured data out. To protect user productivity while ensuring security, organizations need an access architecture that works seamlessly across users and applications wherever they are — in any data center and on any hybrid cloud — and offers a single point of control for provisioning, changing, and monitoring user permissions and usage. In this way, organizations can simplify their remote access architecture while minimizing potential data bottlenecks and points of failure.
Source: computerworld.com