Are you thinking about adopting a serverless approach for your next SaaS or mobile app back end? A recent Cloud Foundry study showed that deployment and evaluation of the serverless approach grew by 27% from 2017 to 2018. The serverless paradigm is the next level of virtualization, moving one step further up the stack, and one of the hotter trends in the software industry. On top of virtualization and bandwidth services, backend containers and/or API platforms can be invoked on demand. Serverless can be great for security, thanks to the cloud provider’s professional attention to maintenance and monitoring. However, it’s a new technology, and only time and talented eyes will crush its vulnerabilities. This blog is a quick overview of serverless security and how testing may evolve as the technology becomes more widespread.
Types of Serverless
There are two kinds of serverless solutions out there: BaaS (backend-as-a-service) and FaaS (function-as-a-service).
BaaS providers, like Google’s Firebase, offer vast libraries of APIs that provide the features of a traditional web or mobile backend plus value-added features like test automation, performance tuning, cloud messaging and machine learning. BaaS developers write their code mostly to run in the browsers and mobile devices of their users. In addition, these platforms have some support for user-specified server-side functions.
FaaS applications, on the other hand, are written to run in the cloud, not as part of a traditional server stack, but as loosely-coupled collections of event-driven functions. The end user application calls a FaaS function, which triggers whatever is needed on the back end to execute it. This makes FaaS applications highly efficient and scalable: they can use the full resources of the cloud provider’s infrastructure to accomplish a task, but require no resources otherwise. AWS Lambda and Google Cloud Functions are examples of popular FaaS platforms.
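To make the event-driven model concrete, here is a minimal sketch of what a FaaS handler might look like in Python. The event shape is illustrative, loosely modeled on an S3 “object created” notification and trimmed to the fields the handler uses; the platform, not your code, decides when to invoke it.

```python
# Minimal sketch of an event-driven FaaS handler (AWS Lambda style).
# The platform invokes handler() with an event payload; no server process
# runs between invocations.

def handler(event, context=None):
    """Extract bucket/key pairs from an S3-style event and return a summary."""
    results = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        results.append(f"{bucket}/{key}")
    return {"processed": results}

# Local invocation with a hand-built event, for illustration:
fake_event = {"Records": [{"s3": {"bucket": {"name": "photos"},
                                  "object": {"key": "cat.jpg"}}}]}
print(handler(fake_event))  # {'processed': ['photos/cat.jpg']}
```

Between invocations, nothing is running and nothing is billed, which is where the efficiency and scalability come from.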
Both BaaS and FaaS platforms offer security advantages over traditional hosted or cloud server paradigms. Popular Linux- or Windows-based frameworks are built from hundreds of individual software components, each with its own history of versions, vulnerabilities and patches. Delivering security, interoperability and reliability all at once can be a headache throughout development and the lifecycle of the application. Whether you use AWS, a private datacenter or a downmarket VPS, you are responsible for making sure your server remains patched, up to date and free from vulnerabilities. In cloud environments—including serverless—the cloud provider takes care of low-level vulnerability scanning and patching for you. Many of the serverless providers are major players in the internet and e-commerce domains, with large and well-trained security teams defending their infrastructure.
Serverless Security Concerns
Serverless security overlaps heavily with standard application security. Cloud providers offer many built-in mitigations on their platforms—strong isolation being the most powerful. Serverless clouds are, by their nature, multi-domain and multi-tenant environments. Fundamental design principles and decades of research have given us a high degree of confidence in the security of virtual machines and containers. The same degree of confidence is being built for serverless, but it may take years to accumulate enough real-world data about its safety.
While not common, there are still attack patterns emerging that should be watched closely. “One of the major attacks against serverless apps is ‘Function Event Data Injection’,” said Abhay Bhargav, founder of we45, an application security and DevSecOps company. In this scenario, the attacker leverages an out-of-band set of vulnerabilities. He continued, “For example, a serverless function triggers whenever a user uploads a file to Amazon S3. The function reads the file, processes it and stores it in a database. In an Event Injection attack, the attacker uploads a file laced with malicious code such as a docx file with XML External Entities (XXE) payloads. If the serverless function is vulnerable, it triggers the XXE flaw and allows the attacker access to the backend environment variables of the function, which can be leveraged to gain access to the AWS account of the deployment environment.”
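One common defense against the scenario Bhargav describes is to refuse to parse uploaded XML that declares a DOCTYPE at all, since external-entity attacks depend on one. The sketch below is illustrative (the function name and check are my own, not from any specific library), but it shows the shape of the mitigation inside a file-processing function.

```python
# Sketch of a defensive check against XXE in a file-processing function.
# Before handing uploaded XML to a parser, reject any document that
# declares a DOCTYPE -- external-entity (XXE) payloads require one.

import xml.etree.ElementTree as ET

def parse_upload(xml_bytes: bytes):
    text = xml_bytes.decode("utf-8", errors="replace")
    if "<!DOCTYPE" in text.upper():
        raise ValueError("DOCTYPE declarations are not allowed")
    return ET.fromstring(text)

safe = b"<doc><title>hello</title></doc>"
evil = (b'<?xml version="1.0"?>'
        b'<!DOCTYPE doc [<!ENTITY xxe SYSTEM "file:///etc/passwd">]>'
        b"<doc>&xxe;</doc>")

print(parse_upload(safe).find("title").text)  # hello
try:
    parse_upload(evil)
except ValueError as e:
    print("blocked:", e)
```

In production you would also want a hardened parser (such as the defusedxml package) rather than a string check alone, but rejecting DOCTYPEs up front is a cheap first line of defense for a function that should never receive one.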
It’s the hybrid case that raises the most serverless security concerns. Serverless paradigms don’t always provide all of the capabilities needed to run a successful web or mobile app, so many serverless applications must be coupled with traditional server-based components as well. Many FaaS frameworks, for example, don’t store data cost-effectively; a connection to a traditional database server might be required for high-volume or high-velocity transactions. Each hybrid environment has a unique stack of serverless and standard components, and the container, infrastructure, VM, API and application layers each present their own attack surface. That can include chainable vulnerabilities leading to exploits that only multi-domain testing can find.
Unique to serverless, every call has a cost. Cloud operators often monetize their platforms on a per-function basis: you pay for every end user’s call, post or click, regardless of intent, and there aren’t always good mechanisms in place to separate legitimate app activity from bad actors. As a consequence, even an attempted penetration of your backend through fuzzing or brute-force attacks can run up a hefty bill. A cloud operator might intervene or issue a refund, but you should not expect that to be the norm.
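One generic way to cap what an abusive caller can cost you is to put a rate limiter in front of the paid function. This is a provider-agnostic sketch of a classic token bucket (the rates and burst size are invented for illustration); real deployments would more likely use the platform’s own throttling or concurrency limits, but the idea is the same.

```python
# Sketch of a token-bucket rate limiter that could sit in front of a
# pay-per-invocation function, capping what any single caller can cost.
# Rate and burst values here are illustrative.

import time

class TokenBucket:
    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec        # tokens added per second
        self.capacity = burst           # maximum stored tokens
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Refill based on elapsed time, then spend one token if available."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate_per_sec=1.0, burst=5)
decisions = [bucket.allow() for _ in range(10)]
print(decisions.count(True))  # typically 5: the burst is admitted, the flood is not
```

Requests denied here never reach the billed function, so a brute-force flood costs the attacker time instead of costing you money.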
Challenges with Serverless Security Testing
Off-the-shelf vulnerability scanners were not built for serverless. Testers need to analyze the function code itself to identify flaws in business logic, improper use of APIs and data types, and idiosyncrasies unique to each serverless cloud provider. That is why security testing with source code analysis is so important for securing serverless applications. Source code security expertise is hard to hire because it requires the elusive hacker/programmer skill combination plus the patience to review a lot of code. The newness of serverless platforms makes this even more challenging: few people understand them well at this stage of market adoption.
The basic mechanics of testing can be challenging. “Debugging is hard in Lambdas—you have some decent tools for observability, but it’s not the same as running it locally,” said Rahul Sethuram, CTO of Ethereum scaling company Connext and serverless user. So other techniques like stress testing and fuzzing can make a big difference in serverless application security testing. These methods probe the APIs that underlie most serverless deployments. Robust testing of authentication, permissions and session management can minimize the risk of side-channel attacks, user hijacking and information leakage. Statistical techniques, often used for performance tuning and scalability testing, can also detect potentially costly denial-of-service and penetration attempts. For example, a photo sharing application designed for the North American market might flag even a moderate volume of middle-of-the-night activity as a likely attack.
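The statistical idea behind that photo-sharing example can be sketched very simply: compare the current hour’s request count against its historical distribution and flag large deviations. The numbers and threshold below are invented for illustration.

```python
# Toy sketch of statistical traffic-anomaly detection: flag an hour whose
# request count sits far outside the historical mean -- e.g. heavy
# middle-of-the-night activity on an app whose users should be asleep.

from statistics import mean, stdev

def is_anomalous(history, observed, z_threshold=3.0):
    """Return True if `observed` is more than z_threshold standard
    deviations from the mean of `history` (a z-score test)."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > z_threshold

# Typical 3 a.m. request counts over past nights vs. tonight's burst:
normal_nights = [12, 9, 15, 11, 8, 14, 10]
print(is_anomalous(normal_nights, 13))   # False -- ordinary quiet night
print(is_anomalous(normal_nights, 900))  # True  -- looks like an attack
```

Real deployments would use richer baselines (per-endpoint, per-region, seasonal), but even a crude check like this can catch the expensive brute-force patterns described earlier before the bill arrives.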
Applications deployed entirely on serverless are still rare, but they are becoming more common. Organizations that assume security testing is handled by their cloud provider should verify that assumption. In any serverless environment, your own security testing is the ultimate backstop against the shortcomings of this nascent field.