As you secured your Kubernetes cluster, you defined users, roles, and permissions, making it safer one restriction at a time. Beyond user access, you also need to control what is authorized for the services you did not create yourself but depend on: third parties.

State of the art

While they often go unnoticed and attract little attention, third parties rank among the top 10 security risks.

According to the latest studies, 76% of today's applications contain at least one security flaw, and more than 23% contain high severity flaws.  
Figure: flaw prevalence by median flaw density for CWE categories (source: Veracode [11])

CWE: Common Weakness Enumeration

Top 5 application vulnerabilities

When working in Information Technology, we often prefer not to deal with security issues, mainly because we don't fully understand what can happen and why.

Let's try to learn about, understand, and briefly demystify the five most common security flaws found in today's software.

n°1 - Information Leakage (CWE-200)

When information is unintentionally made available to end users, it can be used to breach the security of an application. This information is not leaked during network transfer; it is openly accessible to anyone, even though it shouldn't be.

It can expose details about the organization, the software's design, the deployment process, and more. Such sensitive data may include credentials, configuration files, database structure, and authentication tokens.
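One common mitigation is to redact sensitive values before any configuration is logged or returned by a debug endpoint. Here is a minimal sketch; the key names and the `redact` helper are illustrative assumptions, not from any specific framework:

```python
# Hypothetical helper: redact sensitive keys from a configuration dict
# before it is logged or exposed. The key list is an illustrative
# assumption; real applications should adapt it to their own secrets.
SENSITIVE_KEYS = {"password", "token", "secret", "api_key"}

def redact(config: dict) -> dict:
    """Return a copy of the config with sensitive values masked."""
    return {
        key: "***" if key.lower() in SENSITIVE_KEYS else value
        for key, value in config.items()
    }

print(redact({"db_host": "localhost", "password": "hunter2"}))
# -> {'db_host': 'localhost', 'password': '***'}
```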

n°2 - CRLF Injection (CWE-93)

When reading CRLF, developers automatically think of the characters Carriage Return and Line Feed (\r\n) used as an End Of Line (EOL) sequence, and it's exactly what this is about.

By injecting a CRLF sequence into an HTTP parameter or a URL, for instance, an attacker can split an HTTP response to inject headers or content of their own, or forge entries in log files.
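The classic defense is to strip CRLF characters from any user-supplied value before it reaches a header or a log line. A minimal sketch, where the `sanitize` helper and the example payload are hypothetical:

```python
# Hypothetical helper: remove CRLF sequences from user-controlled
# strings so they cannot start a new HTTP header or forge a log entry.

def sanitize(value: str) -> str:
    """Strip carriage returns and line feeds from a user-supplied value."""
    return value.replace("\r", "").replace("\n", "")

malicious = "ok\r\nSet-Cookie: session=attacker"
print(sanitize(malicious))  # -> "okSet-Cookie: session=attacker"
```

With the CRLF pair removed, the injected `Set-Cookie` fragment stays inside a single header value instead of becoming a header of its own.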

n°3 - Cryptographic Issues (CWE-310)

Data confidentiality and integrity are key in today's businesses. This matter is addressed through encoding techniques, encryption, and hashing algorithms.

Although we would like to think we all encrypt our data and hash our critical secrets, this category of security weaknesses still ranks third. This suggests it is not addressed everywhere it should be, or not in the right way, leading to data degradation or corruption.
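A typical example of "not in the right way" is storing passwords with a fast, unsalted hash. A minimal sketch of the safer approach using Python's standard library (the function names and iteration count are this example's choices, not a universal standard):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a slow, salted hash; store (salt, digest), never the password."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)
```

The salt defeats precomputed rainbow tables, the high iteration count slows brute force, and `hmac.compare_digest` avoids timing side channels.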

n°4 - Code Quality (CWE-398)

The fourth weakness is purely human, since it implicates our own coding habits and the quality of our processes and deliveries. These weaknesses are obviously not intentional, but they reveal a lack of rigour during the development or maintenance phase of a piece of software. To be honest, deadline rushes probably bear part of the responsibility here.

Poor-quality code can obviously lead to unexpected behaviours, opening breaches into any software or application.

n°5 - Credential Management (CWE-255)

Credential management determines the scope of information users can access from an application. It's only logical that it represents one of the most common vulnerabilities that can be found in software development.

This weakness covers a lot of different issues, such as:

  • unprotected credential storage
  • plaintext password storage
  • passwords in configuration files
  • weak passwords and weak password encoding
  • passwords and keys with no expiration date, or a very distant one
  • weak password recovery strategies
  • unprotected transport of credentials
  • ...and more

Even though credential management is known to be a major issue and is often addressed seriously by software companies and developers, they frequently rely on off-the-shelf solutions. Developers put their trust in the default behaviour of such software without really digging into the details of how it operates and the options that may be available.

Fortunately, today this might be one of the simplest weaknesses to address.
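The "password in configuration files" item above, for instance, is usually solved by injecting the credential through the environment at deploy time. A minimal sketch, assuming a hypothetical `DB_PASSWORD` variable set by the platform (e.g. from a Kubernetes Secret):

```python
import os

def get_db_password():
    # Assumption: the deployment platform injects DB_PASSWORD at runtime
    # (e.g. a Kubernetes Secret exposed as an environment variable), so
    # no credential is ever committed to the repository.
    password = os.environ.get("DB_PASSWORD")
    if not password:
        raise RuntimeError("DB_PASSWORD is not set; refusing to start")
    return password
```

Failing fast when the variable is missing beats silently falling back to a default credential.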

Other security weaknesses

As we only looked into the top five software vulnerabilities, here is the full list of vulnerabilities and their frequency, from the latest Veracode study [11].

Figure: percentage of applications with specific CWE types (source: Veracode [11])

As we can see, the scope of security issues is very large, and they are widely present in most applications and software.

Should we not use third parties then?

We could imagine that relying on external sources is a risk not worth taking, but that is not true at all.

In fact, not relying on third parties would be a very bad idea for any business.

Not using third parties would mean a few things:

  • you are willing to lose an enormous amount of time re-coding all the libraries and software you need
  • you think your code will have fewer vulnerabilities than code everyone has been using for years or months (often written and reviewed by a large community of developers)
  • the time lost on the 90% of software you could simply pick off the shelf is time you won't spend on the 10% of value you actually bring

The consequences are predictable: a slower go-to-market, a delay behind your competition, and slower development of new features.

You need third parties, even flawed ones, because we cannot, should not, and do not want to reinvent the wheel.

You need to accept that your code is going to be flawed, and so will be the software and libraries you use. By knowing the risks, knowing the libraries your third parties depend on, and acknowledging that you'll need time to fix security issues, you prepare yourself for whatever may happen.

By knowing the risks, you will be able to answer your customers and communicate clearly on your security policies and how you adapt to them.

Off-the-shelf services and applications allow us to move faster. Businesses use third parties for the purpose they serve: monitoring, logging, storage, caching, analytics, and much more, focusing their principal effort on the value they want to bring to their own customers.

Granting special authorization and accesses

When using open-source libraries, we cannot do much more than know what we use and assess the risks of our technical choices. An acknowledged and measured security risk is worth taking, as a zero-risk policy is unattainable.

When it comes to containers, it's a different story: since they run on servers we manage (or that are managed by an orchestrator such as Kubernetes), we can add security layers. Of course, the libraries used will remain as they are, but isolating containers (pods) in namespaces, defining RBAC rules, and applying network policies can prevent unauthorized access to third-party software from outside our production environment.
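As a minimal sketch of such a network policy, the manifest below locks down a hypothetical "third-party" namespace so its pods only accept traffic from pods in the same namespace; all names are illustrative:

```yaml
# Hypothetical NetworkPolicy: deny all ingress to pods in the
# "third-party" namespace except traffic from that same namespace.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: restrict-third-party
  namespace: third-party
spec:
  podSelector: {}          # applies to every pod in the namespace
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector: {}  # allow only same-namespace pods
```

Note that network policies are only enforced if the cluster's network plugin supports them.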

Continuous Integration and Continuous Deployment

CI/CD software has been a huge improvement for the software industry, as it allows us to automate our testing, integration, and deployment processes. Unless you are a CI/CD software provider, there is very little chance that you would want to build your own.

Nonetheless, you need CI/CD software, and you must decide between a Software as a Service solution and an on-premise one. Your entire development and production workflow will rely on this specific third party.

Now, if we go back to our Kubernetes cluster, we should not just configure our CI/CD to "work with" the cluster; we need to consider it as an external user, and this user will need at least:

  • access to a Container Registry (where you store the containers to test and deploy)
  • access to the different namespaces of your cluster depending on the application/container you are deploying

One pipeline does not need access to the entire cluster, only to a dedicated part of it, through a dedicated user. Likewise, not all your team members need access to every feature of your CI/CD software.
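Such a dedicated, namespace-scoped identity can be sketched with Kubernetes RBAC as below; the "staging" namespace, the `cicd-deployer` account, and the granted verbs are assumptions to adapt to your own pipeline:

```yaml
# Hypothetical RBAC setup: a dedicated identity for the CI/CD pipeline,
# allowed to manage Deployments only in the "staging" namespace.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: cicd-deployer
  namespace: staging
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: deployer
  namespace: staging
rules:
  - apiGroups: ["apps"]
    resources: ["deployments"]
    verbs: ["get", "list", "create", "update", "patch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: cicd-deployer-binding
  namespace: staging
subjects:
  - kind: ServiceAccount
    name: cicd-deployer
    namespace: staging
roleRef:
  kind: Role
  name: deployer
  apiGroup: rbac.authorization.k8s.io
```

Because a Role (rather than a ClusterRole) is used, a compromised pipeline token cannot touch anything outside the staging namespace.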

As the main entry point to your production environment, your CI/CD must be carefully configured, and every team member should be aware of the issues they might unintentionally cause (container packaging issues, library vulnerabilities, broken namespace isolation) and understand their responsibilities.

Using an open-source library in which a vulnerability is found is also the responsibility of the developer or DevOps engineer who chose it. People are responsible for their code, dependencies, configuration files, and so on.

Software design criteria

Security concerns are inevitable, and so is the use of third-party software. In today's world, we need to move fast and be efficient; re-coding everything we need, when such an amazing open-source community and software ecosystem is available, would simply be a waste of time.

We need to understand and be aware of the risks and issues that may lie ahead. To do so, the first step might simply be to treat security as a main criterion when choosing an off-the-shelf service, as we design our architecture and software.

Are we aware of what we are using? Are we listing all our dependencies? Are we updating every one of our third parties every time a patch is available?
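Answering the "are we listing all our dependencies?" question can start with something as simple as enumerating what is actually installed. A minimal sketch using Python's standard library (the `installed_packages` helper is this example's own name):

```python
import importlib.metadata

def installed_packages():
    """Return a sorted list of (name, version) for every installed
    distribution - a first step toward a real dependency inventory."""
    return sorted(
        (dist.metadata["Name"] or "unknown", dist.version)
        for dist in importlib.metadata.distributions()
    )

for name, version in installed_packages():
    print(f"{name}=={version}")
```

In practice you would feed such an inventory into a vulnerability scanner rather than read it by hand, but you cannot patch what you cannot list.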

A risk you know is a risk you are not scared of, and are prepared to deal with.

A true story - just for "fun"

Let's end this article on a "lighter" note, with a true story about the consequences of a security flaw.

Once upon a time, a company was using a CI/CD Software as a Service, and had added its cloud provider credentials to it so the pipelines could spawn instances once per day to run tests. One day, the CI/CD company was breached, and it told its customers to change their credentials, which they did, replacing the credentials in the CI/CD with new ones.

A year later, their cloud provider contacted them about some odd activity: their invoice had unexpectedly increased over the previous four days. After investigating, they discovered that "someone" had spawned hundreds of expensive instances on their account to mine Bitcoin.

So what had happened? It turned out that the leaked credentials had been removed from the CI/CD software, but never revoked on the company's cloud provider account, so they remained valid. The incident ended up costing $12,000 in extra cloud services (not counting the time developers and DevOps engineers spent on the investigation)...

Sources used in this article