
· 7 min read

Web Application Firewalls (WAFs) offer some of the most advanced firewall capabilities in the industry. They've evolved from traditional firewalls, which focus on network-layer traffic, into sophisticated systems that can track session state and make sense of what's happening at the application layer.

The Need for WAFs

As cyber-attacks become more advanced, climbing up the ladder of the Open Systems Interconnection model, there's a growing need for a different kind of inspection. This inspection should not only understand and make sense of network traffic but also be able to parse the "good" from the "bad" traffic. This is where WAFs come in.

WAFs can protect your systems through several means. One such method is signature-based detection, where a known attack signature has been documented, and the WAF parses the traffic, looking for a pattern match. Another method involves the application of behaviour analysis and profiling. Advanced WAFs can conduct a behavioural baseline to construct a profile and look for deviations relative to that profile.
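A signature check of this kind can be sketched in a few lines. The patterns below are drastically simplified stand-ins for real rule sets such as the OWASP Core Rule Set, purely for illustration:

```python
import re

# Illustrative only: real WAF rule sets contain thousands of tuned
# patterns; these two are simplified for demonstration.
SIGNATURES = {
    "sql-injection": re.compile(r"(?i)\bunion\b.+\bselect\b|'\s*or\s*'1'\s*=\s*'1"),
    "xss": re.compile(r"(?i)<script\b"),
}

def inspect(payload: str):
    """Return the name of the first matching signature, or None if clean."""
    for name, pattern in SIGNATURES.items():
        if pattern.search(payload):
            return name
    return None

print(inspect("id=1' OR '1'='1"))   # sql-injection
print(inspect("q=hello world"))     # None
```

Behavioural detection, by contrast, requires baselining normal traffic over time and flagging statistical deviations, which is considerably more involved than pattern matching.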

The Changing Landscape of Cyber Attacks

In the past, attacks on applications and infrastructure were carried out by individual hackers manually. However, to become more efficient and drive more results, malicious operators and organizations have largely automated and industrialized attacks through the use of distributed botnets.

The Evolution of Application Development

Applications and their development have undergone significant changes with the advent of cloud deployments, container technologies, and microservices. Developers often reuse other people's code to achieve outcomes and functionality for their applications. This has led to an increase in the use of third-party libraries during the application development process.

Attackers are aware of this and look to exploit vulnerabilities found in commonly used third-party libraries such as OpenSSL. The impact of a single well-known vulnerability multiplies with every application that reuses the affected library. WAFs and adjacent technologies can help provide gap protection in the form of signature-based and behaviour-based identification and blocking. This can help address not only known vulnerabilities and threats but also zero-day threats and vulnerabilities.

Understanding WAF Functionality

A useful starting point is the Open Web Application Security Project (OWASP) Top 10, which outlines the most prevalent vulnerabilities found in applications and walks through the means of mitigation by way of compensating controls.

Adjacent WAF technologies and functionality include:

  • API gateways
  • Bot management and mitigation
  • Runtime Application Self-Protection (RASP)
  • Distributed Denial of Service (DDoS) protection
  • Content Delivery Networks (CDNs)
  • Data Loss Prevention (DLP)
  • Data Masking and Redaction
  • Security Information and Event Management (SIEMs)
  • Security orchestration and incident response automation

By understanding the latest developments in WAF technology, you can better incorporate and integrate it with your existing and planned technology deployments, including cloud, on-premises, and hybrid topologies.

The Rise of Botnets

First, let's talk about botnets. These are networks of compromised computers controlled by hackers. Initially, botnets were used mainly for Distributed Denial of Service (DDoS) attacks. However, hackers have now industrialized botnets to automate attacks for different purposes. They can grow the size of the botnet, execute DDoS attacks, or even carry out surgical strikes against websites and applications.

What's more, hackers have started offering botnets-as-a-service on the dark web. This means that attackers can rent botnets to execute their own campaigns. It's a structured, albeit illegitimate, business model that's making cyber attacks more efficient and widespread.

The Complexity of Code and the Use of Third-Party Libraries

The past decade has seen an explosion of open-source code. This has given developers a plethora of choices about which libraries to use to minimize development effort. However, this has also opened up new avenues for attackers.

Attackers are constantly looking for vulnerabilities in commonly used libraries like OpenSSL. A vulnerability in such a common core security library can have serious security implications. Remember the Heartbleed Bug? It was a serious vulnerability in the OpenSSL cryptographic software library that allowed for the theft of information protected by SSL and Transport Layer Security (TLS) protocols.

The Advent of Microservices

Another trend in the development world is the use of microservices. These are small, discrete services that allow development teams to deploy new functionality iteratively and in quick, small sprints. However, each microservice potentially represents its own unique attack surface that can be exploited.

Developers often incorporate third-party libraries in these microservices as needed. This can introduce more individual attack surfaces and vulnerable third-party libraries, exposing your organization to additional risk.

The Challenge of Secure Application Development

Application development is like the Wild West. Developers have full freedom to pull third-party libraries from anywhere on the web. But what if they're using versions of these libraries that have been modified with backdoors or other malicious code? Or what if they're using older versions with known vulnerabilities?

The good news is that with the advent of DevOps, the ability to lock down source libraries through programmatically managed pipelines and build processes has greatly increased. However, many development teams are still in the early phases of adopting mature DevOps deployments. In the meantime, this needs to be balanced with compensating controls like regular vulnerability scanning or virtual patching and attack detection by using Web Application Firewalls (WAFs).

The Threat of Compromised Credentials

It's estimated that 50% of cyberattacks involve compromised credentials. The system of using usernames and passwords to gain access to websites is fundamentally broken, yet it persists. For attackers, using compromised credentials is the simplest way in the front door: they want to expend the least amount of effort.

Compromised Accounts: The Dark Side of the Web

When we talk about compromised accounts, we're usually referring to end-user accounts. These are the accounts that everyday users like you and me have with various online services. When a major service like Yahoo! gets hacked, the stolen credentials can be used in what's known as credential stuffing attacks.

In these attacks, bots are configured to replace the variables of username and password with the compromised data. These bots can then attempt to gain access to other services using these stolen credentials. The scary part? These repositories of hacked usernames and passwords can be found on the dark web and sold to anyone willing to pay in Bitcoin. And they're not just sold once - they can be resold over and over again.
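One common defensive countermeasure is to look for the stuffing pattern itself: a single source IP failing logins against many distinct usernames. The threshold and function names below are hypothetical, a minimal sketch rather than a production detector:

```python
from collections import defaultdict

# Hypothetical threshold: flag an IP once it has failed logins for this
# many *distinct* usernames -- a pattern typical of credential stuffing,
# where one bot cycles through a stolen credential list.
DISTINCT_USER_THRESHOLD = 5

failed_users_by_ip = defaultdict(set)

def record_failed_login(source_ip: str, username: str) -> bool:
    """Record a failed login; return True if the IP now looks like stuffing."""
    failed_users_by_ip[source_ip].add(username)
    return len(failed_users_by_ip[source_ip]) >= DISTINCT_USER_THRESHOLD

for i in range(5):
    flagged = record_failed_login("203.0.113.7", f"user{i}@example.com")
print(flagged)  # True: five distinct usernames failed from one IP
```

Real bot-management products combine many more signals (device fingerprints, timing, IP reputation), but the distinct-username heuristic captures the core idea.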

Sensitive and Privileged Accounts: A Hacker's Goldmine

Another type of account that can be compromised is a sensitive or privileged account. These are accounts that have administrative privileges over operating systems, databases, and network devices. If a hacker can gain access to these accounts, they can gain full control of a system or network.

A hacker might do this by escalating their privileges. For example, if a hacker gains access to a non-privileged account, they can then attempt to escalate their privileges by exploiting vulnerabilities in the system. This could involve identifying vulnerable software versions, researching known exploits, and then using these exploits to gain higher-level access.

Types of Attacks: Understanding the Threat Landscape

Now that we've covered the types of accounts that can be compromised, let's move on to the types of attacks that can occur. For this, we'll use the Open Web Application Security Project's (OWASP) Top 10 list, which is the industry standard for categorizing application-level vulnerabilities and attacks.

The OWASP Top 10 includes:

  1. Injection
  2. Broken Authentication
  3. Sensitive Data Exposure
  4. XML External Entities (XXE)
  5. Broken Access Control
  6. Security Misconfiguration
  7. Cross-Site Scripting (XSS)
  8. Insecure Deserialization
  9. Using Components with Known Vulnerabilities
  10. Insufficient Logging and Monitoring

Each of these attacks represents a different way that a hacker can exploit vulnerabilities in an application or system. By understanding these attacks, we can better protect ourselves and our systems.


In the world of cybersecurity, knowledge is power. By understanding the types of accounts that can be compromised and the types of attacks that can occur, we can better protect ourselves and our systems. Remember, the first step towards protection is understanding the threats we face. Stay safe out there!

· 3 min read

A New Approach to Software Integration

Flow is a concept in networked software integration that is event-driven, loosely coupled, highly adaptable, and extensible. It is defined by standard interfaces and protocols that enable integration with minimal conflict and toil. Although there isn't a universally agreed-upon standard for flow today, it's poised to drive significant changes in how businesses and other institutions integrate.

Key Properties of Flow

Flow is the movement of information between disparate software applications and services. It is characterised by the following:

  • Consumers (or their agents) request streams from producers through self-service interfaces.
  • Producers (or their agents) choose which requests to accept or reject.
  • Once a connection is established, consumers do not need to request information actively—it is automatically pushed to them as it is available.
  • Producers (or their agents) maintain control of the transmission of relevant information—i.e., what information to transmit, when, and to whom.
  • Information is transmitted and received over standard network protocols—including to-be-determined protocols precisely aligned with flow mechanics.
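The properties above can be sketched as a toy producer/consumer pair. The class and event names are illustrative assumptions, not part of any flow standard:

```python
from typing import Callable, Dict, List

Event = Dict[str, object]

class Producer:
    """Toy producer illustrating flow mechanics: consumers subscribe
    through a self-service interface, the producer decides whether to
    accept, and once connected, events are pushed -- consumers never poll."""

    def __init__(self):
        self._subscribers: List[Callable[[Event], None]] = []

    def subscribe(self, callback: Callable[[Event], None], authorized: bool) -> bool:
        # The producer (or its agent) chooses which requests to accept.
        if not authorized:
            return False
        self._subscribers.append(callback)
        return True

    def publish(self, event: Event) -> None:
        # The producer controls what is transmitted, when, and to whom.
        for callback in self._subscribers:
            callback(event)

received = []
producer = Producer()
producer.subscribe(received.append, authorized=True)
producer.subscribe(received.append, authorized=False)  # rejected
producer.publish({"type": "order.created", "id": 42})
print(received)  # [{'type': 'order.created', 'id': 42}]
```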

Flow and Integration

Flow and event-driven architectures are exciting as they are crucial in our economic system's evolution. We are quickly digitising and automating the exchanges of value—information, money, and so on—that constitute our economy. However, most integrations we execute across organisational boundaries today are not in real-time, and they require mostly proprietary formats and protocols to complete.

The World Wide Flow (WWF)

Taken together, this global activity graph forms the World Wide Flow (WWF). The WWF promises to democratise the distribution of activity data and create a platform on which new products or services can be discovered through trial and error at low cost. The WWF promises to enable automation at scales ranging from individuals to global corporations or even global geopolitical systems.

Flow and Event-Driven Architecture

Event-driven architecture (EDA) is the set of software architecture patterns in which systems utilise events to complete tasks. Like APIs, EDA is a loosely coupled method of acquiring data where and when it is needed. However, its passive, push-based nature eliminates many of the time- and resource-consuming aspects of retrieving "real-time" data via APIs. EDA provides a more composable and evolutionary approach to building event and data streams.

The Ancestors of Flow

There are plenty of examples of real-time data passed between organisations today, but most don't flow, as defined here. Generally, existing streaming interfaces are built around proprietary interfaces. The producer typically designs the APIs and other interfaces for its own purposes, or for a specific use case, such as industrial systems.

Code and Flow

To see the basis for flow, we must look at another critical trend setting the stage: "serverless" programming, which depends on the system's flow of events and data. The increased adoption of managed streaming and queuing technologies such as Amazon Managed Streaming for Apache Kafka (Amazon MSK) or Google Cloud Pub/Sub, combined with the rapid growth of functions-as-a-service (FaaS) code packaging and execution, is a valid signal that flow is already in its infancy.
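The FaaS model at the heart of this trend can be sketched as a tiny event dispatcher: functions sit idle until a matching event flows in. The registry and event names below are illustrative, not any vendor's API:

```python
# Minimal sketch of the FaaS model: functions are registered against
# event types and invoked only when a matching event arrives.
handlers = {}

def on(event_type):
    """Decorator registering a function for an event type."""
    def register(func):
        handlers[event_type] = func
        return func
    return register

@on("file.uploaded")
def make_thumbnail(event):
    return f"thumbnail generated for {event['name']}"

def dispatch(event):
    handler = handlers.get(event["type"])
    return handler(event) if handler else None

print(dispatch({"type": "file.uploaded", "name": "cat.png"}))
# thumbnail generated for cat.png
```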

In conclusion, flow is a promising concept that could revolutionise integrating software and systems. Flow could unlock a new level of automation and efficiency in our digital economy by enabling real-time, event-driven communication between disparate systems.

· 3 min read

OpenAPI is a specification for designing and describing RESTful APIs. OpenAPI extensions (also called specification extensions) are a way to add vendor-specific or custom fields to an API definition. They are defined as fields whose keys start with "x-", for example "x-vendor-field". The contents and meaning of these extensions are specific to the vendor or tool using them and are not part of the OpenAPI specification itself.

OpenAPI extensions can help designers in several ways:

  • Adding custom fields: Extensions allow designers to add custom fields to the OpenAPI specification, which can provide additional information about the API and enhance the design.

  • Enhancing tool support: By using extensions, designers can add functionality specific to their tools or workflows and improve the tool support for their API design.

  • Improving collaboration: Extensions can be used to share additional information between different teams and stakeholders involved in the API design process, enhancing collaboration and communication.

  • Supporting vendor-specific features: Extensions can support vendor-specific features, such as particular security protocols or data formats, that the core OpenAPI specification may not support.

  • Streamlining development: By using extensions, designers can simplify the development process and ensure that all necessary information is included in the specification, reducing the risk of miscommunication or misunderstandings.


The "x-badges" extension in OpenAPI specifications allows designers to display badges, or small graphical elements, in the API documentation. These badges can be used to provide additional information about the API or to highlight specific features.

Here are some of the ways that "x-badges" can help with OpenAPI specifications:

  • Showing API status: Badges can be used to indicate the status of an API, such as "beta" or "deprecated." This information helps developers understand the current state of the API and whether it is appropriate to use.

  • Highlighting important information: Badges can highlight important information about the API, such as the version number, release date, or supported platforms. This information can be displayed prominently in the API documentation, making it easier for developers to find.

  • Providing visual cues: Badges can give visual cues that draw attention to specific information in the API documentation. This makes it easier for developers to find the information they need quickly.
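As a concrete illustration, a hypothetical "x-badges" entry might look like the fragment below; the exact schema is defined by whichever documentation tool consumes it, not by OpenAPI itself:

```yaml
# Illustrative fragment -- the x-badges schema is tool-specific.
paths:
  /orders:
    post:
      summary: Create an order
      x-badges:
        - name: Beta
          color: orange
```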

Overall, the "x-badges" extension in OpenAPI specifications provides a simple and effective way to display additional information about the API. By using badges, designers can improve the usability and clarity of their API documentation.


Extensions can also carry information that tooling acts on directly:

  • Providing sample code: The "x-code-sample" extension can be used to include sample code snippets for different programming languages. This can help developers understand how to use the API and make it easier for them to get started.

  • Defining authentication information: The "x-client-id" and "x-client-secret" extensions can be used to define the client ID and secret required for authentication with the API. This can help ensure that developers have the information they need to properly use the API.

  • Enforcing security measures: The "x-pkce-only" extension can be used to enforce the use of Proof Key for Code Exchange (PKCE) in OAuth 2.0. This is a security measure that helps prevent unauthorized access to an API.
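These extensions are likewise vendor-specific, so their exact names and shapes vary between tools. A hypothetical fragment combining them might look like:

```yaml
# Hypothetical fragment -- extension names and shapes vary by vendor tool.
paths:
  /oauth/token:
    post:
      summary: Exchange an authorization code for a token
      x-pkce-only: true        # tooling hint: only PKCE-based flows allowed
      x-code-sample:
        - lang: shell
          source: curl -X POST https://auth.example.com/oauth/token
```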


In summary, the OpenAPI extension allows designers to provide additional information and constraints to the API definition, making it easier for developers to understand and use the API. By using extensions, designers can improve the usability and security of their APIs.

· 6 min read

The growth of technology has resulted in APIs becoming a critical component of modern software development. They serve as the means of communication between different systems, allowing for the exchange of data and the execution of specific functions. As the importance of APIs continues to rise, adopting a more structured and well-designed approach to their development has become imperative. One such approach is the "API First" culture, which prioritises the design and development of APIs at the forefront of the software development process.

In this blog post, we will delve into the significance of an "API First" culture for enterprises and provide examples of companies that have suffered from not adopting this approach. We will also examine the benefits of this culture and explain why it is a superior approach to the traditional "Code First" culture.

An "API First" culture recognises APIs' critical role in modern software development and strongly emphasises their design and development. This approach ensures that APIs are well-designed, user-friendly, and secure. Organisations can improve the user experience, increase security, and simplify maintenance processes by prioritising the creation of APIs.

However, not all organisations fully embrace an "API First" culture. As a result, some companies have suffered from not adopting this approach, resulting in poorly designed APIs that are difficult to use and maintain. This can lead to decreased user adoption, increased development costs, and decreased overall project success.

Compared to the traditional "Code First" culture, an "API First" culture is a better approach. The "Code First" culture prioritises code development, with the design of APIs being an afterthought. This approach can lead to poorly designed APIs that are difficult to use and maintain. In contrast, an "API First" culture places the design and development of APIs at the forefront of the software development process, ensuring they are well-designed and user-friendly.

In short, an "API First" culture is essential for modern software development and has numerous benefits over the traditional "Code First" culture. By placing the design and development of APIs at the forefront of the software development process, organisations can ensure that their APIs are well-designed, user-friendly, and secure.

Why "API First" is Better than "Code First"

The "Code First" culture is a traditional approach to software development where developers begin coding without first defining the API. Unfortunately, this approach can lead to several problems, including:

  • Lack of standardisation: Without a well-defined API, different systems and teams may use different methods to communicate with each other, leading to a lack of standardisation.

  • Difficulty integrating with other systems: Code-first approaches can make incorporating new systems and technologies into the existing architecture challenging.

  • Lack of scalability: Code-first approaches can make it challenging to scale applications as new systems and services are added to the architecture.

On the other hand, the "API First" culture prioritises the design and development of APIs, making it easier to ensure standardisation, scalability, and integration. By starting with the API, developers can:

  • Define a clear and consistent interface for communication between different systems and services.

  • Design the API to be scalable and flexible, making integrating new systems and technologies easier as they become available.

  • Ensure the API is well-documented, making it easier for other teams and developers to understand and use.
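Concretely, "starting with the API" means a contract like the hypothetical sketch below exists and is reviewed before any implementation code is written:

```yaml
# A contract like this is agreed on *before* implementation begins;
# the path and fields here are purely illustrative.
openapi: 3.0.3
info:
  title: Orders API
  version: 1.0.0
paths:
  /orders/{orderId}:
    get:
      summary: Fetch a single order
      parameters:
        - name: orderId
          in: path
          required: true
          schema:
            type: string
      responses:
        "200":
          description: The requested order
        "404":
          description: No order with that ID
```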

Learning the hard way...

Here are some real-world examples of companies that have suffered from not adopting an "API First" approach.

Several companies have suffered from not adopting an "API First" approach. One such example is Twitter. In the early days of Twitter, the company focused on growing its user base and did not strongly emphasise the development of APIs. Unfortunately, this led to a proliferation of third-party applications that used Twitter's data in unapproved and often unreliable ways.

Another example is Uber. In the company's early days, the focus was on building the core service, and APIs were not a priority. Unfortunately, this led to a fragmented ecosystem of third-party applications that used Uber's data and services in inconsistent and often unreliable ways.

Both examples illustrate the importance of an "API First" culture, as companies prioritising the development of APIs can better ensure standardisation, scalability, and integration.

Benefits of an "API First" Culture

An "API First" culture has several benefits, including:

  • Improved Standardisation: By defining APIs before starting to code, organisations can ensure that different systems and services use a consistent and standardised approach to communication.

  • Better Integration: API First approaches make integrating new systems and services into the existing architecture easier, as the API provides a clear and consistent interface for communication.

  • Improved Scalability: API First approaches make it easier to scale applications, as the API can be designed to be flexible and scalable from the start.

  • Improved Documentation: Building a market-leading product also requires a great developer experience, and APIs that are designed first tend to ship with clearer, more complete documentation.

  • Better User Experience: By designing APIs first, organisations can ensure that their applications provide a consistent and seamless user experience, regardless of the device or platform used.

  • Faster Time to Market: By prioritising the development of APIs, organisations can reduce the time required to bring new products and services to market. The API provides a clear and consistent interface for integration with other systems and services.

  • Increased Innovation: An API First culture encourages innovation, making it easier for developers to integrate new technologies and services into the existing architecture. This can lead to the development of new and innovative products and services.

How to Adopt an "API First" Culture

Adopting an "API First" culture within the enterprise requires a shift in mindset and approach. Here are some steps that organisations can take to embrace an API First culture:

  • Define API standards: Establish clear standards and guidelines for API design and development. This will help to ensure consistency and standardisation across the organisation.

  • Prioritise API development: Make the development of APIs a priority, and allocate sufficient resources and time to the API development process.

  • Foster collaboration: Encourage collaboration between different teams and departments, including product management, design, development, and testing.

  • Invest in API management tools: API gateways and API management platforms help manage and monitor API usage. They also simplify common non-functional requirements like rate limiting, caching and mocking responses.

  • Encourage innovation: Developers and teams should be creative and innovative when designing and developing APIs. This can lead to the development of new and innovative products and services.


An "API First" culture is essential for organisations that want to ensure standardisation, scalability, and integration in their software development processes. Organisations can improve the user experience, reduce development time, and increase innovation by prioritising the design and development of APIs.

Adopting an API First culture requires a shift in mindset and approach, but the benefits are substantial. Organisations that invest in API development and management will be well-positioned to compete in the digital marketplace and deliver innovative products and services to their customers.


· 4 min read

APIs have become an integral component of modern software development as technology evolves. They serve as a means of communication between disparate systems, enabling the exchange of data and the execution of specific functions. As the importance of APIs grows, it is imperative to prioritize the design and implementation of these essential components to meet the needs of both users and stakeholders.

In this blog post, we will delve into some of the best practices for API architecture. This will encompass various topics, including design, security, and maintenance. By adhering to these guidelines, you can ensure that your APIs are reliable, scalable, user-friendly, and easy to maintain.

A well-designed API architecture can have a significant impact on the overall success of a project. It can improve the user experience, increase security, and simplify maintenance processes. Therefore, it is essential to consider the various elements that make up a robust API architecture, including design patterns, security protocols, and data management strategies.

Design, in particular, is a critical aspect of API architecture. A well-designed API should be intuitive, straightforward, and easy to use. However, it should also be flexible enough to accommodate changing requirements and adapt to new technologies. This can be achieved through clear and consistent naming conventions, intuitive endpoints, and the implementation of appropriate error-handling mechanisms.

Moreover, the security of an API is a crucial consideration. With the increasing use of APIs, it is imperative to protect sensitive data against malicious attacks. This can be achieved through encryption and authentication protocols and by implementing strict access controls.

Maintenance is another critical aspect of API architecture. It is essential to have a well-defined process in place to ensure that APIs are up-to-date, secure, and performing optimally. This may involve regular software updates, performance monitoring, and the implementation of disaster recovery plans.

In conclusion, by following the best practices for API architecture, you can ensure that your APIs are reliable, scalable, and easy to use. Furthermore, by prioritizing design, security, and maintenance, you can create a robust API architecture that meets the needs of your users and stakeholders and supports the success of your project.

Following these guidelines ensures that your APIs are reliable, scalable, and easy to use.

Design

  • Use a consistent naming convention: Consistent naming conventions make it easier for developers to understand and use your API.
  • Use clear and concise URL structures: URLs should be intuitive, self-explanatory, and indicate the resource they are accessing.
  • Use HTTP status codes appropriately: HTTP status codes provide information about the result of an API request. Use them appropriately to indicate success or failure and provide additional information about the request's status.


Security

  • Use HTTPS: HTTPS encrypts all data transmitted between the API and client, protecting sensitive information from prying eyes.
  • Use API keys: API keys allow you to control access to your API and track usage.
  • Implement rate limiting: Rate limiting helps prevent abuse and ensures your API can handle high traffic levels.
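Rate limiting is commonly enforced at an API gateway, but the underlying token-bucket idea is simple enough to sketch; the class below is an illustrative toy, not a production limiter:

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter: each client gets `capacity`
    requests up front, refilled at `rate` tokens per second."""

    def __init__(self, capacity: float, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=3, rate=1.0)  # burst of 3, then 1 request/second
print([bucket.allow() for _ in range(4)])   # [True, True, True, False]
```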


Maintenance

  • Monitor API usage: Regularly monitoring API usage helps you understand its use and identify potential issues.
  • Document your API: Documentation is crucial for developers to understand and use your API.
  • Test your API regularly: Regular testing helps you catch and fix issues before they affect users.

Following these best practices ensures your API is well-designed, secure, and scalable. This will make it easier for developers to use and will help ensure that it continues to meet the needs of users and stakeholders over time. Effective API architecture is essential for the success of any software development project. By paying close attention to design, security, and maintenance, you can ensure that your APIs are reliable, scalable, and easy to use.
