As technology continues to evolve, speed matters more than ever – and not just the speed of our devices.

Software developers are also expected to deliver new products and services more quickly.

These demands have changed the way new apps are written. Most now take a modular approach, recycling and reusing code to avoid duplicating work. This lets developers ‘plug in’ code from other sources, connecting their apps to third-party services quickly and easily to make them more useful.

You can think of the process as being a bit like Lego. Each brick is a module of code that the developer clicks together with others to build an all-new app. And some of those pre-made bricks come from other companies and programmers.
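
In code, that ‘plugging in’ is usually just an import. Here’s a minimal TypeScript sketch of the pattern; the package name and function are hypothetical, standing in for any third-party module:

```ts
// "example-payments-sdk" is a hypothetical third-party package.
// The developer can call it, but can't see or change what it does inside.
import { sendPayment } from "example-payments-sdk";

async function checkout(orderId: string, amountPence: number): Promise<void> {
  // One line connects the app to an external service. Any bug or
  // tracking behaviour inside the module ships with the app unchanged.
  await sendPayment({ orderId, amountPence });
}
```

Every import like this is another Lego brick, and another piece of code the developer didn’t write and usually can’t audit.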

So what’s the problem?

In general, modular development is a great idea for bringing new apps to market faster. But it also means developers place a lot of trust in the security of these third-party modules – particularly as they often cannot inspect, change or update the code they use.

This lack of transparency can be a problem. If a module contains a bug or a security flaw, the developer won’t know about it. Worse still, they may then replicate that flaw in their own app, potentially exposing their users to hackers and cybercriminals.

Give me an example

This element of the unknown isn’t just a cybersecurity risk – sometimes it creates privacy issues too. Facebook provides a number of code modules and APIs to help developers connect to its services easily.

However, these modules typically include tracking code, allowing Facebook to ‘see’ what users are doing and to add even more detail to each user’s profile, which is then used to sell ads.
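
The Meta Pixel is a good illustration. Once its loader boilerplate has run, the snippet a developer pastes into a page boils down to something like this simplified sketch (the pixel ID here is a placeholder):

```ts
// Simplified sketch of the Meta Pixel. The real snippet loads fbevents.js
// from Facebook's servers, which defines the global fbq() function.
declare function fbq(command: string, ...args: unknown[]): void;

fbq("init", "0000000000"); // placeholder pixel ID identifying the site owner
fbq("track", "PageView");  // reports this page view back to Facebook
```

Because the snippet runs on whatever page it’s embedded in, it reports activity from that page whether or not the visit has anything to do with advertising.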

The Metropolitan Police decided to implement the Meta Pixel tool to help them understand the effectiveness of their Facebook recruitment ads. However, the tool was applied to every page on their website – including the online forms used to report sensitive crimes like sexual assault.

As a result, details of these crimes, and of the victims who reported them, were automatically forwarded to Facebook.
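
How does a recruitment tool end up on a crime-reporting form? Usually through a shared page template. The sketch below is hypothetical, but it shows the pattern: put the snippet in the layout, and every page inherits it.

```ts
// Hypothetical sketch: the tracking snippet lives in a shared layout,
// so every page on the site inherits it automatically.
const TRACKING_SNIPPET =
  `<script async src="https://connect.facebook.net/en_US/fbevents.js"></script>`;

function renderPage(title: string, body: string): string {
  return `<!doctype html>
<html>
  <head><title>${title}</title>${TRACKING_SNIPPET}</head>
  <body>${body}</body>
</html>`;
}

// The recruitment page gets the pixel, as intended...
const careersPage = renderPage("Careers", "<h1>Join us</h1>");
// ...but so does the sensitive reporting form.
const reportPage = renderPage("Report a crime", "<form>...</form>");
```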

This is embarrassing – and potentially a breach of privacy and data protection laws. It also highlights the risks that modern app developers face.

What can you do?

Incidents like these are almost always unintentional – the developers never set out to expose their users to harm. But that also means app users are completely reliant on developers to understand how to use code modules correctly, and to test their apps continuously to identify potential privacy and security risks.
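
What might that testing look like? One option is an automated check that sensitive pages never contact known tracking domains. Here’s a minimal sketch using the Playwright browser-testing library; the page URL is hypothetical:

```ts
import { test, expect } from "@playwright/test";

test("sensitive form page contacts no tracking domains", async ({ page }) => {
  const trackingRequests: string[] = [];

  // Record any outgoing request aimed at a known tracking domain.
  page.on("request", (request) => {
    const url = request.url();
    if (url.includes("facebook.net") || url.includes("facebook.com")) {
      trackingRequests.push(url);
    }
  });

  // Hypothetical URL for a sensitive reporting form.
  await page.goto("https://example.org/report-a-crime");

  // Fail the test if any tracking call left the page.
  expect(trackingRequests).toEqual([]);
});
```

A check like this won’t catch every leak, but it turns an invisible privacy problem into a failing test.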

As this problem becomes more commonplace, we can only hope that developers pay closer attention to where they source their code modules – and to the potential implications of each one.