Engineering Change Part 2: How programming languages and platform choice influence adaptability

By Declan Newman

This five-part series explores how composable architecture, emerging technologies, and a collaborative engineering culture drive meaningful transformation. It’s a practical guide for organisations looking to modernise legacy systems, unlock agility, and build future-ready digital solutions. 

This is part two: How language and platform choice influence adaptability

In it, Inviqa’s Head of Engineering, Declan Newman, explores the programming languages and platforms Inviqa uses to balance performance, flexibility, and scalability.

In a snapshot:
•    TypeScript and JavaScript power unified front-end and back-end development
•    Python drives data and AI work; Go supports high-performance services
•    CMS abstraction layers future-proof content platforms
•    PxM, DAM, and commerce platforms are selected based on client needs, not one-size-fits-all

 

 

The programming languages we speak at Inviqa

Over the years, we’ve worked with many programming languages and paradigms - from low-level C to functional Haskell and beyond. These days, we focus on a core set of languages that offer the right balance of clarity, productivity, and performance for modern software engineering.

TypeScript: TypeScript is a mainstay across much of our stack. It underpins our front-end work (typically using frameworks like Next.js) and is our preferred language for building serverless APIs and lightweight backend services. 

We value its static typing, expressive syntax, and rapid startup times - especially when running in serverless environments. Using TypeScript also enables a unified development model across both client and server codebases. 
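As a minimal sketch of the kind of lightweight serverless handler this enables, the following assumes an AWS Lambda-style event and response shape; the types and names are illustrative stand-ins, declared inline so the example is self-contained.

```typescript
// Illustrative Lambda-style event and response shapes, declared inline;
// in a real project these would come from a package such as @types/aws-lambda.
interface ApiEvent {
  pathParameters?: Record<string, string>;
}

interface ApiResponse {
  statusCode: number;
  body: string;
}

// Static typing catches shape mismatches at compile time, and the handler
// starts quickly because it carries no heavy framework runtime.
export async function handler(event: ApiEvent): Promise<ApiResponse> {
  const id = event.pathParameters?.id;
  if (!id) {
    return { statusCode: 400, body: JSON.stringify({ error: "missing id" }) };
  }
  return { statusCode: 200, body: JSON.stringify({ id, status: "ok" }) };
}
```

Because the same types can be shared with a Next.js front end, the client and server agree on request and response shapes at compile time rather than at runtime.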

JavaScript: JavaScript still has an important place, particularly in cases where speed of iteration, flexibility, or wide ecosystem support is key. 

For small tools, quick prototypes, or scenarios where static typing isn’t required, it remains a practical and productive option that lets us leverage the vast JavaScript ecosystem. 

Python: Python is the backbone of our data and AI work. We use it for orchestration, ETL pipelines, data transformation, and machine learning integrations - particularly when working with platforms like Snowflake, Pandas, or the OpenAI API. Python’s maturity and the breadth of libraries available allow us to go from prototype to production in the data/ML space very quickly. 

Golang: When low latency, high throughput, and simplicity are critical, we reach for Go. Its lightweight concurrency model (goroutines), fast compile times, and minimal runtime overhead make it ideal for high-performance microservices, APIs, and systems integrations. 

We often use Go for platform-level services that demand both speed and reliability - such as API gateways, internal developer tools, or any component expected to handle significant traffic with tight response times. Its static binary outputs also simplify deployment, especially in containerised environments. 

Legacy and Interoperability: We maintain deep expertise with older technologies such as PHP, Perl, and Java.

This allows us to take on modernisation projects with confidence, incrementally refactor large monoliths, or integrate new services without disrupting business continuity. Whatever the legacy system, we can interface with it and gradually bring it up to modern standards.

 

The technology platforms we work with most often

Similarly, we’ve worked with many technology platforms, giving us insight not only into how each one works, including its strengths and quirks, but also into which platforms are most likely to meet a client’s needs. 
From content management systems to business intelligence tools, we have preferred platforms, but we’re just as comfortable taking on the challenge of learning a new platform when it’s best suited to a client’s requirements.

Content Management Systems

For businesses that depend on managing a large volume of content - pages, articles, product information, and so on - choosing the right CMS is critical. Having delivered CMS solutions across a wide range of sectors, we understand that one size does not fit all. 

However, we typically recommend a headless CMS architecture for flexibility, scalability, and ease of integration with modern front-end frameworks. 

Preferred enterprise CMS platforms include Contentful and ContentStack for their robust, API-first capabilities, but we’re equally experienced with open-source and lighter-weight options such as Strapi or Jamstack-oriented tools (for example, static site generators) when budget or specific requirements call for them. The priority is to select the technology that best fits your editorial workflows, business needs, and integration landscape. 

To keep things future-proof, we will often provide a CMS abstraction layer - essentially a custom middle-tier API that sits between the front end and the CMS. This isolates your front-end presentation from any one CMS’s specifics. Down the line, if you ever decide to switch CMS platforms (say from Contentful to another service), you can do so without a costly rewrite of your front-end applications. It’s about de-risking your content platform decisions and providing flexibility for future growth and changing needs.
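The abstraction-layer idea can be sketched as a vendor-neutral interface that the front end depends on, with one adapter per CMS behind it. The interface and names below are hypothetical; an in-memory adapter stands in for a real CMS client.

```typescript
// Vendor-neutral content model the front end depends on (illustrative names).
interface Article {
  slug: string;
  title: string;
  body: string;
}

// The front end only ever sees this interface, never a CMS vendor SDK.
interface ContentProvider {
  getArticle(slug: string): Promise<Article | null>;
}

// One adapter per CMS: swapping vendors means writing a new adapter,
// not rewriting front-end applications. (In-memory stand-in for a real client.)
class InMemoryProvider implements ContentProvider {
  constructor(private articles: Map<string, Article>) {}
  async getArticle(slug: string): Promise<Article | null> {
    return this.articles.get(slug) ?? null;
  }
}

// The middle-tier API exposes vendor-neutral content to the presentation layer.
async function fetchArticle(
  provider: ContentProvider,
  slug: string
): Promise<Article | null> {
  return provider.getArticle(slug);
}
```

A Contentful adapter and a ContentStack adapter would each implement `ContentProvider`, so switching between them is a configuration change rather than a front-end rewrite.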

Product Experience Management

PxM is the next iteration of product information management (PIM) systems, an evolution that helps meet rising customer expectations for personalised and contextualised product experiences.

Product Experience Management (PxM) is all about ensuring that product information is accurate, consistent, and compelling across all digital channels. A good PxM implementation ensures that customers see the right product content – complete, high-quality, and tailored to their context – regardless of where they interact with your brand. This is more than just a back-office tool; it’s a driver of customer engagement and commercial performance.

It’s especially vital for businesses with large or complex product catalogues, where manually managing content for every SKU across multiple markets or languages can quickly become unmanageable. PxM provides the tools and processes to centralise, enrich, and distribute product data efficiently, and we design PxM solutions to integrate tightly with existing commerce platforms, content systems, and data sources. 

In partnership with Akeneo, a leading PxM platform, we deliver scalable, extensible product data management solutions and ensure that PxM is not an isolated system but a core part of your digital experience strategy. For organisations managing millions of product records or launching in new regions, Akeneo helps streamline workflows, maintain governance, and support omnichannel consistency. 

Digital Asset Management 

Digital asset management goes hand-in-hand with content management, especially for organisations dealing with large volumes of images, videos, and other media. We ensure that DAM is an integral part of the architecture when needed, so your rich media assets are well-organised, easily searchable, and properly permissioned. 

We have experience implementing both standalone DAM platforms and DAM solutions tightly integrated with CMS products. The right approach depends on your asset volume, workflow complexity, and user-access requirements. Key considerations include taxonomy design (so assets are logically categorised), metadata schemas (to describe assets for search and compliance), asset versioning, and workflow automation (for approvals, publishing, or expiry of assets). 

Whether you’re managing a library of marketing images, an archive of videos, or any digital content at scale, Inviqa will design a DAM strategy that fits seamlessly into your overall solution. The goal is to let your team and your applications retrieve the right asset at the right time, every time, without hassle.

Commerce 

Whether you're launching a new online store or scaling an established commerce platform, we’ll help you deliver a seamless and secure shopping experience tailored to your business. Our approach is grounded in practical experience: we’ve implemented successful online stores using platforms like Shopify, BigCommerce, and Adobe Commerce (Magento), and we know how to extend these out-of-the-box solutions with custom functionality when needed. 

We routinely build and integrate microservices to handle bespoke requirements that go beyond the standard platform features - for example, custom promotional logic, dynamic pricing engines, or complex fulfilment orchestration workflows. This modular architecture ensures your commerce system can evolve and scale without becoming fragile or overly complex. If a particular feature can’t be easily achieved within the commerce platform, we build the missing piece as an independent, well-integrated service. 
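To make the idea of extracting bespoke logic into its own service concrete, here is a hypothetical sketch of the core of a custom promotions service: pure pricing logic kept outside the commerce platform so it can evolve independently. All names and rules are illustrative, not a real client implementation.

```typescript
// Basket line item; prices are in minor units (e.g. pence) to avoid
// floating-point rounding issues in money arithmetic.
interface LineItem {
  sku: string;
  unitPrice: number;
  quantity: number;
}

// Each promotion computes the discount (in minor units) it grants a basket.
interface Promotion {
  discountFor(items: LineItem[]): number;
}

// Example rule: a percentage off when the basket total exceeds a threshold.
class ThresholdPromotion implements Promotion {
  constructor(private threshold: number, private percentOff: number) {}
  discountFor(items: LineItem[]): number {
    const total = items.reduce((sum, i) => sum + i.unitPrice * i.quantity, 0);
    return total > this.threshold
      ? Math.floor((total * this.percentOff) / 100)
      : 0;
  }
}

// The service's core: apply all active promotions and never price below zero.
function priceBasket(items: LineItem[], promos: Promotion[]): number {
  const total = items.reduce((sum, i) => sum + i.unitPrice * i.quantity, 0);
  const discount = promos.reduce((sum, p) => sum + p.discountFor(items), 0);
  return Math.max(0, total - discount);
}
```

Because the commerce platform only calls this service over an API, new promotion types can be added, tested, and deployed without touching the platform itself.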

Integration with payment providers is carried out with an unwavering focus on security and compliance. We have experience working with a broad range of payment gateways - including Stripe, Worldpay, PayPal, and Amazon Pay - and we ensure that all transactions meet PCI DSS security standards. From checkout to refund handling, we make sure every payment touchpoint is robust and safe. 

Importantly, we never advocate complexity for its own sake. Where possible, we recommend the simplest, most reliable technology that will meet your current and future business goals. Sometimes that means leveraging a proven SaaS platform like Shopify; other times it means delivering a fully custom headless storefront backed by scalable APIs. In every case, we work with you to design a commerce architecture that delivers value from day one and adapts as your business grows.

Business Intelligence 

From real-time event pipelines to business dashboards, our data engineering and business intelligence solutions are grounded in proven technology choices and architecture patterns that prioritise scalability, clarity, and actionable insight. 

We predominantly use event-driven architectures to move and transform data. For example, we’ll ingest and process events through messaging systems like Amazon SQS, SNS, Apache Kafka, or RabbitMQ to decouple producers and consumers. This means that as your data volumes grow, or as new systems need to consume the data, we can scale out without bottlenecks or tight coupling between components. 
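The decoupling pattern itself can be shown with a deliberately simplified in-memory event bus; in production the queue would be SQS, Kafka, or RabbitMQ, and delivery would be asynchronous and durable rather than a direct in-process call.

```typescript
// In-memory illustration of producer/consumer decoupling. This is a sketch
// of the pattern only; a real pipeline would use a durable broker.
type Handler<T> = (event: T) => void;

class EventBus<T> {
  private handlers: Handler<T>[] = [];

  // Consumers subscribe without the producer knowing who they are.
  subscribe(handler: Handler<T>): void {
    this.handlers.push(handler);
  }

  // The producer publishes and remains unaware of downstream consumers,
  // so new consumers can be added without touching producer code.
  publish(event: T): void {
    for (const h of this.handlers) h(event);
  }
}
```

Adding a new downstream system (say, a reporting store alongside an existing data lake writer) is then just another `subscribe` call; the producing service is never modified or redeployed.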

For data processing, we favour Python along with its rich ecosystem of libraries. Whether we’re streaming telemetry data into a data lake, performing Extract, Transform, Load (ETL) tasks with Pandas or PySpark, or orchestrating complex workflows with tools like dbt, Python gives us the flexibility and reliability to deliver results quickly. It’s an ideal fit for transforming and cleaning data, and for gluing together various cloud services. 

When it comes to data storage and analytics, we work with modern cloud data warehouses such as Snowflake, Amazon Redshift, or Google BigQuery - these provide the performance and scalability for large-scale analytics. On top of that, to help teams explore and make sense of the data, we integrate Business Intelligence (BI) tools like Power BI, Tableau, or Looker. These tools make it easy to visualise key metrics and enable self-service reporting, so decision-makers can get insights without needing a developer for every query. 

Whether you’re modernising legacy ETL processes, building a real-time analytics pipeline, or just need better reporting on your business data, we design solutions that are effective today and adaptable for tomorrow’s questions.

 

So, that’s a peek into the tools and tech we lean on to build smart, scalable solutions. From the languages we code in to the platforms we trust, it’s all about choosing what works best for each challenge. 

Look out for the next instalment, where Declan explores our approach to platform engineering, bespoke solutions, and keeping users secure with smart authentication strategies.