Streamline and shorten release cycles
Provide visibility into the release process through automation
Automate testing from unit- to system-level tests
Gain confidence in code changes, automatically
Digitalize the use of OSS packages/libraries
Visibility and management of licenses and security vulnerabilities
Infra & Analytics
GitOps: Repeatable, consistent and resilient infrastructure
Make sense of your data, from Dev to Delivery
Version control tools control the way we parallelize work and integrate changes together. Distributed version control tools like Git provide great flexibility and control over different agile workflows with high levels of automation through different CI/CD tools.
Lots can be done to improve software builds. Modern build systems provide much better dependency management and high-level abstractions to help integrate a variety of different tasks. Furthermore, how can you truly trust your build until you can guarantee that it works in an isolated, immutable environment? We want to create build systems that can be trusted: always repeatable and optimized for performance.
Building quality and security into software requires building quality and security into the development process. Our vast knowledge of testing techniques and tools can enable your team to truly deliver to the highest possible quality, whilst building awareness and knowledge along the way. By applying quality gateways in the development process, we ensure this quality is integrated right from the start.
Managing build artifacts is like managing source code. Creating build artifacts daily that can be used for testing and having a well-defined promotion strategy are key to enabling Continuous Delivery. After all, it is not the source code that we want to ship but the output of the build. Additionally, we typically want to create different flavours of the build; a debug build for the bulk of testing and optimized release builds for non-functional testing.
Manual testing is no longer an option if you want to stay competitive. We can help implement automated functional, performance and system security testing to give you the confidence to deliver, continuously, and increase your competitive advantage. Digitalization of requirements and tests is a necessary step, enabling you to understand the impact of change and trace test results to ensure software does not regress.
Different levels of static code analysis help identify and remediate different kinds of technical debt. For example, static analysis can be used to:
- Enforce an intended architecture
- Comply with coding guidelines: style, quality and security
- Identify 3rd party component licenses and security vulnerabilities
- Identify development bottlenecks through different kinds of hotspots
Enforcing good design helps build more evolvable and maintainable software. Verifa have created a model for defining “Architecture as Code” allowing developers to maintain the architectural design in a YAML file together with their source code (e.g. in a git repo). This way, architecture and design go through the same process that software does: continuous development and review.
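As a rough sketch of the idea, the rules parsed from such an architecture file can be checked against the dependencies actually observed in the code. The component names, rule shape and file layout below are purely illustrative, not Verifa's actual model:

```python
# Hypothetical rules, as they might be parsed from an architecture YAML file
# kept in the same git repo as the source code. All names are examples.
ARCHITECTURE = {
    "ui":      {"may_depend_on": ["core"]},
    "core":    {"may_depend_on": ["storage"]},
    "storage": {"may_depend_on": []},
}

def check_dependencies(observed):
    """Return the dependencies that violate the declared architecture.

    `observed` is a list of (from_component, to_component) pairs, e.g.
    extracted from import statements by a separate analysis step.
    """
    violations = []
    for src, dst in observed:
        allowed = ARCHITECTURE.get(src, {}).get("may_depend_on", [])
        if dst not in allowed:
            violations.append((src, dst))
    return violations
```

Running such a check in CI turns an architecture diagram from documentation into an enforced quality gate: `check_dependencies([("ui", "core"), ("storage", "ui")])` would flag `("storage", "ui")` as a violation.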
Make your software designs matter by enforcing them and making them part of the agile software practice.
We help companies adopt new methods and frameworks for unit testing, enabling effective tests to be written.
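As a minimal illustration of the kind of test this enables, here is a unit test using Python's built-in unittest framework; the `parse_version` function is just a stand-in for real production code:

```python
import unittest

def parse_version(s):
    """Stand-in for production code: parse '1.2.3' into (1, 2, 3)."""
    return tuple(int(part) for part in s.split("."))

class TestParseVersion(unittest.TestCase):
    # Runs automatically on every code change as part of the test suite.
    def test_simple_version(self):
        self.assertEqual(parse_version("1.2.3"), (1, 2, 3))

if __name__ == "__main__":
    unittest.main()
```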
Automating higher levels of testing, such as integration and system tests, is a necessity for agile development. These tests form a regression suite to re-verify every code change, allowing teams to develop faster with more confidence.
We help companies adopt new testing techniques for acceptance testing, and help create a robust and resilient environment for the tests to be executed in, using practices such as Infrastructure as Code and Configuration as Code.
Non-functional tests verify non-functional requirements, such as performance, robustness and security. Typically, acceptance tests can be re-purposed for this use case.
We help companies create a regression test suite that focuses not just on functionality, but also on ensuring that their systems will survive over time and withstand security attacks.
Software Composition Analysis is a technique used to identify which components are used in your project. The project can include internal proprietary components as well as open source components. Both cases must be taken into consideration to have full knowledge of the components that make up your software system and to create the bill of materials, or BOM, which is by definition the structure of the project.
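As a simplified sketch of what the output of this step looks like, a minimal BOM can be assembled from the detected components. Real SCA tools emit richer standard formats such as CycloneDX or SPDX; the field names and component data here are invented for illustration:

```python
# Hypothetical sketch: assembling a minimal bill of materials (BOM) from
# components detected during Software Composition Analysis.
def build_bom(components):
    """components: list of dicts with name, version, license and origin."""
    return {
        "bomFormat": "example",  # real tools would use CycloneDX or SPDX
        "components": sorted(components, key=lambda c: c["name"]),
    }

bom = build_bom([
    {"name": "zlib", "version": "1.2.11", "license": "Zlib", "origin": "open source"},
    {"name": "acme-core", "version": "2.0", "license": "proprietary", "origin": "internal"},
])
```

Note that both the internal proprietary component and the open source one end up in the same BOM, which is exactly the full-knowledge property the text describes.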
Once we know which components are in our project, we need to store that information somewhere. That place should make the information easy to access, with only the most relevant information clearly presented. It should also enable other support functions for open source license compliance, such as generating license documentation and acting as an integration point for other tools in the toolchain. This software component management software is called a software component catalogue, and it lists both proprietary and open source components.
The clearing process, or component clearing, is the process in which we identify the license under which a software component is licensed. This can be done using a license scanner, which scans the source code using different pattern-matching algorithms to recognise the license text, or parts of it.
Finding license-related text in source code is not a simple task, for several reasons: there may be only parts of the full license text, the license text may be slightly modified, or there may be only a reference to a URL or some file. Also, when scanning a whole software project, there will usually be several different open source licenses in it, so it must also be checked that the licenses are mutually compatible. Component clearing is not a straightforward process, which is why there should be a dedicated team to do it.
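The pattern-matching core of such a scanner can be sketched very roughly as follows. Real scanners like Fossology use far more robust matching than this; the two patterns below are illustrative fragments only:

```python
import re

# Hypothetical sketch of license detection by pattern matching.
# Each pattern matches a characteristic fragment of a license text.
LICENSE_PATTERNS = {
    "MIT": re.compile(r"Permission is hereby granted, free of charge", re.I),
    "Apache-2.0": re.compile(r"Apache License,? Version 2\.0", re.I),
}

def detect_licenses(text):
    """Return the set of license IDs whose pattern matches the given text."""
    return {name for name, pattern in LICENSE_PATTERNS.items()
            if pattern.search(text)}
```

This also shows why clearing needs human review: a modified license text, or a bare URL reference, would slip past simple patterns like these.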
It should also be noted that component clearing and license clearing are two different things. License clearing is the process where the license text is read, and the granted rights, obligations and restrictions are evaluated for each license. License clearing is done at least with help from legal, and often by legal only.
Another big matter in the world of open source compliance is the handling of vulnerabilities. An open source component, like every piece of software, can have security holes. The good thing about software being open source is that the vulnerabilities are commonly known, and therefore the fix for them is also known; because of this transparency, open source components are often more secure than their commercial counterparts.
The known vulnerabilities are listed in the NVD (National Vulnerability Database), which is hosted by the US government. The riskiest time is the window between when a vulnerability is found and published to the NVD and when a solution covering that vulnerability is made available.
A known vulnerability is always related to a certain component and its version, so identification of possible vulnerabilities in a software project should be done right after the software composition analysis: at that point we know which components are in our project, so we can simply query the NVD for vulnerabilities affecting them. Possible vulnerabilities should then be listed somewhere and, of course, fixed.
In open source compliance we often have a toolchain to perform all tasks, rather than a single tool to handle every step. We try to solve the OSS compliance workflow using OSS itself. Typically, tools like OSS Review Toolkit are used for Software Composition Analysis.
For the open source component catalogue, we have used SW360. It is a solid platform for viewing information about projects and their open source components.
One tool that is very reliable for component clearing is Fossology. It does a deep scan of source code files for license texts and provides a good platform for reviewing the scan results.
For vulnerability management we have used SW360.
GitOps is the modern term for making Git repositories the source of truth for systems. These systems could be your end product or your delivery pipeline. What does this mean? It means applying all the great practices from development to infrastructure!
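The heart of GitOps is a reconciliation loop: desired state comes from the Git repository, actual state from the running system, and an agent computes the actions needed to converge the two. A rough sketch of that idea, with invented state shapes:

```python
# Hypothetical sketch of GitOps reconciliation. Both arguments map a
# resource name to its spec (any comparable value, e.g. a replica count).
def reconcile(desired, actual):
    """Return the actions needed to make `actual` match `desired`."""
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name))
        elif actual[name] != spec:
            actions.append(("update", name))
    for name in actual:
        if name not in desired:
            actions.append(("delete", name))  # prune drifted resources
    return actions
```

Because every change flows through Git, the history of the infrastructure is versioned, reviewable and repeatable, which is what makes the resulting systems consistent and resilient.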
We have helped several teams and organizations make their journey to the cloud. We provide the confidence, the knowledge and the right technologies to help them take off to the cloud!
Monitoring is an important part of any system, whether it be for infrastructure or applications. Monitoring helps us take proactive measures towards maximising uptime and preparing for potential scalability issues.
Using social code analysis we identify bottlenecks in your development velocity. By combining change history together with software complexity we correlate parts of code that are frequently changed with those that are complex to quickly identify candidates for refactoring to speed up your development.
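A rough sketch of this hotspot analysis: rank files by change frequency multiplied by a complexity measure (here, lines of code as a crude proxy). The file names and numbers are invented:

```python
# Hypothetical hotspot ranking: frequently-changed AND complex files first.
def hotspots(change_counts, complexity):
    """Rank files by (number of changes * complexity), highest first.

    change_counts: file -> commits touching it (from version control history)
    complexity:    file -> a complexity proxy, e.g. lines of code
    """
    scores = {f: change_counts[f] * complexity.get(f, 0)
              for f in change_counts}
    return sorted(scores, key=scores.get, reverse=True)
```

A file that is both large and constantly changing scores far higher than a large-but-stable or small-but-busy one, making it the prime refactoring candidate.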
Our assessments include interactive workshops, helping everyone to understand and appreciate different roles, and to see how better teamwork can lead to better results.
Visual Release Process
Using physical cards and notes we help extract your release process into a visual view, with all relevant stakeholders, providing a bird’s eye view to what goes into a release.
The outcome of the assessment should be a clear and well prioritised list of next steps to help you on your journey towards Continuous Delivery & DevOps.