I’d like to thank the team for release 6.0. This version is a big achievement: it is the first version of the software that combines different components and links machines across different sites. Let’s go for version 7.0!
The client is a global leader in blood component, therapeutic apheresis, and cellular technologies. The corporation’s main activities are development, manufacturing, export, import, marketing, and distribution of medical devices, supplies, and accessories.
The tested product is a dedicated system intended to collect, store, and process session data from four blood transfusion devices that can be connected to the client’s local area network (LAN).
This connection allows viewing and analyzing the gathered data, adapting device and parameter settings via the customer’s PC, and communicating bi-directionally with third-party systems linked to the same LAN.
The IT solution consists of 3 applications: the web server application (WSA), the device assistant application (DAA), and the updater.
WSA is responsible for:
DAA is responsible for:
Updater is responsible for:
To ensure the software’s compliance with a range of international security standards established for healthcare-related products, the client turned to a1qa.
As the software was embedded in a medical device, its development and testing had to comply with the IEC 62304 standard, which provides a list of tasks and activities that support the safe design and maintenance of medical device software. The goal is to ensure the application does what is intended without causing any unacceptable risks.
Under IEC 62304, the IT product is assigned a safety class according to the possible effects on the patient, operator, or other people of a hazard (a potential source of harm) to which the system can contribute.
The software safety classes shall initially be assigned based on severity as follows:
The IT solution under test was assigned safety class C, which places a heavy burden on the QA team. Helping assure the quality of software whose failure may lead to such severe consequences is highly challenging and demands the complete attention of QA engineers. The stakes are too high to let a bug make it into production.
From the beginning of involvement in the project, the a1qa team adjusted to the Agile delivery model.
Considering the specific product focus and the high risk of potential harm to health, a1qa established and further fine-tuned the knowledge transfer process so that newcomers were aware of the slightest nuances of the software architecture.
Firstly, the experts immersed themselves in the specifics of the product, corresponding devices, and international standards. After that, they jumpstarted working with emulators, deploying and upgrading the app on virtual machines. They conducted smoke tests that covered the entire functionality.
a1qa’s team fast-tracked the learning process by consolidating the baseline features from multiple modules into a single solution. Thus, a full exploration of all 6 modules took 2 months instead of a year.
Prior to functional testing, the team evaluated the requirements to confirm that they complied with the primary quality characteristics and contained no logical contradictions.
They checked each new build with smoke tests aimed at ensuring the absence of critical issues that could block further testing. Then they moved on to new feature testing to make sure recently added functionality was implemented fully without affecting the logic of the product.
The regression testing scope helped confirm the faultless operation of previously introduced refinements. The engineers prepared tests to check the main logic of the project as well as the rest of the functionality.
To supplement the process, the QA team conducted defect validation. Finally, the rejected defect rate was calculated to gauge the quality of defect fixing.
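The rejected defect rate mentioned above is a simple ratio; the article does not give the exact formula, so this minimal Python sketch assumes the common definition:

```python
def rejected_defect_rate(rejected: int, total_reported: int) -> float:
    """Share of reported defects that were rejected during validation
    (e.g., not reproducible or working as intended). A low rate suggests
    accurate defect reporting and high-quality fixing."""
    if total_reported == 0:
        return 0.0
    return rejected / total_reported

# Example: 3 of 120 reported defects were rejected.
print(rejected_defect_rate(3, 120))  # 0.025
```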
To cover the extensive regression and new feature scope and fast-track the testing process, the client put test automation in place as a major time-saver.
Adherence to shift-left testing fostered the involvement of fully interchangeable, cross-functional testing professionals, so functional testing experts could also take part in automated testing activities.
Currently, the team is configuring the CI pipeline for high-level system tests; for low-level unit, component, and interface tests, it is already set up.
The team set up regular late-night and post-build test runs, performed every 5 days using GitHub Actions.
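A run every 5 days maps naturally onto a GitHub Actions cron trigger. The workflow below is a hypothetical sketch, not the project’s actual pipeline — the workflow name, runner, and entry script are assumptions:

```yaml
name: scheduled-regression
on:
  push:                    # post-build run on every push to the main branch
    branches: [main]
  schedule:
    - cron: '0 2 */5 * *'  # late-night run (02:00 UTC) roughly every 5 days
jobs:
  regression:
    runs-on: windows-latest       # assumed: the product targets Windows
    steps:
      - uses: actions/checkout@v4
      - name: Run regression suite
        run: ./run_tests.ps1      # placeholder for the project's test entry point
```

Note that `*/5` in the day-of-month field resets each month, so the interval at month boundaries can be shorter than 5 days.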
To increase transparency, the engineers are establishing reporting practices based on metrics covering the number of defects and the dynamics of failed tests.
Due to close cooperation with developers and joint performance on the project, a1qa also follows quarterly indicators established for the whole delivery team.
Although the test automation process was initiated only recently, a1qa has already made a major contribution to further speeding up time-to-market.
a1qa’s specialists performed cross-browser (Edge, IE11, Mozilla Firefox, Google Chrome, Safari) and cross-platform (Windows and SQL Server versions) tests to save time once all the planned functionality was fully implemented.
To minimize the risk of defects and fix them at the early stages, browsers were rotated from build to build for the smoke test runs. MAT (functionality) and AT (GUI) coverage types were applied to all compatibility tests.
Apart from that, the QA engineers verified the compatibility of the website layout on mobile devices (Android and iOS tablets and a range of browsers).
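The build-to-build browser rotation can be sketched as follows; the browser list comes from the text, while the rotation rule (build number modulo list length) is an assumption:

```python
# Browsers under test, as listed in the compatibility scope.
BROWSERS = ["Edge", "IE11", "Mozilla Firefox", "Google Chrome", "Safari"]

def smoke_browser_for_build(build_number: int) -> str:
    """Cycle through the supported browsers so that each one is exercised
    by a smoke run early in the cycle, not only during final full passes."""
    return BROWSERS[build_number % len(BROWSERS)]

print([smoke_browser_for_build(n) for n in range(6)])
# ['Edge', 'IE11', 'Mozilla Firefox', 'Google Chrome', 'Safari', 'Edge']
```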
Considering that the client’s solution transfers highly sensitive blood donation data, the a1qa team carried out an external security audit in accordance with a range of international standards, including IEC 62304 (Class C), HIPAA, FDA, and OWASP.
The audit focused on assessing the security level of the system, identifying its vulnerabilities, validating the version against the requirements of security regulations, and defining further steps for improvement.
A security audit strategy applied by the QA team included the following steps:
Having completed the audit, the team prepared a detailed report covering the security level of the system, the existing risks, an extensive description of the spotted vulnerabilities, and recommendations for fixing them securely.
Given the highly specific features of the tested system and the multiple blood transfusion devices connected to it, a1qa came up with a customized performance testing approach.
QA activities comprised the testing of 4 basic dimensions:
This ‘fit for purpose’ approach helped craft the process and consider multiple factors affecting the server to assess the influence of each one and determine the ways for enhancement.
The performance QA team tested the abovementioned cases against several aspects:
Finally, they verified the joint impact of these aspects on the system with the estimated ratio of devices.
The QA specialists tested databases with varying numbers of records – from an empty database to a fully populated one. They applied:
For each database type, the engineers created and ran approximately 500 test cases.
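Generating databases at several load levels can be sketched like this; SQLite and the table layout are stand-ins for illustration, since the real system’s schema is not described in the text:

```python
import sqlite3

def make_test_db(record_count: int) -> sqlite3.Connection:
    """Build an in-memory test database holding a given number of session
    records, from an empty database (0) up to a fully populated one."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE sessions (id INTEGER PRIMARY KEY, device TEXT, payload TEXT)"
    )
    conn.executemany(
        "INSERT INTO sessions (device, payload) VALUES (?, ?)",
        [(f"device-{i % 4}", f"session-{i}") for i in range(record_count)],
    )
    conn.commit()
    return conn

# One database per load level; the same test cases then run against each.
for size in (0, 1_000, 50_000):
    db = make_test_db(size)
    count = db.execute("SELECT COUNT(*) FROM sessions").fetchone()[0]
    print(size, count)
```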
As a result, a1qa identified bottlenecks in the performance of the client’s software, evaluated its technical capabilities, and pointed out opportunities for enhancement.
Multiple hospitals that use the client’s solution store large amounts of highly sensitive data that must be preserved each time they upgrade to a new release version. A risk arises when the database structure or technologies change, as information from the old version may not be supported by the new one.
To ensure that not a single record is lost along the way, the a1qa engineers adopted data migration testing. They prepared test cases for creating test databases that covered as many kinds of customer records as possible. Then they deployed the old environment, generated a test database for the previous version, copied the virtual machine, performed an upgrade on one of the copies, and compared the upgraded copy with the original.
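The comparison step — upgraded copy against the original — can be sketched as a per-table fingerprint check. SQLite stands in for the real database here, and a real comparison would also have to account for intentional schema changes:

```python
import hashlib
import sqlite3

def table_fingerprint(conn: sqlite3.Connection, table: str) -> tuple:
    """Row count plus an order-independent checksum of a table's rows."""
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    digest = hashlib.sha256()
    for encoded in sorted(repr(row).encode() for row in rows):
        digest.update(encoded)
    return len(rows), digest.hexdigest()

def migration_preserved_data(original, upgraded, tables) -> bool:
    """True when every table carries the same records after the upgrade."""
    return all(
        table_fingerprint(original, t) == table_fingerprint(upgraded, t)
        for t in tables
    )
```

In practice the check ran across virtual machine copies of whole environments, not two connections in one process; the sketch only captures the record-for-record comparison.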
Currently, they support the quality of migration across 4 different system versions.
Both testing types were applied for 22 locales, including Chinese and Korean.
Internationalization testing aimed to pinpoint potential bottlenecks in the application design that could impede the adaptation process. It ensured that the code could handle international input without breaking functionality, which would otherwise cause either data loss or display problems.
The engineers verified the functionality of the product under multiple culture/locale settings, applying every type of international input.
Typical checks included text direction and navigation, typing and pasting text in the target language via the clipboard, and ensuring that time, date, number, and unit formats corresponded to the established OS settings.
Localization testing was performed after the implementation of the planned functionality, using a pseudo-localization approach. This allowed the team to find the major issues before starting the core localization tests.
The engineers created 2 resource files – one European and one Chinese (character-based script). Once the system accepted these files, localization tests were performed for the remaining languages.
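Pseudo-localization replaces source strings with accented, padded variants before any real translation exists; hard-coded strings and too-tight layouts then show up immediately. A minimal sketch follows — the transform rules are assumptions, as the text does not describe the exact tool used:

```python
# Map common vowels to accented forms so untranslated (hard-coded) strings
# stand out, while the text stays readable for testers.
ACCENTED = str.maketrans("AEOUaeou", "ÀÉÖÛàéöû")

def pseudo_localize(text: str, expansion: float = 0.3) -> str:
    """Accent the text, pad it to mimic longer translations, and bracket it
    so truncated strings are easy to spot in the UI."""
    padding = "~" * max(1, int(len(text) * expansion))
    return f"[{text.translate(ACCENTED)}{padding}]"

print(pseudo_localize("Start session"))  # [Stàrt séssiön~~~]
```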
Common localization testing checks included verifying that the data was processed in the target language, the text was translated fully, the data wasn’t deformed after size changes, and UI elements fit within their controls.
Localization testing comprised:
To reduce testing time, the resource files are analyzed for changes before the localization tests start. Based on this analysis, a test strategy is built depending on the number and scope of the changes.
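The change analysis that precedes a localization run can be sketched as a key-level diff of the resource files; the flat key/value format here is an assumption:

```python
def changed_keys(previous: dict, current: dict) -> set:
    """Resource keys that are new or whose source text changed since the
    last tested version — only these need retranslation checks."""
    return {key for key, value in current.items() if previous.get(key) != value}

old = {"btn.start": "Start", "btn.stop": "Stop"}
new = {"btn.start": "Start", "btn.stop": "Abort", "lbl.status": "Status"}
print(sorted(changed_keys(old, new)))  # ['btn.stop', 'lbl.status']
```

A handful of changed keys triggers a targeted pass; a large diff triggers the full localization suite.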
This customized technique combining both testing types helped detect over 90% of high-priority issues.
The engineers carried out installation testing to ensure that the application could be installed and run correctly on multiple devices. They also performed compatibility and licensing tests to check how the OS and the whole environment affected the installation process.
Installation testing was conducted on each build in a default environment. To minimize the risk of defects related to the different supported environments, a fresh installation was performed in all of them for each build.
The engineers ran acceptance tests after all the changes had been adopted and the functionality upgraded on the testing environments chosen for the current phase.
Once the development team fixed the defects, a new build was created. Its testing resumed and ended only after project quality had stabilized and control build testing had been executed.
Integration testing was performed for the third-party components each time a new version of them was delivered, against multiple devices and builds.
As for the components, they send data in a special format (the DLOG file extension) not only to the main system but also to the client’s other systems, in the form of logs. To ensure that the integration was performed correctly, the engineers used another service that wrote logs describing the data transfer process and checked for their presence.
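The indirect verification through transfer logs can be sketched as follows; the log line format and the status keyword are assumptions for illustration:

```python
def transfer_logged(log_lines, dlog_name: str) -> bool:
    """True when the logging service recorded a successful transfer
    of the named .DLOG file to the receiving system."""
    return any(dlog_name in line and "SENT" in line for line in log_lines)

# Illustrative log lines, not real system output.
logs = [
    "2024-05-01 10:00:01 SENT session_01.DLOG -> reporting-system",
    "2024-05-01 10:00:02 ACK  session_01.DLOG",
]
print(transfer_logged(logs, "session_01.DLOG"))  # True
print(transfer_logged(logs, "session_02.DLOG"))  # False
```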
As for the devices, each time a device had to be connected to the end user’s system, its program flow had to be configured and sent to that device. The QA engineers therefore tested communication with the device to verify that the command was sent with the required status, all the logs were recorded, and a message indicating a successful result was delivered.
They also tested the build integration of these two embedded components in the customer’s system. It was vital to ensure that no data was lost along the way, which they did by running brief smoke tests on the data being sent.
The QA experts performed 2 types of acceptance testing within this phase: MAT, aimed at checking that all functions worked as designed under ordinary conditions, and AT, applied to verify that functionality operated properly under both ordinary and urgent conditions.
The engineers performed the assessment based on the expected output results. Besides this, functional tests included integration testing with third-party applications to verify that the integrated products performed properly and were implemented correctly in compliance with the project requirements.