At Men&Mice we challenge ourselves to give our customers the best possible software experience through rigorous automated and manual testing.
Feb 24th, 2022
Software may be eating the world, but there’s a lot that goes into putting software out for public use in the first place. At Men&Mice we put our Micetro software solution through rigorous testing before we release it to our customers.
Micetro is known for integrating seamlessly into your current environment. That’s part of what makes it a sustainable network management solution. However, it also means Micetro needs to establish and maintain compatibility with several different systems, including the following services, both on-premises and in the cloud:
For more information on system requirements and supported systems check out our documentation: https://menandmice.com/docs/10.2/guides/implementation/system_requirements
The nature of the tech industry is constant innovation. As companies take on multicloud and SASE (Secure Access Service Edge) architectures, we need to make sure we add those services to our growing test matrix. At the same time, we need to maintain older code within our solution, up to a point, so that customers aren’t forced to upgrade before they’re ready and can be confident in backwards compatibility. Lastly, we need to be able to understand, and even replicate, our customers’ environments, at least the common topologies, so that we can best understand how they are used day to day.
While we are constantly improving and adding to our testing process, the following outlines the general flow for developing a new version:
5. Unit tests are run on all supported platforms and databases
6. Other integration tests are run if needed
7. Code changes are reviewed by at least two other developers
8. Pull requests are approved and new code is pulled into the relevant branch
Types of Testing Used in Micetro Development
Let’s dive into the kinds of testing Men&Mice does and how these tests ensure our users have the best experience possible when using the Micetro solution.
Unit tests are small, often automated tests that ensure a small piece of the code meets its originally intended behavior. They are run early and often so that developers can move quickly, confident that the code is working as it should. Different unit testing frameworks are used, such as:
Unit tests are written alongside the code. A side benefit of writing the two in parallel is that our developers end up with better code. We run tests on all pull requests, and if a test fails, the code must be fixed before it can be merged.
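As a sketch of what such a test looks like, here is a small unit test using Python's built-in unittest module. The function under test is purely illustrative (a subnet size helper in the DDI spirit), not Micetro code:

```python
import ipaddress
import unittest

# Illustrative function under test -- not Micetro code: count the
# usable host addresses in an IPv4 subnet (network and broadcast
# addresses excluded).
def usable_hosts(subnet: str) -> int:
    net = ipaddress.ip_network(subnet, strict=True)
    return max(net.num_addresses - 2, 0)

class TestUsableHosts(unittest.TestCase):
    def test_slash_24_has_254_hosts(self):
        self.assertEqual(usable_hosts("192.0.2.0/24"), 254)

    def test_host_bits_set_is_rejected(self):
        # strict=True makes ip_network raise on "192.0.2.1/24"
        with self.assertRaises(ValueError):
            usable_hosts("192.0.2.1/24")

# Run with: python -m unittest <module>
```

Because the test lives next to the function, a pull request that breaks the behavior fails immediately, which is what lets developers move quickly with confidence.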
Integration testing happens after unit testing; at Men&Mice these tests are often written in Python. Micetro offers REST, SOAP, and JSON-RPC APIs, and integration testing is done through those APIs. We also run integration tests on pull requests once unit testing is complete.
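As a rough illustration of API-level integration testing, a test can stand up a service and assert on the JSON it returns. Everything below is a stand-in: the endpoint name and response shape are hypothetical, not Micetro's actual REST API:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A stand-in service for the sketch; real integration tests would
# talk to the product's REST API. "/dnsZones" and the response body
# here are purely illustrative.
class FakeAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"result": {"totalResults": 2}}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def get_json(url: str) -> dict:
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

# Bind to an ephemeral port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), FakeAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# The integration-style assertion: call the API, check the payload.
payload = get_json(f"http://127.0.0.1:{port}/dnsZones")
assert payload["result"]["totalResults"] == 2
server.shutdown()
```

The point of testing through the API rather than the internals is that the test exercises the same surface customers and scripts use.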
We’ll run these tests nightly using different configurations of supported operating systems, databases, DNS and DHCP services, network devices, as well as using on-premises and cloud services.
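Conceptually, such a nightly configuration matrix is a cross product of the supported dimensions. A tiny sketch with made-up example values (the real supported list is in the documentation linked above):

```python
from itertools import product

# Hypothetical matrix values for illustration only; the actual
# supported operating systems, databases, and services are listed
# in the Men&Mice system requirements documentation.
oses = ["Windows Server 2019", "Ubuntu 20.04"]
databases = ["SQLite", "PostgreSQL"]
dns_services = ["BIND", "Microsoft DNS"]

matrix = [
    {"os": o, "db": d, "dns": s}
    for o, d, s in product(oses, databases, dns_services)
]
print(len(matrix))  # 2 * 2 * 2 = 8 nightly configurations
```

The matrix grows multiplicatively with each new dimension (cloud providers, network devices), which is why adding a newly supported service is a testing commitment, not just a code change.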
Once integration tests are complete, we move on to end-to-end system tests, which validate that the entire software solution works as expected. At Men&Mice we use the Cypress testing framework to run system tests. Cypress is a front-end testing framework we use on our web application, which lets us do advanced custom E2E (end-to-end) testing.
Stress testing generally involves putting the software under a heavier load than usual, which can be used to validate scalability, uptime, and performance expectations. When performing stress tests we’ll sometimes stress the system manually to emulate the kinds of load our customers might put on Micetro. Our developers have also created tools to perform these kinds of tests.
Example stress test from Men&Mice:
API Recording and Replaying – We place a hook in the Central server which records all incoming API calls to a log. The calls can then be replayed. We use this to do things like emulate multiple users logging into the Web Application at once.
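The record-and-replay idea can be sketched in a few lines. The dispatcher and method names below are illustrative, not Micetro internals:

```python
import json
from typing import Callable, List

# The recorded log of incoming API calls.
call_log: List[dict] = []

# A hook that wraps an API dispatcher and records every call
# before passing it through.
def with_recording(dispatch: Callable[[str, dict], object]):
    def recorded(method: str, params: dict):
        call_log.append({"method": method, "params": params})
        return dispatch(method, params)
    return recorded

# Replaying feeds the recorded calls back through a dispatcher,
# e.g. to emulate many users logging in at once.
def replay(dispatch: Callable[[str, dict], object]) -> list:
    return [e["method"] and dispatch(e["method"], e["params"]) for e in call_log]

# Toy dispatcher standing in for the Central server's API handler;
# the method names are made up for the example.
def dispatch(method: str, params: dict) -> str:
    return f"{method}({json.dumps(params, sort_keys=True)})"

api = with_recording(dispatch)
api("Login", {"user": "alice"})
api("GetDNSZones", {"filter": "example.com"})
# Replaying yields the same two calls, in order.
print(replay(dispatch))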
Stress and load tests like these are great for finding concurrency issues: when multiple users try to perform similar tasks at the same time, the solution needs to handle it appropriately, whether through error handling or by presenting a conflict-resolution strategy. Concurrency issues can be very hard to find with regular automated testing, so stress/load testing helps uncover them.
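As a minimal illustration of the kind of concurrency a load test exercises (the session table below is a stand-in, not Micetro code), many simulated users log in and out at once, and the shared state must stay consistent:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

# Illustrative shared state: a session table many "users" hit at
# once. The lock models the conflict handling a server needs; drop
# it and the counters can be corrupted by lost updates.
class SessionTable:
    def __init__(self):
        self._lock = threading.Lock()
        self.active = 0
        self.peak = 0

    def login(self):
        with self._lock:
            self.active += 1
            self.peak = max(self.peak, self.active)

    def logout(self):
        with self._lock:
            self.active -= 1

table = SessionTable()

def user_session():
    table.login()
    table.logout()

# 500 simulated sessions across 50 concurrent workers.
with ThreadPoolExecutor(max_workers=50) as pool:
    for _ in range(500):
        pool.submit(user_session)

assert table.active == 0  # every login matched by a logout
```

A single-threaded unit test would never hit the interleavings this exercises, which is exactly why load tests catch bugs that regular automated testing misses.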
This type of testing groups multiple tests under a particular attribute. For example, there may be a test suite that tests for performance. Other examples include:
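One common way to organize such suites is to tag tests with attributes and select by tag. A hypothetical sketch of the idea (not the framework Men&Mice uses):

```python
# Tag tests with one or more attribute labels.
def tag(*labels):
    def deco(fn):
        fn.tags = set(labels)
        return fn
    return deco

# Hypothetical test functions for illustration.
@tag("performance")
def test_bulk_import_speed():
    pass

@tag("smoke")
def test_login():
    pass

# Build a suite by selecting every test carrying a given tag.
def suite(label, tests):
    return [t for t in tests if label in getattr(t, "tags", set())]

all_tests = [test_bulk_import_speed, test_login]
perf_suite = suite("performance", all_tests)
```

Selecting by attribute lets the same test pool serve several purposes: a quick smoke suite on every commit, a heavier performance suite nightly.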
Developing a software solution that doesn’t have any defects is challenging. We do our best to catch all issues before they go out, but when customers run into problems, we work to fix them immediately. That’s why we use an Agile methodology to give our customers the best experience with our software. Have you run into an issue, or do you have a suggestion for improvement? Perhaps you wish an error message had just a little more information? We’re here for it! Let us know: https://menandmice.com/get-in-touch