As AIaaS was being developed, it became clear that it would exceed its main goal, which is nothing less than helping thousands of colleagues automate processes they could not otherwise tackle, because their systems are unable to work with unstructured data.
It would not have made sense to offer an innovative, robust product that solves this common problem while burdening clients with a complex implementation. At MCCM we have worked with providers whose products were held back by the infrastructure they required or by the tedious configuration needed to use them, so we wanted to be certain that AIaaS would rise to the occasion.
Many of you may already know that artificial intelligence algorithms demand high processing capacity, so one question had to be answered: how do we offer clients an innovative, personalised, auto-scaling product without requiring them to deploy complex infrastructure just to run AIaaS? The answer: host it in the cloud.
Offering the service through the cloud as a REST API means it can be used from any RPA platform (UiPath, Blue Prism, Automation Anywhere, etc.) or, failing that, from any programming language. Machine learning algorithms sometimes require serious processing power, so instead of installing a computer on each client's site, an auto-scaling architecture was created that serves all clients simultaneously without any of them needing to keep a machine running.
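As a sketch of what such a call might look like from plain Python (the endpoint URL, field names and auth scheme below are illustrative assumptions, not the real AIaaS API):

```python
import json
import urllib.request

# Hypothetical endpoint -- the real AIaaS URL is not documented here.
AIAAS_URL = "https://api.example.com/aiaas/v1/extract"

def build_request(document_text, api_key):
    """Assemble an HTTP POST request for the (hypothetical) AIaaS endpoint."""
    data = json.dumps({"document": document_text}).encode("utf-8")
    return urllib.request.Request(
        AIAAS_URL,
        data=data,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # assumed auth scheme
        },
        method="POST",
    )

def extract_fields(document_text, api_key, timeout=30):
    """Send unstructured text to the service and return the structured reply."""
    req = build_request(document_text, api_key)
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

An RPA platform would do the same thing through its own HTTP-request activity; nothing beyond an HTTPS call is needed on the client's side.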
After analysing several options, Amazon SageMaker was chosen to host AIaaS, ahead of Google ML. Thanks to Amazon SageMaker, the problem of deploying a machine on the client's site was solved. At MCCM we are faithful supporters of automation, so it would not be right for us to offer anything less than a fully automated product.
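Behind the REST facade, calling a model hosted on a SageMaker real-time endpoint goes through the SageMaker runtime API. A minimal sketch, assuming a hypothetical endpoint name, region and payload shape:

```python
import json

ENDPOINT_NAME = "aiaas-model"  # hypothetical endpoint name

def serialize_payload(document_text):
    """Encode the input in the JSON shape the model container expects
    (the field name is an assumption for illustration)."""
    return json.dumps({"document": document_text}).encode("utf-8")

def invoke(document_text, region="eu-west-1"):
    """Call a SageMaker real-time endpoint through the runtime API."""
    import boto3  # imported lazily; requires the boto3 package
    runtime = boto3.client("sagemaker-runtime", region_name=region)
    resp = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=serialize_payload(document_text),
    )
    return json.loads(resp["Body"].read().decode("utf-8"))
```

SageMaker scales the instances behind the endpoint, which is what removes the need for a dedicated machine at the client's site.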
Once AIaaS's models were hosted in Amazon SageMaker, using them was fairly easy. However, only overall data was recorded, not information about individual MCCM users, which confirmed our concerns, so a system to monitor and administer each user's log was created. Kong, an open-source API gateway and microservice management layer, did the trick: users' calls were routed through Kong, and with that the MVP of the final product took shape. Lastly, users were given access to their logs via a web application.
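To illustrate the idea, a minimal Kong declarative configuration along these lines could route calls through the gateway, identify each user by API key and ship every call's log to an HTTP collector (the service names, URLs and plugin choices are assumptions, not MCCM's actual setup):

```yaml
_format_version: "2.1"
services:
  - name: aiaas
    url: http://aiaas-upstream:8000   # hypothetical upstream service
    routes:
      - name: aiaas-route
        paths:
          - /aiaas
    plugins:
      - name: key-auth     # identifies the calling user by API key
      - name: http-log     # forwards per-call logs to a collector
        config:
          http_endpoint: http://log-collector:9000/logs
```

With each call authenticated per user, the gateway's logs can be filtered by consumer, which is exactly the per-user view SageMaker alone did not provide.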
The web application was the least innovative part, yet not exempt from difficulties. MCCM takes automation as its own religion, so Kong was integrated into the MCCM website through several microservices, together with Stripe, so that payments and invoice postings could be processed and made accessible to users.
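On the payments side, charging a user for their usage through Stripe can be sketched roughly as follows (the API key, customer id and pricing are placeholders, not MCCM's real billing logic):

```python
def eur_to_cents(amount_eur):
    """Stripe expects amounts in the currency's smallest unit (cents)."""
    return round(amount_eur * 100)

def charge_for_usage(customer_id, amount_eur):
    """Create a PaymentIntent billing a customer for their AIaaS usage."""
    import stripe  # imported lazily; requires the stripe package
    stripe.api_key = "sk_test_..."  # placeholder test-mode key
    return stripe.PaymentIntent.create(
        amount=eur_to_cents(amount_eur),
        currency="eur",
        customer=customer_id,
    )
```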
The whole infrastructure is built from microservices running in Docker containers. Packaging each dependency into an isolated container makes the services portable across environments, for example from a developer's computer to the cloud, and that portability is what enables continuous integration and deployment of the auto-scaling solution.
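Packaging one of those microservices can be as simple as a Dockerfile like this (the base image, file names and port are illustrative):

```dockerfile
FROM python:3.9-slim

WORKDIR /app

# Each service carries its own dependency list, isolated from the others
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# The same image runs unchanged on a laptop or in the cloud
EXPOSE 8000
CMD ["python", "app.py"]
```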
What do you think? Does it look complex? Docker, microservices, API management, Stripe… Do not worry: at the end of the day it is as easy as logging into a website. AIaaS is nothing more than a cognitive API, a very cognitive one! 🙂