Artificial Intelligence Framework: Moriarty
In consulting and data science, it is valuable to use artificial intelligence frameworks that allow code to be reused across projects for different clients.
At Everis I had the opportunity to work with the Moriarty framework, integrating NLP Python modules.
Moriarty is a tool for generating near-real-time Big Data analytics solutions (streaming analytics). It enables collaboration between the data scientist and the software engineer: the data scientist works with algorithms and data transformations through a visual interface, while the software engineer thinks in terms of services to be invoked. The underlying idea is that a user can build Artificial Intelligence and Data Analytics projects without writing a single line of code, and the tool's main strength is reducing the time to market of applications that embed complex Artificial Intelligence algorithms.

Moriarty draws on different Artificial Intelligence techniques (such as Deep Learning, Natural Language Processing and the Semantic Web) and Big Data modules (Spark as a distributed data engine, plus access to NoSQL databases). The tool is divided into several layers; its core is a BPMN engine, which executes and defines data analytics processes, called workflows. Each workflow is defined with the standard BPMN model and is linked to a set of reusable functions or Artificial Intelligence algorithms written following a service-oriented architecture.
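To illustrate the service-oriented idea, here is a minimal, hypothetical sketch (not Moriarty's actual API: the registry, decorator and function names are all illustrative) of how a data scientist's transformations could be exposed as named, reusable functions that a workflow engine invokes in sequence:

```python
# Hypothetical registry of reusable functions; Moriarty's real mechanism differs.
SERVICES = {}

def service(name):
    """Register a function under a name so a workflow engine can invoke it."""
    def register(fn):
        SERVICES[name] = fn
        return fn
    return register

@service("tokenize")
def tokenize(text):
    # A trivial NLP step: lowercase the text and split it on whitespace.
    return text.lower().split()

@service("count_tokens")
def count_tokens(tokens):
    # Another step: count the tokens produced upstream.
    return len(tokens)

def run_workflow(steps, data):
    """Execute a linear workflow: each step is a registered service name."""
    for step in steps:
        data = SERVICES[step](data)
    return data

result = run_workflow(["tokenize", "count_tokens"], "Moriarty runs NLP workflows")
# result == 4
```

In the real tool the workflow would be drawn as a BPMN diagram rather than written as a list, but the principle is the same: each node resolves to a reusable, independently written function.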
An example of the code applied:
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

# max_features, embedding_dims and maxlen are hyperparameters set from the corpus
model = Sequential()
model.add(Embedding(max_features, embedding_dims, input_length=maxlen))
model.add(LSTM(600))
model.add(Dense(99, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy', 'categorical_accuracy'])
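The Embedding layer above expects fixed-length sequences of integer word indices. A minimal sketch of that preprocessing in plain Python (independent of any Keras helper; the function names and sample texts are illustrative):

```python
def build_vocab(texts, max_features):
    # Map the most frequent words to integer ids (0 is reserved for padding).
    counts = {}
    for text in texts:
        for word in text.lower().split():
            counts[word] = counts.get(word, 0) + 1
    top = sorted(counts, key=counts.get, reverse=True)[:max_features - 1]
    return {word: i + 1 for i, word in enumerate(top)}

def encode(text, vocab, maxlen):
    # Replace words with their ids, drop unknown words, pad/truncate to maxlen.
    ids = [vocab[w] for w in text.lower().split() if w in vocab]
    ids = ids[:maxlen]
    return ids + [0] * (maxlen - len(ids))

texts = ["the model reads text", "text goes into the model"]
vocab = build_vocab(texts, max_features=20)
x = [encode(t, vocab, maxlen=6) for t in texts]
# Each row of x has length 6 and can feed the Embedding layer directly.
```

Each encoded row then becomes one input sequence of length maxlen, and the softmax layer's one-hot targets are prepared the same way for the categorical cross-entropy loss.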

