The Ops behind ChatGPT 2 – MLOps and DataOps

As seen in the diagram, you put the prompt in and get an answer generated by the trained neural network. It’s that simple. Now, you may be wondering, what happens in between? The answer to that is fascinating, but can be boiled down to a concise statement: we don’t know.

Truly, neural networks are something of a mystery. They are loosely modeled on the neurons in our own brains, and they aren’t explicitly programmed by humans; instead, they adjust themselves during training to achieve the best possible results, much like a student working out the study method that suits them best before a test. So, we don’t really know what is happening at the core of these neural networks; we just know we can train them to become good at having a conversation.

You can train a similar model at home, too. Some companies have released smaller, more compressed versions of LLMs that can run on modest servers, such as Meta’s LLaMA. Beyond that, you can find an ever-growing number of generative AI models on whichever cloud provider you prefer and on open source hubs such as Hugging Face, which you can plug into your own experiments to understand them better.
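As a quick illustration of that plug-and-play workflow, here is a minimal sketch that pulls a small open source model from Hugging Face and generates text locally. It assumes you have the transformers library and a backend such as PyTorch installed; the model name distilgpt2 is only an example of a small model that fits on a laptop, not one specifically used in this chapter.

```python
# A minimal "plug and play" sketch with an open source model from Hugging Face.
# Assumes: transformers + a backend (e.g., PyTorch) installed; distilgpt2 is an
# illustrative small model, swap in any text-generation model you prefer.
from transformers import pipeline

# Download the model on first run and build a local text-generation pipeline
generator = pipeline("text-generation", model="distilgpt2")

# Feed in a prompt and get a generated completion back, just like the diagram
result = generator(
    "Explain what MLOps is in one sentence:",
    max_new_tokens=40,        # keep the generated answer short
    num_return_sequences=1,   # only one completion
)

print(result[0]["generated_text"])
```

Even a toy model like this makes the prompt-in, answer-out loop from the diagram tangible, and the same few lines work for much larger open source LLMs if you have the hardware for them.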

Summary

A DataOps or MLOps engineer is, at heart, a DevOps engineer who has picked up an understanding of data and machine learning concepts. That’s pretty much it. But, as we saw in this chapter, applying those concepts opens up some genuinely useful possibilities.

First, we talked about the differences and similarities between DevOps and these associated fields and how they connect with each other. Building on that, we worked through a couple of practical use cases that come in handy when using Python for DataOps and MLOps.

Next, we talked about handling the proverbial big data: the aspects that make data “big” and how to tackle each of those aspects individually, with a use case for each.

Finally, we talked about ChatGPT and how it delivers what it does to users around the world. We discussed how simple it is to use despite its underlying complexity and mystery, as well as the new age of open source LLMs that has accelerated the development of generative AI.

In the next chapter, we will get into perhaps the most powerful tool in the DevOps arsenal, Infrastructure as Code (IaC), and how Python is used in this realm.
