If your applications don't yet run in containers and Kubernetes, Anthos may not seem like a perfect fit at first glance. But this is not the case: it lets you modernize legacy applications on-premises and move them to cloud-native environments.
If you are not in a hurry and don't know where to start with all of this, begin with Istio and Cloud Run; they are real game changers. Anthos lets you carry out modernization with your existing resources. It runs on GKE, and beyond GCP your engineers can manage workloads running on third-party clouds and on-premises. You can adopt a set of proven tools that increase your development speed and improve the security and reliability of your infrastructure and applications.
You can scale and automate, which, as we all know, is the one and only way to keep up with customers and competitors without vendor lock-in. Whether you operate only in your local market or across many regions with special policies, and whatever the situation out there, Anthos enables you to track, update and manage configuration and policy changes everywhere.
Users can enjoy the cloud that best suits their application deployment and management needs. It is not a cheap solution, but your engineers will love it. Fun fact: Anthos means flower in Greek; it grows on-premises but needs water from the cloud to flourish :). I am part of the Marketing team at Revolgy. I am passionate about storytelling, board games, sci-fi, and architecture and urbanism.
But there's also a rich history of embedded developers that have been working on IoT or real-world scenarios, such as moving a robot or adjusting a weld, for some time. And now, we see those two worlds of IoT intersecting at the edge, and that intersection creates a neat opportunity to do a few things.
First, it is great to be able to get a rich set of real-time or near-real-time data that shows what's happening in, say, a factory line, a retail store or a hospital. Next, we can apply artificial intelligence to figure out how companies should use that data. And third, we can process that data and apply AI in a way that leverages all the cloud innovation we've seen in the past few years.
There is a series of trends coming together. While AI as a concept has been around for a while, we've seen it really take off recently as we've had the compute power to process the data and information needed to apply it. Second, large data sets really help make AI possible, and I think we're going to see trends in the future where AI is going to be more accessible. If you look back in time, developers were a rare breed. Over time, development has become more accessible, which has resulted in more developers, and that's a great positive feedback loop that I think is going to continue in the future.
The introduction of 5G into all of this is another major positive trend. Now, we can get that data closer to that factory, to that retail store, to that hospital — all while knowing that we've got the right latency, bandwidth and reliable connection that maybe we didn't have before.
The opportunity is really dealing with the data where it sits. Imagine a scenario where I collect all this data from the sensors in my factory. You could try to send all of it to the cloud, where your compute and development might happen, but that has huge implications. Maybe I don't have persistent connectivity. The bandwidth costs are enormous. Imagine sending video streams from hundreds and hundreds of cameras at an installation to the cloud.
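To put rough numbers on that bandwidth point, here is a back-of-envelope calculation; the camera count and per-camera bitrate are my own assumptions, not figures from the interview:

```python
# Back-of-envelope (assumed numbers, not from the interview): raw bandwidth
# for streaming hundreds of cameras to the cloud around the clock.
CAMERAS = 500                 # assumed camera count at the installation
MBPS_PER_CAMERA = 4           # assumed bitrate of one 1080p stream

total_mbps = CAMERAS * MBPS_PER_CAMERA
tb_per_day = total_mbps / 8 * 86_400 / 1_000_000  # Mbps -> MB/s -> TB/day

print(f"{total_mbps} Mbps sustained, about {tb_per_day:.1f} TB per day")
```

Even with these conservative assumptions, that is on the order of twenty terabytes of raw video per day leaving the site.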
So, there are lots of reasons why bringing all that data somewhere else doesn't make sense. But if I add compute at the edge, where the data already is, I can analyze all of it and then take action without having to send the data anywhere. Now, we bring the compute to the data. We've been working with Audi on analyzing weld data.
When they build a car, they do about 5,000 welds per car, all run through their line. The old way was for someone to go in, inspect one car per day out of their whole line and ask, "Well, how'd we do? Are the welds good or not?" Now with AI, they've been able to do a real-time inspection of each of those welds; they're inspecting every weld. Now, they're able to know for a fact they've got great quality out of every car, every weld, every time.
Or, when they see that they don't have the right quality, they can also take action on that data in near-real time. They can adjust the mix of chemicals in the welding machine or make other real-time adjustments on the line as they're building that car.
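To make that pattern concrete, here is a minimal sketch of per-weld inspection at the edge. It is purely illustrative: the scoring function, feature names, threshold, and adjustment hook are hypothetical stand-ins, not Audi's or Intel's actual pipeline.

```python
# Hypothetical sketch of edge-side weld inspection; the features, scoring
# function, threshold, and adjustment hook are illustrative stand-ins.
import numpy as np

FAIL_THRESHOLD = 0.6  # assumed defect probability above which a weld is flagged

def score_weld(features: np.ndarray) -> float:
    """Stand-in for a trained model: returns a defect probability."""
    # A real deployment would run an inference engine here instead.
    weights = np.array([0.4, 0.3, 0.3])
    return float(1 / (1 + np.exp(-features @ weights)))

def adjust_line(weld_id: int) -> None:
    """Stand-in for a near-real-time correction on the line."""
    print(f"weld {weld_id}: flagged, requesting line adjustment")

def inspect(stream):
    """Score every weld as it happens, acting locally at the edge."""
    for weld_id, features in stream:
        if score_weld(features) > FAIL_THRESHOLD:
            adjust_line(weld_id)

# Example: simulated per-weld sensor features (current, voltage, duration).
inspect((i, np.random.rand(3)) for i in range(10))
```

The point of the sketch is the loop shape: every weld is scored as it happens, and corrective action fires locally without any round trip to the cloud.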
So, the opportunity for developers at the edge is to have much more real-time interaction with all of that data as it's happening. Let's shift gears to retail. If, for example, I've got a trained model looking at foot traffic patterns and I'm running that model or algorithm in my retail store, then I'm learning as people come in, updating and changing the model all at the edge without having to go anywhere else.
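Again purely as a sketch of the pattern described, and not any real store's system: a tiny model that learns foot-traffic patterns in place, entirely at the edge. The class, smoothing factor, and simulated counts are all assumptions.

```python
# Illustrative sketch (not from the interview): updating a simple foot-traffic
# model entirely at the edge as visitors arrive, with nothing sent upstream.
from collections import defaultdict

class FootTrafficModel:
    """Exponentially weighted per-hour visitor estimate, updated in place."""
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha                  # smoothing factor (assumed)
        self.hourly = defaultdict(float)    # hour of day -> smoothed count

    def update(self, hour: int, visitors: int) -> None:
        old = self.hourly[hour]
        self.hourly[hour] = (1 - self.alpha) * old + self.alpha * visitors

    def expected(self, hour: int) -> float:
        return self.hourly[hour]

model = FootTrafficModel()
for hour, visitors in [(9, 12), (9, 15), (10, 40)]:  # simulated door counts
    model.update(hour, visitors)
print(round(model.expected(9), 2))  # learned locally; the data never left the store
```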
There are so many examples like this where developers have a new frontier, if you will, to work at the edge. Intel has been working with developers for decades. They are just so key to all of the work we do, regardless of the technology area, that we cultivate these relationships and help design products, tools, code samples — all kinds of resources — to make the developers' lives easier.
Recently, we've been focused on a product called OpenVINO, a resource focused on edge inference and on making it as simple as possible. We started with the notion of: How do we make sure that developers can write once and deploy anywhere?
In the old way of doing things, the developer might have had to create custom code for each one of those pieces of silicon. We don't want them to do that. So, we worked hard to create OpenVINO, which lets developers write one time, express their intent and then deploy that code across a variety of silicon from Intel. And the product has just been getting better and better.
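To illustrate the write-once, deploy-anywhere idea, here is a minimal sketch using OpenVINO's Python runtime. The model file and input shape are placeholders for your own converted model, and exact API details vary by OpenVINO release.

```python
# Minimal OpenVINO sketch: the same code targets different Intel silicon by
# changing only the device string. "model.xml" and the input shape below are
# placeholders, not a real model.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")  # OpenVINO IR file (placeholder)

# "CPU", "GPU", or "AUTO" (let the runtime pick); the code itself is unchanged.
compiled = core.compile_model(model, device_name="AUTO")

frame = np.random.rand(1, 3, 224, 224).astype(np.float32)  # dummy input
result = compiled([frame])[compiled.output(0)]
print(result.shape)
```

Changing device_name is the only edit needed to retarget the same inference code, which is the "express their intent and then deploy across a variety of silicon" idea described above.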