With thousands of different software applications flying around the cloud of a data center’s computer servers and storage devices, things can get complex. There may be a business in telling the story of their travels to overtaxed systems managers.
Boundary, a small start-up in San Francisco that monitors the performance of applications in the cloud, offers a product that shows just how complex a mess that is, in close to real time. That is a big improvement over the daily or less frequent check-ins performed at most data centers. More important, the monitoring can show where slowdowns might be occurring, or where operational savings might be found.
Those are the business reasons for the product; what is more interesting is the way it shows how few of us can even tell where all our software begins and ends anymore. Cloud-based systems now continually touch us through everything from Google searches and phone apps to our own corporate software.
There are so many layers of software, handling things like authentication, middleware connections to other applications and systems, and the job of fetching data and communicating with systems all over the place, that an application can morph to the point where it is not really under anyone's control.
“Our customers don’t know how their applications are configured,” said Gary Read, the chief executive of Boundary. “You can have an application spread over 70 servers, touching five or six different layers of the software stack. The infrastructure processing the application is always changing, too.” Customers using public clouds like Amazon Web Services, he says, can also find themselves putting together applications from data centers on opposite sides of the country. That costs money and slows down delivery of an application.
Boundary hopes it can make a Big Data play out of mapping all this near chaos, improving performance and eventually predicting where there might be trouble. Mr. Read says Boundary is collecting about 630,000 actions by one or another of its customers’ applications every second, or 55 billion records a day. It has started offering a free version of the service, in exchange for access to more customer data.
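The daily figure follows directly from the per-second rate. A quick back-of-the-envelope check, assuming the rate is sustained around the clock:

```python
# Sanity check of the reported data volume: 630,000 application "actions"
# per second, assumed sustained over a full 24-hour day.
actions_per_second = 630_000
seconds_per_day = 24 * 60 * 60  # 86,400

records_per_day = actions_per_second * seconds_per_day
print(f"{records_per_day:,} records per day")  # 54,432,000,000
```

That works out to about 54.4 billion records a day, consistent with the roughly 55 billion cited.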
“We’ll let anyone process two gigabytes a day for free” while gathering information on what the applications do, he says. “The more data, the better the analytics we can do, and the better we can make the product.”