Infrastructure as Code (IaC) is the management of infrastructure (networks, virtual machines, load balancers, and connection topology) through machine-readable definition files rather than through physical hardware configuration. GigaOm Analyst Michael Levan is currently conducting research on IaC testing, an emerging sector that brings the benefits of software testing to the infrastructure management space.
IaC testing is a young field, but Levan was intrigued by the maturity he saw in one specific solution, BridgeCrew Checkov, which he describes as a leader in the space. In a recent video, Levan walks viewers through the solution, which features a UI for admins to view the results of different tests, as well as a command-line interface to run tests and view results. He also explores issues and priorities that IT decision makers should consider when evaluating IaC solutions.
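As an illustration of the command-line workflow described above, a minimal Checkov invocation might look like the following sketch. The directory and file names are hypothetical, and Checkov is assumed to be installed (for example, via `pip install checkov`):

```shell
# Scan an entire directory of IaC definitions (Terraform,
# CloudFormation, Kubernetes manifests, and so on):
checkov -d ./terraform

# Scan a single file and emit JSON, e.g. for use in a CI pipeline:
checkov -f main.tf -o json
```

Each scan reports which built-in policy checks passed or failed, which is the kind of result the Checkov UI then surfaces for admins.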
Levan says two approaches to IaC testing are … Read More »
In June we published updated versions of our Hybrid Cloud Data Protection Key Criteria and Radar reports. This year we decided to split most of our Radars in two to give our clients a better view of the market landscape. Categorizing solutions for either SMBs or enterprises makes it more likely you will find a solution that is well suited to your organization. In the webinar, I present two slides that illustrate the two groups.
The data protection market is evolving quickly. What is really interesting is that some of the vendors we evaluated combine a compelling vision with strong execution, positioning them as innovators and outperformers in both the enterprise and SMB market segments. The reason is simple: the role of data protection has radically … Read More »
I love working for GigaOm because we don’t settle. Ever. The GigaOm Key Criteria and Radar reports have gotten a lot of attention lately because they focus on technology, execution, and roadmaps instead of vendor revenue and market share. It’s an approach that is greatly appreciated by decision makers and our subscribers.
GigaOm subscribers know that we produce many different types of reports, including use case analysis reports, benchmarks, TCO reports, and more. But I thought we needed something unique to give our subscribers an additional edge over their competition. In the end, you work with an analyst firm for just one reason: to get better information faster, so you can improve and speed up your decision-making. That means it’s our job to anticipate and predict what’s coming next, to help you get ahead of the market.
Bridging … Read More »
The term Observability might puzzle anyone who has been involved in IT operations over the years, given that a major part of the challenge has always been maintaining “single pane of glass” visibility across what can be a complex IT estate. Moving to the cloud also requires visibility across the virtualized infrastructure and services in use — this is to be expected. Less evident at the outset is how cloud-based architectures change the nature of what needs to be managed:
cloud-based applications can become very complex, particularly if they are based on microservices;
they can rely on multiple integration points with external and SaaS-based applications and services, accessed via APIs;
they are often hosted in multiple cloud environments, or may still have portions hosted on-premises or in private clouds as well.
Artificial intelligence (AI) and machine learning (ML) are redefining the enterprise IT landscape, as organizations across verticals see the potential for AI and ML to automate repetitive tasks and solve complex problems. But just how far does the potential of AI/ML reach?
GigaOm co-founder and CEO Ben Book recently appeared on an episode of the 7investing podcast with 7investing Founder and CEO Simon Erickson to discuss technology trends. He says most enterprises know that AI and ML will impact their business, but some are still trying to figure out just how the technology will work for them.
“The early adopters were webscale and high growth new industry and digital companies, like Google, Twitter, Uber, Facebook, investing in data scientists and other data-intensive industries such as finance and insurance,” says Book.
He says that while AI and ML have already made their mark in verticals … Read More »
High-performance object storage is a data storage architecture designed for handling large amounts of unstructured data. It has historically been known for its ability to store these massive amounts of information as objects, rather than files. But use cases have broadened in recent years as more organizations produce large amounts of data and want the ability to organize, manage, and search it.
“High-performance object storage is common in two scenarios,” says GigaOm analyst Enrico Signoretti. “On one hand, these types of systems are used to consolidate more workloads on a single system. On the other hand, they are used as interactive storage for highly demanding workloads that also present huge data sets, like in AI, HPC, or big data analytics.”
In his new GigaOm Radar Report for High-Performance Object Storage, Signoretti looks at the fast-moving market of high-performance object storage solutions and … Read More »