The Ultimate 2022 Data Center Predictions List


2022 is almost upon us, and in the land of trade publications, this means one thing – predictions for the year ahead.

As part of this annual ritual, tens of thousands of executives and spokespeople are rushing to outline a vision of the future that invariably benefits their business. Strangely, no one ever predicts a global pandemic, or a giant boat blocking the world’s busiest shipping lane.

Here at Data Center Knowledge, we believe in respecting traditions.

So we have collected and curated 43 unique predictions about the trends that are set to impact the data center and IT industries in the next 12 months, and even arranged them in alphabetical order by organization – from Aerospike to Wolf Security.

Why not binge-read through everything that the hardware and software vendors, service providers, and consultancies expect to happen?

So pull up a chair, grab a beverage, and enjoy a seasonally optimistic look at what 2022 has in store.

Happy holidays.

What the industry expects from 2022

Featuring: Aerospike | Axis Communications | Cisco | Cloudentity | CloudSphere | Commvault | Datagen | Dataiku | Digital.ai | EdgeQ | Eigen Technologies | IBM | Juniper Networks | Kinetica | Lightmatter | Netenrich | Neurala | Ocado Technology | Omdia | ReadyWorks | Scality | Splunk | Synthesis AI | Vade | Vxchange | Wolf Security

 

Lenley Hensarling, Chief Strategy Officer, Aerospike:

Supply chain issues, both hardware and personnel, will continue with the cloud

“Supply chain issues throughout the global economy, driven by Covid, have been well documented. And the cloud is no exception. As companies continue to migrate to the cloud in 2022, they’ll be surprised to find hardware and personnel shortages that may force them to alter their plans. Some organizations may have trouble getting the number of cloud instances they’ll need. They’ll also struggle finding staff to manage their cloud operations with so much turnover happening in the workforce. These restrictions will lead to more organizations seeking out fully managed service cloud offerings.”

Companies will look to shrink the ideation to production cycle

“Next year, many organizations will start to look for new ways to drive revenue growth again after spending the past year getting back on track from the pandemic. One avenue they’ll begin to explore is taking advantage of business opportunities that are happening in the moment and are somewhat transient. Like many of today’s digital businesses that operate successfully in this new economy, established companies need to be able to move fast and shrink their ideation to production cycles. Exercising the elastic scalability of the cloud can help companies tailor the consumption of compute, network, and storage resources to the pace of their business. And it’ll provide the agility needed to identify a need or demand and quickly pivot to seize new opportunities.”

 

Johan Paulsson, CTO, Axis Communications:

Connection across hybrid environments

“To the end user, the architecture being used to deliver services has become invisible. Whether processing takes place on a device, local server or in a remote data center, everything is connected. As a security solution vendor, it’s up to us to provide the tools and flexibility to help people decide on the best solution for their unique situation. Given that ‘connected’ has become the default, we do believe that most surveillance solutions will ultimately be hybrid, combining cloud, on-premise server and edge technologies.”

A new default for cybersecurity

“While we foresaw the rapid acceleration towards Zero Trust network architectures a year ago, we now believe it to be a default approach. The Covid-19 pandemic has played a role here too, as flexible working has seen more devices connected remotely over the public internet. Taking a Zero Trust approach involves evaluating the security profile for each device each time it connects, which has significant implications for the video surveillance industry, with various checks and validations moving from ‘nice to have’ to ‘must have.’”
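The per-connection evaluation described above can be sketched as a simple posture check. The device attributes and thresholds below are hypothetical, for illustration only — real deployments verify signed firmware, certificates, and patch state through the platform's attestation mechanisms.

```python
# Illustrative Zero Trust-style posture check, re-run on every connection
# rather than once at enrollment. Attributes and thresholds are invented.

from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    firmware_signed: bool   # firmware signature verified at boot
    cert_valid: bool        # client certificate currently valid
    patch_age_days: int     # days since last security patch

def evaluate_connection(device: Device, max_patch_age: int = 90) -> bool:
    """Evaluate the device's security profile each time it connects."""
    checks = [
        device.firmware_signed,                   # must boot signed firmware
        device.cert_valid,                        # must present a valid cert
        device.patch_age_days <= max_patch_age,   # must be reasonably patched
    ]
    return all(checks)

camera = Device("cam-042", firmware_signed=True, cert_valid=True, patch_age_days=30)
stale = Device("cam-099", firmware_signed=True, cert_valid=False, patch_age_days=200)

print(evaluate_connection(camera))  # True
print(evaluate_connection(stale))   # False
```

The point of the sketch is the call site: the check runs on every connection attempt, so a device that drifts out of compliance loses access on its next attempt rather than retaining trust indefinitely.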

 

Liz Centoni, Chief Strategy Officer and GM of Applications, Cisco:

Data deluge, data gravity, and the need for predictive insights will propel the Edge towards whole new application development and experience paradigms

“Modern enterprises are defined by the business applications they create, connect to and use. The pandemic has shown that enterprise companies can sustain and grow their businesses, embracing direct connections via application-service to application-service or endpoint to endpoint. In effect, applications, whether they are servicing end-user customers or are business-to-business focused or even machine-to-machine connections, will become the boundary of the enterprise.

“The business interactions that happen across all of these different types of applications will create an ever-expanding deluge of data. Every aspect of every interaction with and across enterprise apps will generate additional data to provide predictive insights. With predictive insights, the data will likely gravitate to a central data store for some use cases. However, other use cases will require pre-processing of some data at the Edge, including machine learning and other capabilities.”

Only through the delivery of predictive and seamless Internet access will the metaverse be realized, and access to technology and innovation become ubiquitous

“There is no doubt that the trend for untethered connectivity and communications will continue. The sheer convenience of using devices wirelessly is obvious to everyone, whether nomadic or mobile.

“This always-on internet connectivity will further help alleviate social and economic disparity through more equitable access to the modern economy, especially in non-metropolitan areas, helping create jobs for everyone. But this also means that if wireless connectivity is lost or interrupted, activities must not come to a grinding halt. The future needs ubiquitous, reliable, always-on internet connectivity at low price points. A future that includes seamless internet services requires the heterogeneity of access – meaning AI-augmented and seamless connectivity between every cellular and WiFi generation and the upcoming LEO satellite constellations and beyond.”

 

Nathanael Coffing, co-founder and CSO, Cloudentity:

Automation will be key to mitigate the rising number of API attacks due to the growing attack surface

“In the next year and beyond, the number of API attacks will continue to rise as API usage continues to increase exponentially. This is because each API and developer is another potential point of entry for cyberattacks. The State of 2021 API Security, Privacy and Governance Report revealed that in the last year, at least 44% of enterprises have experienced substantial issues concerning privacy, data leakage and object property exposure with internal or external-facing APIs. As a result of these issues, 97% of enterprises experienced delays in releasing new applications and service enhancements due to identity and authorization issues with APIs and services.

“To mitigate this looming threat, IT and security teams must do a better job of protecting the enterprise by ensuring APIs are discovered and the right security guardrails are in place for every API. Given the rapid propagation of APIs, automation becomes the defining requirement for building the principle of least privilege and zero trust into your APIs. This starts by adding machine identity, workload identity and correlating them with the requestor user identities to allow mutual authentication. Once every entity in a transaction is authenticated, declarative authorization becomes the next logical step in providing developers the tools they need to adhere to security requirements. It’s impossible to implement proper security measures for every single identity with manual coding, especially when machine and API transactions are so rapid and temporal.”
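The declarative authorization step described above can be illustrated with a toy policy table in which both the calling workload and the requesting user must match a rule. The policy schema, service names, and roles are invented for illustration; real systems carry these identities in signed tokens (for example, mTLS plus OAuth claims) rather than plain data structures.

```python
# Minimal sketch of declarative authorization: policy lives in data, not in
# hand-written checks scattered through code. Identities here are hypothetical.

POLICY = [
    # (workload identity, user role, API resource, allowed methods)
    ("checkout-service", "customer", "/orders", {"GET", "POST"}),
    ("reporting-job",    "analyst",  "/orders", {"GET"}),
]

def authorize(workload: str, role: str, resource: str, method: str) -> bool:
    """Both the calling workload AND the requesting user must match a rule
    (mutual authentication of machine and user identity, least privilege)."""
    return any(
        workload == w and role == r and resource == res and method in methods
        for (w, r, res, methods) in POLICY
    )

print(authorize("checkout-service", "customer", "/orders", "POST"))  # True
print(authorize("reporting-job", "analyst", "/orders", "POST"))      # False
```

Because the policy is data, it can be generated and updated automatically as new APIs are discovered — which is the automation argument the quote makes: manual per-identity coding does not scale to short-lived machine transactions.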

 

Keith Neilson, technical evangelist, CloudSphere:

Cloud migrations will increase

“Companies already in the cloud will continue to evolve and rationalize their multi-cloud strategies for any number of reasons that may include pricing, availability, license bundling and other factors. Because of this, we will see more cloud-first enterprises moving resources from one cloud to another. So, while such an enterprise may have a steady percentage of its assets in the cloud over time, those assets will be spread across a more diverse third party landscape of multiple cloud providers. The cyber asset management mandate in this scenario will be to facilitate smooth and secure operations over this range of multiple cloud vendors – so the enterprise can reduce risk exposure from having a single cloud vendor, without introducing new risks from misaligned multi-cloud assets.”

Growth of Industry 4.0 will drive more alignment of IT and OT assets

“Digital transformations overall have increased as a response for adapting to COVID-19 business impacts, and this is true for Industry 4.0 transformations as well. Resilience equals agility. And given the goals for this in production and supply chain settings, IT and OT assets are increasingly facing the need to be better coordinated. Cyber asset management therefore must encompass the entire spectrum of OT and IT assets – from IT teams’ hardware, networking and connectivity, to OT-related machine data and machine software. The more complete picture organizations can get across this spectrum of assets, including their characteristics, behaviors and interdependencies, the better orchestration and fewer surprises companies will have.”

 

Don Foster, Global Vice President of Sales Engineering, Commvault:

Get ready for data integrity fire drills

“In 2022 CEOs, CFOs, CIOs and other C-Suite executives will begin mandating that their IT teams conduct periodic “fire drills” to test the strength, resilience, and speed of their cyber defense and disaster recovery processes and solutions.

“These executives might believe that their organization’s IT team has implemented a robust strategy to fend off and, if necessary, recover from ransomware and other attacks. But with the integrity of their business threatened by these attacks, these executives are following the famous old Russian proverb ‘trust but verify.’ They might ‘trust’ that their IT team’s strategy will work, but they are also ‘verifying’ that it will.”

“Data integrity fire drills can confirm if an organization’s cyberattack defenses are robust, or discover weak-points in these defenses that cyber criminals might exploit. These drills can also verify that, if a worst case scenario occurs and data has been locked, altered, or destroyed by an attack, it can be quickly recovered from a secure backup copy — turning what could have been a data disaster into a data speed bump.”

 

Ofir Chakon, co-founder and CEO, Datagen:

The supply chain crisis will worsen but digital twins will save the day

“Federal Reserve chair Jerome Powell and other experts predict that the global supply chain crisis will only get worse in 2022 before it gets better. In fact, a recent Wall Street Journal poll of leading economists finds almost half of the respondents cite supply chain bottlenecks as the biggest threat to growth in the next 12 to 18 months. Unpredictable weather patterns and labor shortages will intensify the disruptions caused by the global pandemic. As a result, private businesses and government agencies will turn to solutions that could help alleviate the pressures.

“One such solution will be digital twins, a machine learning driven simulation of real-world objects to predict disruptions and provide recommendations on how to avoid them. Organizations whose operations are heavily supply chain dependent should consider investing in digital twins technology to stay competitive.”
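A digital twin of the kind described — a simulation of real-world operations used to predict disruptions — can be sketched at toy scale as a Monte Carlo model of a supply chain. The stages and delay ranges below are invented for illustration; production twins are calibrated against real telemetry and typically layer machine learning on top.

```python
# Toy "digital twin" of a three-stage supply chain: simulate many runs with
# random stage delays and estimate the probability of missing a deadline.

import random

STAGES = {            # stage -> (typical days, worst-case extra days)
    "fabrication": (10, 8),
    "shipping":    (14, 20),
    "assembly":    (5, 4),
}

def simulate_lead_time(rng: random.Random) -> float:
    """One simulated end-to-end lead time: base days plus random delay."""
    return sum(base + rng.uniform(0, worst) for base, worst in STAGES.values())

def risk_of_delay(deadline_days: float, runs: int = 10_000, seed: int = 0) -> float:
    """Fraction of simulated runs that blow past the deadline."""
    rng = random.Random(seed)
    late = sum(simulate_lead_time(rng) > deadline_days for _ in range(runs))
    return late / runs

print(f"P(late beyond 40 days) ~ {risk_of_delay(40):.2f}")
```

The recommendation side of a twin then follows naturally: re-run the simulation with a candidate change (say, a shorter shipping range for an alternate carrier) and compare the two risk estimates.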

 

Sophie Sachet, VP of Corporate Strategy, Dataiku:

Specifying data use cases will make or break DX

“Some companies will continue to delay digital transformation as they erroneously focus on ‘getting organizational data right’ versus leveraging the right data for a specific use case, which will lead to years and millions of dollars lost.

“The reason 70% of digital transformation projects fall short isn’t due to knee-jerk investments in shiny new customer-facing tech like apps and experiences. They continue to fail because organizations don’t lay the necessary data foundations for scalability, which impedes any effort to derive real value from the exponential quantity of data they’re amassing.

“Data is only contextual to its use, so a clear purpose must be established before it can be an effective change agent by literally becoming part of the digital transformation journey itself. We’ll see more companies lag behind in their DX journey until they understand that the culprit is poor data quality and not having the right data in the right place.”

 

Florian Schouten, VP of Product Management, Digital.ai:

Major outages from faulty configuration changes lead to an overhaul in governance and risk management

“The current technology landscape is becoming more complex with the addition and increased use of microservices, cloud consumption and edge computing. On top of that, companies have increased change and release velocity to unprecedented levels to keep up with the innovation demands of their digital transformation initiatives. This means development pushes bug fixes, new features and entire releases more rapidly into production — often through a combination of manual and automated changes. In today’s world, the traditional means of change governance via a change advisory board (CAB) has dimmed in importance in the interest of providing organizations more agility, and this has come at a cost: an elevated risk of change failure and the business disruption that ensues from it, like the recent Facebook outage. Because of this growing complexity within IT departments, 2022 will see an increased emphasis on governance and risk management.”

IT teams gain a bigger seat at the table with value streams

“Despite spending billions on digital transformation, global enterprises are still struggling to deliver digital value. Why? Because pieced-together point solutions can’t handle the demands of enterprise organizations. Value Stream Management (VSM) promotes the use of systems-thinking to optimize the whole — not just the parts. In 2020, the pandemic made every organization realize the need to become a digital business, and since then, the accelerated rate of change has pressured organizations to become extremely agile. Accomplishing this goal requires aligning business and development value streams to operate together with a shared mission, vision, and cadence — and the secret sauce is VSM.”

 

Vinay Ravuri, CEO, EdgeQ:

Cloudification of 5G

“As billions of people and a trillion devices get connected, compute must migrate closer to the data source (versus the current operating model of data moving to the compute). The whole notion of edge computing will evolve around intelligent connectivity (5G) + intelligent compute (AI).

“Those who would profit from this mega-trend will be players who can service this new industry model. Traditional telco operators will need to respond on how they can monetize beyond providing infrastructure. Be watchful of new disruptive players – hyperscale cloud providers such as Amazon Web Services, who recently announced private 5G networks. They will be transforming the industry towards a cloud-driven, app-driven 5G network model. Hyperscalers in particular are uniquely positioned to capitalize on this new paradigm by supplying enterprises with a local edge cloud – complete with cloud services, hardware resources, and virtualized 5G network.

“Lastly, expect Netflix-style turnkey 5G services invoked within enterprise settings such as factories, warehouses, airports… As new content and new data become available at the edge from new users, there will need to be a content delivery platform. Hyperscale cloud providers have the unique ability (i.e. a platform) to deliver such 5G content and services.”

Death of Moore’s law. What happens next?

“Moore’s law (the doubling of transistors every two years on the same silicon real estate), which has governed the performance and miniaturization of semiconductor technology since the 1960s, has stalled and will come to its last breath in this decade. For the first time in the semiconductor industry, the intersection of Moore’s Law, hardware acceleration, and ‘softwarization’ will be central to the future of performance scaling and chip design. The industry will shift towards software-defined hardware with an emphasis on highly custom, programmable chip designs rather than monolithic one-size-fits-all chips.

“Specific intensive functions will be compartmentalized to hardware accelerators, but the design trend will be towards smaller processors to tackle more specified tasks using software definition. With respect to 5G, customers will look for elastically programmable baseband chips that are dynamically configurable, fluidly adaptable, and yet performant.”

 

Dr. Lewis Z. Liu, co-founder and CEO, Eigen Technologies:

The Metaverse is a worrying prospect

“I’m part of the first social media generation. I was at Harvard when Facebook launched and was one of the first 500 users. Back then it seemed like innocent fun, and we all put our lives on it. Now though we know that social media is damaging our politics, our personal lives and our mental health. With that in mind I frankly don’t see how the Metaverse can be a good thing. Especially if it is owned and managed by the same people who have so badly damaged our societies with social media. I now have two children under five and I would rather they play and develop in the real world by climbing trees than doing it in a fake virtual space.”

Ethics in AI grows in importance

“A small data approach to AI will gain even more momentum in 2022. People are finally asking the right questions on the data used to power AI. When things such as the Metaverse arrive, this is going to be even more important. Just think of all the data that will be used to build that environment and how it will influence everything that happens there. Based on what we’ve seen so far do we think this will be done right? Given the track record of the social media giants over the last decade I’m not confident. It is more important than ever for us to move beyond the problematic big data approach where there is no control or accountability in what is being fed to AI models.”

 

Mark Cox, Public Cloud Director, IBM UK & Ireland:

Enterprises will strategically migrate workloads as they embrace modernization

“As organizations move further into their hybrid and multicloud journeys, their focus will shift towards determining which workloads go where. Early on in their cloud journeys, organizations often moved simple workloads to the cloud and now they are evaluating migrating more mission-critical, complex workloads as they embrace modernization. In the year ahead, they’ll need to take inventory of their IT environments to select which workloads and applications are best suited for the cloud and which should remain on-premises.”

Preparing for data governance: The rise of Industry Clouds

“As organizations grapple with security and compliance, 64% of C-Suite respondents in one recent study agree industry-related regulatory compliance is a significant obstacle. Adhering to compliance and security requirements is especially important for highly regulated industries such as the financial services sector and government agencies.

“As these industries strive to meet the demands of today’s digital-first customers and constituents, cloud adoption is evolving towards specialized clouds. Industry-specific platforms will increasingly be adopted to help them balance innovation with stringent compliance protocols. By choosing the right platform – one with built-in controls – they will be able to innovate at the pace of change, ensuring they don’t get left behind while their industry puts new regulations into place or modifies existing ones.”

 

Mike Bushong, vice president, enterprise marketing, Juniper Networks:

Cloud practices become normalized within the enterprise segment at large

“Those companies either born in the cloud or at the tail end of their digital transformation efforts will move first. This may create problems: if you are competing against a company that suddenly develops a meaningful digital advantage, the consequences can be dire. The industry is littered with the corpses of companies that lost to Amazon and other online retailers. Digital players have disrupted transportation, logistics, manufacturing…virtually every sector imaginable.

“Looking ahead, what can companies do to avoid these pitfalls? It probably starts with people. A change in operations is a change in both skills and culture. Most companies won’t be able to hire an entire new staff, so it starts with one or two key hires with the requisite cloud-like skills to help lead the transformation. And that needs to be followed by a real focus on training people up, to open up a clear career path for them to build a foundation for the next decade.”

Supply chain diversification will become a priority in 2022

“As companies stare down the supply chain, expect 2022 to launch an architectural renaissance focused on how to reduce the risks that come with a concentrated set of suppliers.

“Whether it’s multi-vendor networking to allow for diverse vendor choice, or the introduction of new components to create more supply optionality, the networking landscape will begin to evolve. And that evolution will necessarily start with the architectures that drive network choices.

“In 2022, the network landscape will likely evolve at the pace of refresh. For companies that have been sitting on legacy equipment, natural upgrade cycles provide an opportunity to take a giant leap forward. New capacity buildouts will drive opportunities to introduce new practices that bring in both new components and new vendors. Will it be fast at an industry level? Probably not. But it’s hard to imagine getting through another generation of equipment without significant turnover of legacy architectures and the gear that supports them. This means we could see wholesale architectural shifts break over the next 3-5 years, and 2022 seems poised to be an inflection point.”

 

Amit Vij, co-founder and president, Kinetica:

The world will get more answers about the mysteries of UFOs and UAPs

“This year’s US Intelligence report on UFOs was a landmark of transparency and insight into UAP/UFO sightings. However, it did not provide any definitive conclusions on the true nature of UAPs. That’s partly a function of the limitations of legacy technologies available. In 2022, thanks to projects like NORAD’s Pathfinder that are planned to go fully operational, we’ll start to gain a clearer picture of UAPs. These new capabilities enable tracking and classifying moving objects at significantly increased levels of sophistication based on AI and 1000X faster processing due to advances in parallel processing through vectorization. While there’s no guarantee of discovering aliens next year, governments and defense agencies will be able to demystify more sightings and share findings faster than before.”

Harnessing time and space data will be a major market opportunity

“Projections from Deloitte suggest that 40% of connected IoT devices will be capable of sharing their location by 2025, up from 10% in 2020 – making geospatial data the fastest growing space in the data landscape and creating the potential for crisis within unprepared organizations. This acceleration of geospatial data will be driven by the declining cost of sensors, more satellites gathering time/space data, and 5G rollouts. This will open up new ways of using geospatial information. But managing fast-moving, high-volume location data in a reasonable timeframe has always been a challenge, and these new devices will make it even worse.

“IoT data has always had a time dimension, i.e. logs from smart devices about their interactions and changes in state, but now the space dimension is taking off, and many organizations don’t have the skills or resources to cope with the onslaught. This will force them to explore new approaches and technologies to get the full value of time and space data. Early adopters will have a huge market opportunity within their respective industries, while slower organizations will risk getting left behind.”
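The time-and-space problem described above — high-volume records that each carry a timestamp and a location — is often made tractable by bucketing events into coarse time windows and spatial cells so they can be aggregated in one pass. The grid size and window below are arbitrary choices for illustration (a crude stand-in for geohashing), not any particular vendor's approach.

```python
# Sketch: one-pass aggregation of (timestamp, lat, lon) IoT records by
# hour window and coarse spatial grid cell.

from collections import Counter

def cell(lat: float, lon: float, degrees: float = 1.0) -> tuple:
    """Snap a coordinate to a coarse grid cell (a crude geohash stand-in)."""
    return (int(lat // degrees), int(lon // degrees))

def bucket(records, window_s: int = 3600):
    """Count events per (hour-window, grid-cell) key."""
    counts = Counter()
    for ts, lat, lon in records:
        counts[(ts // window_s, cell(lat, lon))] += 1
    return counts

records = [
    (1_640_000_000, 51.50, -0.12),   # London, hour A
    (1_640_000_100, 51.51, -0.10),   # London, same hour, same cell
    (1_640_010_000, 40.71, -74.00),  # New York, a later hour
]
print(bucket(records))
```

Keying by (window, cell) turns an unbounded stream into a bounded aggregate, which is the usual first step before the heavier geospatial analytics the quote alludes to.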

 

Nicholas Harris, CEO, Lightmatter:

Companies will demand energy-efficient AI

“As the climate crisis has become impossible to ignore, companies are prioritizing sustainable practices deep into the supply chain and AI compute decisions are no exception. The demand for computing power to train and run increasingly larger neural networks will only continue to grow in 2022. As a result, I predict we’ll see an increase in companies committing to reduce the carbon footprint of their AI and investing in ways to make both AI hardware and software more energy efficient.”

 

John Bambenek, Principal Threat Hunter, Netenrich:

Cyber insurance is broken, but still worth it in 2022

“The ongoing digitization of nearly every industry coupled with more sophisticated and widespread cyberattacks have led to a dramatic increase in financial losses for enterprises. Cyber insurance is meant to be a safety net in these instances, but the sheer amount of cybercrime activity has caused premiums to rise at an unsustainable rate. Companies, particularly small and medium-sized businesses, will be doing cost/benefit analyses for cyber insurance in the coming year and questioning whether there is an alternative to this broken system. Given the current threat landscape, however, companies would be unwise to abandon these insurance policies in the near term. Instead, it is critical for the cyber insurance industry to work with companies to develop policies that will get its economics under control and premiums down to a manageable level.”

Lack of skilled IT and security professionals and worker attrition persist

“Organizations will continue to face resource challenges as they look to fill existing and new IT positions. Ultimately, they will have to hire more people, rely on vendors for services, or invest in automation. The end state will likely be some form of all three. The work needs doing regardless of headcount, so allowing automation to handle the basic problems enables IT experts to focus and resolve the more critical issues. Companies can also focus on their IT workers’ well-being with balanced workloads to retain valued staff.”

 

Max Versace, PhD, co-founder and CEO, Neurala:

Let the clouds be in the sky

“AI will accelerate its migration from servers to edges. Data needs to be interpreted where it lives (often in real time) and should not leave the walls of the company. Today, a plethora of AI-ready processors, cameras, and other hardware makes this possible. Increasingly, companies are realizing that the way to build a truly efficient AI algorithm is to train it on their own unique data, which might vary substantially over time. To do that effectively, the intelligence needs to directly interface with the sensors producing the data. From there, AI should run at a compute edge, and interface with cloud infrastructure only occasionally for backups and/or increased functionality.

“No critical process – e.g., in a manufacturing plant – would and should exclusively rely on cloud AI, exposing the manufacturing floor to connectivity/latency issues that could disrupt production. 2022 will see edge learning technologies on the rise, enabling AI to ‘reprogram’ from scratch in a few seconds, whenever and wherever needed. This paradigm-shifting technology will empower AI to truly serve its purpose at speeds, latency, and costs that make it affordable for every user.”
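The idea of an AI that can "reprogram" on-device in seconds can be illustrated with a deliberately tiny model: a nearest-centroid classifier that folds each new local sample into its centroids incrementally, with no cloud round-trip. This is a generic sketch of edge learning, not Neurala's actual technology; the labels and feature vectors are invented.

```python
# Sketch of edge learning: an incrementally trainable nearest-centroid
# classifier small enough to retrain on-device from a few local samples.

class EdgeClassifier:
    def __init__(self):
        self.centroids = {}   # label -> (running sum vector, sample count)

    def learn(self, label, features):
        """Incremental update: fold one local sample into the label's centroid."""
        s, n = self.centroids.get(label, ([0.0] * len(features), 0))
        self.centroids[label] = ([a + b for a, b in zip(s, features)], n + 1)

    def predict(self, features):
        """Return the label whose mean centroid is closest (squared distance)."""
        def dist(label):
            s, n = self.centroids[label]
            return sum((a / n - b) ** 2 for a, b in zip(s, features))
        return min(self.centroids, key=dist)

clf = EdgeClassifier()
clf.learn("ok",     [0.9, 0.1])   # e.g. features from a good part on the line
clf.learn("defect", [0.1, 0.8])   # features from a flagged part
print(clf.predict([0.8, 0.2]))    # "ok"
```

Because `learn` is a constant-time update rather than a full training run, the model can absorb new examples whenever and wherever they appear — the property the quote is pointing at, scaled down to a toy.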

 

Gabriel Straub, Chief Data Scientist, Ocado Technology:

Quality data will become the cornerstone of MLOps

“ML success depends on making the development, deployment and management of ML solutions at scale as simple as possible. Developer time should be spent on developing and trialing new approaches, not on deploying and monitoring services. And in the case of ML, this doesn’t just apply to code but also to data. Andrew Ng has been speaking about data-centric AI in this context, about how improving the quality of your data can often lead to better outcomes than improving your algorithms (at least for the same amount of effort). So how do you do this in practice? How do you make sure that you manage the quality of data at least as carefully as the quantity of data you collect?

“There are two things that will make a big difference: 1) making sure that data consumers are always at the heart of your data thinking and 2) ensuring that data governance is a function that enables you to unlock the value in your data, safely, rather than one that focuses on locking down data.”
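Managing data quality "at least as carefully as quantity" often starts with declarative per-column rules checked before any training run. The schema, column names, and thresholds below are invented for illustration; dedicated validation libraries exist, but the shape of the check is the same.

```python
# Minimal sketch of a data quality gate for an ML pipeline: declarative
# per-column rules applied to incoming rows before training.

RULES = {
    "age":    {"type": int,   "min": 0,   "max": 120},
    "income": {"type": float, "min": 0.0, "max": None},
}

def validate(rows):
    """Return rows that pass every rule, plus a list of violations."""
    clean, errors = [], []
    for i, row in enumerate(rows):
        ok = True
        for col, rule in RULES.items():
            v = row.get(col)
            if not isinstance(v, rule["type"]):
                errors.append((i, col, "type")); ok = False
            elif rule["min"] is not None and v < rule["min"]:
                errors.append((i, col, "min")); ok = False
            elif rule["max"] is not None and v > rule["max"]:
                errors.append((i, col, "max")); ok = False
        if ok:
            clean.append(row)
    return clean, errors

rows = [{"age": 34, "income": 52000.0},
        {"age": -3, "income": 41000.0},     # fails the "min" rule on age
        {"age": 29, "income": "unknown"}]   # fails the "type" rule on income
clean, errors = validate(rows)
print(len(clean), errors)
```

Treating the rules as data (rather than ad hoc code) is what makes the second point in the quote workable: governance becomes a function that surfaces problems to data consumers instead of simply locking data down.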

 

Manoj Sukumaran, principal analyst for data center computing and networking, Omdia:

No end to chip shortage

“The data center server market continues to be supply constrained because of the shortage of key semiconductor components like power management ICs, micro controllers, and other ASICs. The demand for servers remains very strong across market segments and vendor order backlogs are at historically high levels.”

Omdia lowered its annual server revenue forecast to $86 billion, reflecting the impact of the semiconductor shortage. Vendors are unable to fulfill all orders, and many expect a major spillover into 2022. Omdia does not expect the component shortage to improve until at least the second half of 2022.

 

Moises Levy, principal analyst for data center physical infrastructure and sustainability, Omdia:

Sustainability is at the top of the agenda

“Sustainability efforts are gaining traction, with many companies exploring strategies to reduce their greenhouse gas (GHG) emissions (scope 1, 2, and 3), including pursuing new practices for the effective and efficient use of resources, and the circular economy. Investors are demanding that big companies report sustainability-related efforts and actions and see it as a competitive advantage. Cloud service provider and colocation provider data centers are leading the race; however, many enterprise data center operators see sustainability as a threat to reliability and uptime.

“Sustainability initiatives are growing, and more money is being poured into R&D. Sustainability-linked loans to fund projects are also growing. That being said, more innovation and collaborative efforts are needed to achieve a more sustainable future. New technologies can be a game-changer for sustainability goals. They include higher efficiency, smaller-footprint equipment, liquid cooling, smart grid–ready UPS, improved battery energy storage systems (ESS), enhanced monitoring, and AI-enabled analytics.”

 

Andrew Sweeney, co-founder and co-CEO, ReadyWorks:

The rise of the Digital Platform Conductor

“By 2023, nearly 40% of the global population will be tracked digitally in order to influence behavior, according to Gartner. This ‘Internet of Behavior’ will require the deployment of an exponentially larger number of physical IoT devices to collect and process data at the edge. This trend, coupled with the enterprise’s desire to manage physical systems with the same elasticity as their cloud systems, will give rise to a new IT orchestration category called Digital Platform Conductors: essentially, centralized IT infrastructure orchestration platforms that use AI and data to control workload and data placement across the IT infrastructure stack.”

Automation over outsourcing

“By 2025, over 50% of the revenue of the major consultancies will be derived from service delivery that leverages automation and AI instead of just human resources. 2022 will see large consultancies, VARs, integrators, and outsourcers strategically moving to build, acquire, or adopt automation technologies or risk becoming non-competitive. This will represent a major shift in how IT infrastructure programs are delivered over the next several years.”

 

Paul Speciale, CMO:

On-premises data centers will remain an enterprise priority for the foreseeable future

“Decentralization of IT services, applications and data has been an ongoing trend over the last decade. Applications and data have long been moving from corporate data centers to public clouds. However, the enterprise data center is far from dead. We predict that corporations will maintain their investments in corporate on-premises data center infrastructure even as cloud adoption continues for reasons of control, performance and cost-efficiency. And we expect this to continue for the foreseeable future.

“This will lead to a new level of sophisticated IT management capabilities to optimize multi-data center, multi-cloud application and data management solutions. Data storage and management is a multi-site hybrid-IT problem.

“Most enterprises today have realized that a smart, balanced approach to applications and infrastructure across enterprise (private) data centers and public cloud services leads to optimal service delivery, agility, time-to-market, and cost efficiency.”

 

Will Cappelli, DevOps sales specialist:

Serverless is going to be big – and confusing

“What people don’t realize is that not only is the backend becoming function-based, but there is also a major revolution in programming languages used to create the entire stack, front and back.

“The issue is that all the challenges that microservices and containers present are cranked to 11 in a function-based architecture. We’ve gone from looking at application components with lifetimes of months to microservices measured in microseconds, and we’ll need an atomic clock to measure the lifetime of a function in these stacks.”
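The shift described above, from long-lived application components to ephemeral functions, can be sketched with a minimal, platform-agnostic handler (hypothetical names; no specific FaaS provider's API is assumed):

```python
import time

# A function-as-a-service style handler: all state exists only for
# the duration of a single invocation, unlike a long-running service
# whose process may live for months.
def handler(event):
    started = time.monotonic()
    result = {"echo": event.get("payload"), "status": "ok"}
    # The "lifetime" of this function is a single request.
    result["lifetime_seconds"] = time.monotonic() - started
    return result

response = handler({"payload": "hello"})
```

Because each invocation starts and ends this quickly, observability tooling built around long-lived processes has little to attach to, which is the monitoring challenge the quote alludes to.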

 

Kate Matsudaira, VP of Engineering:

Data residency laws will create new headaches for IT and business leaders

“We got a taste of it when the EU rolled out more stringent privacy protections with GDPR. We’re also seeing countries like Australia, Germany, and China passing laws that say that data generated in a country can’t leave that country.

“That has huge implications for a business that is primarily run elsewhere, or whose observability products run elsewhere, because they can’t pull data out of those countries.

“This is just the start of a lot of future legislation, and businesses that play internationally and the vendors whose data technologies they rely on have to think about how to build systems that work within these new parameters.”
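The "new parameters" these laws impose tend to surface in systems as explicit placement checks. Below is a toy illustration, with entirely hypothetical residency rules, of the kind of gate a data pipeline might need before writing to a storage region:

```python
# Hypothetical residency policy: data generated in a jurisdiction may
# only be stored in the regions listed for it. The rules here are
# invented for illustration, not a statement of any country's law.
RESIDENCY_RULES = {
    "DE": {"DE"},          # German data stays in Germany
    "AU": {"AU"},          # Australian data stays in Australia
    "CN": {"CN"},          # Chinese data stays in China
    "US": {"US", "EU"},    # US data may also live in the EU (assumed)
}

def placement_allowed(origin_country, storage_region):
    """Return True if storing data from origin_country in
    storage_region satisfies the (hypothetical) residency policy."""
    allowed = RESIDENCY_RULES.get(origin_country, set())
    return storage_region in allowed
```

Systems that once treated storage location as a pure cost or latency decision now have to consult a table like this on every write path.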

 

Yashar Behzadi, founder and CEO:

Synthetic data will be a requirement to build the Metaverse

“The metaverse cannot be built without the use of synthetic data. To recreate reality as a digital twin, it’s necessary to deeply understand humans, objects, 3D environments, and their interactions with one another. Creating these AI capabilities requires tremendous amounts of high-quality labeled 3D data––data that is impossible for humans to label. We are incapable of labeling distance in 3D space, inferring material properties or labeling light sources needed to recreate spaces in high-fidelity. Synthetic data built using a combination of generative AI models and visual effects (VFX) technologies will be a key enabler of the AI models required to power new metaverse applications.”
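The core advantage claimed above, that synthetic data carries perfect labels humans could never annotate, can be shown with a trivial sketch (hypothetical field names; real pipelines use rendered scenes, not random points):

```python
import random

# Because we place every object ourselves, ground-truth labels such
# as exact 3D distance come "for free" with each synthetic sample;
# a human annotator looking at pixels could never supply them.
def make_synthetic_sample(rng):
    x, y, z = (rng.uniform(-10.0, 10.0) for _ in range(3))
    sample = {"position": (x, y, z)}
    # Exact metric distance from the (origin-placed) camera.
    sample["distance_from_camera"] = (x * x + y * y + z * z) ** 0.5
    return sample

rng = random.Random(0)  # seeded for reproducible datasets
dataset = [make_synthetic_sample(rng) for _ in range(100)]
```

In production, the same principle extends to labels like material properties and light-source positions, which are known to the renderer by construction.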

 

Adrien Gendre, Chief Product Officer:

AI-generated threats will no longer be hype

“Expect cybercriminals to leverage AI-generated email threats, especially in targeted attacks. We are currently seeing threats being created manually, but with improved technologies available to mass-produce messaging for email threats based on what’s trending in the news or what is being mentioned in a company’s social accounts, the potential to target victims is even greater. This could be a game-changer in the way attacks are built, and would make an AI-driven response a must-have in your cybersecurity toolbox.”

 

Nicolas Joffre, Security Operations Manager:

Tech support scams will fool more workers than ever

“We’ve seen a large number of tech support scams impersonating McAfee, Norton, and Windows Defender in 2021. The attacks used a variety of techniques to bypass filtering technologies, such as brand obfuscation, telephone number obfuscation, the use of images, sending from high-reputation domains, and more. The ask from these attacks was to call a toll-free number, which unfortunately many individuals did. It is very likely to remain an active form of attack next year due to its success in 2021.”

 

Ernest Sampera, co-founder and CMO:

You’ve come a long way, baby, but…

“The colocation and data center industry has been talking about redundancy, multi-cloud, and multi-region for as long as these things have existed. They’ve not been doing it for fun; they’ve been doing it to avoid situations like the most recent AWS outage. The most impressive SLAs won’t stop outages, so the industry will need to keep talking about, and perhaps even shouting about, the importance of redundancy in infrastructure planning.”
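The redundancy argument above boils down to a simple control-flow pattern: never treat one region as infallible. A minimal failover sketch, with hypothetical region names and a simulated outage, might look like this:

```python
# Toy multi-region failover: try the primary region first, and if the
# call fails, retry in the next region instead of failing outright.
REGIONS = ["us-east-1", "us-west-2", "eu-west-1"]  # hypothetical list

def call_with_failover(operation, regions=REGIONS):
    last_error = None
    for region in regions:
        try:
            return operation(region)
        except ConnectionError as err:
            last_error = err  # record the failure, try the next region
    raise RuntimeError("all regions failed") from last_error

def flaky(region):
    """Simulated backend: the primary region is down."""
    if region == "us-east-1":
        raise ConnectionError("primary region outage")
    return f"served from {region}"

result = call_with_failover(flaky)
```

No SLA prevents the simulated outage here; only the willingness to route around it does, which is the point the quote makes.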

Living on a razor’s edge

“The world was starting to settle into the ‘near-normal’ post-pandemic before the latest developments this fall. There were already plenty of uncertainties about what IT was going to look like as restrictions began to ease, but this latest variant adds a new wrinkle to the return, and to the edge: companies that are now in a good place with those plans are turning their attention to how they can make networks faster by taking a deeper look at edge networks and 5G.”

Get more G’s or die trying

“Speaking of 5G, the penetration of 5G into the data center mix isn’t new. What is changing, however, is that we are starting to see more and more customers who need 5G capabilities for their workloads. Sensors in equipment in remote applications (e.g., wind turbines, power substations) can now transmit their data faster than ever before, forcing data centers to become more 5G-friendly. Supporting 5G workloads differs slightly from supporting typical workloads because it requires different equipment, so data centers and colocation facilities will need to get with all the G’s or get left in the dust.”

 

Robert Masse, Security Advisory Board member:

Weaponization of firmware attacks will lower the bar for entry

“Certain industries where these attacks could be more probable should start thinking about the risks posed by the weaponization of hardware-level malware and exploits. They are very difficult to detect even in the best-case scenario. Rogue processes and memory mapping bypasses will be hot topics in 2022, and we can also expect to see threat actors targeting CPUs, the BIOS and microcode as part of a revised kill-chain for ransomware attacks.”

Ransomware gangs could put lives at risk

“Attackers have noticed that hitting certain industries produces a higher likelihood of payment. We could see more attacks on healthcare and E&R organizations. Threat actors may well target high-risk devices, such as critical medical support systems and their supporting infrastructure, where the risk of significant harm will be highest and therefore a payout will come quickly. This has already started to happen in regions such as Canada, with surgeries being delayed due to ransomware attacks.”


