CIO: The voice of IT leadership. Copyright (c) 2023 IDG Communications, Inc.

AWS invests $100 million in new Generative AI Innovation Center Thu, 22 Jun 2023 16:30:00 +0000

Amazon Web Services (AWS) on Thursday said that it was investing $100 million to start a new program, dubbed the Generative AI Innovation Center, in an effort to help enterprises accelerate the development of generative AI-based applications.

The new program will connect AWS AI and machine learning (ML) experts with enterprises to help them envision, design, and launch new generative AI products, services, and processes, the company said, adding that these applications can be targeted at industries such as manufacturing, healthcare, and financial services, among others.

The program’s team members — consisting of data scientists, engineers, and application architects — will also support enterprises via free workshops and training, AWS said. Enterprises will also get added support from the AWS Partner Network.

The Generative AI Innovation Center will also provide access to AWS products and services such as Amazon CodeWhisperer and Amazon Bedrock, as well as infrastructure including Amazon EC2 Inf1 instances and Amazon EC2 P5 instances powered by NVIDIA H100 Tensor Core GPUs.

Additionally, customers can build, train, and deploy their own models with Amazon SageMaker, or use Amazon SageMaker JumpStart to deploy some of today’s most popular foundation models (FMs), including Cohere’s large language models and Hugging Face’s BLOOM, among others, the company said.

Sales-enablement software provider Highspot and customer engagement software provider Twilio have already signed up for the program, it added.  

Launching a program around building generative AI applications is a strategic move for AWS: it helps the company reach more enterprise customers while also showcasing proof-of-concept work.

The Generative AI Innovation Center follows the blueprint of AWS’ Data Lab program, which was designed to bring enterprise customers and AWS data specialists together to solve complex data challenges in tangible ways, using AWS products and services.

In January, the company made the Data Lab program available in India.

Artificial Intelligence, Enterprise Applications, Generative AI
https://www.cio.com/article/642829/aws-invests-100-million-in-new-generative-ai-innovation-center.html
Converged endpoint management: reduce cost, complexity, and risk Thu, 22 Jun 2023 15:53:03 +0000

Prevention is always better than cure. In cybersecurity, it’s also usually cheaper and less likely to expose the organization to reputational, financial, and compliance risk. That’s why prevention-first security is a best practice for delivering cyber-hygiene across enterprise endpoints. The challenge is that endpoint security and management teams often work in silos, using separate point products. The result is extra cost, complexity, and security risk.

However, by consolidating onto a converged endpoint management (XEM) platform, these teams can finally serve the greater good — reducing risk and licensing costs, increasing productivity, and enhancing the end-user experience.

Why silos are bad for business

The endpoint is where modern business happens: whether it’s the laptop of a sales executive; the CEO’s smartphone; or the cloud servers, containers, and virtual machines (VMs) that power digital infrastructure. The challenge for modern IT ops and security teams is the sheer volume of endpoints they must manage and secure. And they’re growing in number all the time as organizations migrate to microservices architectures and embrace remote working. This makes visibility particularly difficult, especially for unmanaged home-worker devices and dynamic, ephemeral cloud containers. It represents a growing headache for IT teams already stretched to the limit by determined threat actors and rigorous compliance requirements.

This alone would be bad enough, but these challenges are exacerbated by the way many IT and security teams work today. Traditionally, endpoint management and endpoint security have been treated as separate functions by organizations. While IT management might include deploying and updating operating systems and applications and managing the software and hardware lifecycle, security teams focus on deploying security technologies, enforcing policy, threat monitoring, and incident response. According to IDC research, 75% of organizations treat endpoint management and security separately in this way. 

The result? Over 50% of these organizations use multiple discrete solutions to support their individual siloed functions, rather than looking for platforms that can solve IT challenges for both. In fact, there are numerous areas of overlap between IT management and security. Policy enforcement is required for both OS configuration and security controls. Installing endpoint security tools could be classed as an IT operations task. And vulnerability management and compliance could be the work of both functions. 

Creating enterprise risk

The problem with this approach is that it can add unnecessary risk and complexity if siloed tools create visibility and control gaps and isolated teams make uncoordinated decisions. It can also sap productivity and add operational cost, especially if teams are duplicating activities. More complex environments are usually more expensive to run, particularly when they mean supporting multiple software licenses and tools where fewer would do. Multiple tools can also degrade the user experience if they force IT ops and security team members to swivel between different portals.

This disjointed approach is already having a notable impact on organizations. According to a 2022 Tanium study, most organizations don’t believe their existing security posture can stop the majority of endpoint attacks. It also takes a quarter (25%) of organizations between a week and a month to roll out a critical patch: too long when seconds and minutes count. For some (12%) it takes over a month. 

That explains why nearly two-thirds (64%) of enterprises believe it’s moderately to extremely likely they’ll be a cyber-attack victim in the coming 12 months. Serious breaches can lead to lost productivity, system downtime, reputational damage, information theft, and direct revenue impact, respondents acknowledge.

How XEM can help

This is why many organizations are looking at XEM to bring their IT ops and security teams together and ensure they’re working from a single source of truth. IDC highlights four key benefits of consolidating onto a single platform:

Reducing risk and complexity: XEM can streamline and simplify compliance tasks, which often require a major input of time from both security and IT ops teams. It also features tools for rapid threat detection, patching, and remediation, thus helping to mitigate cyber risk, enhance cyber-hygiene and reduce the footprint of software running on devices.

Lowering costs: XEM helps to reduce the number of tools that must be maintained by IT ops and security teams, and associated costs related to licenses and ongoing management. By consolidating on the right single-vendor platform offering a wide range of IT security and operations capabilities, organizations can also replace entire products. A high degree of automation supports cost reduction in areas like asset management.

Enhancing collaboration: While organizations recognize the need to unify IT ops and security teams, they can’t force the issue. But with a single XEM platform, teams have a place where organic collaboration can be “discovered and nurtured,” according to IDC. Working from a single source of truth is the first step in this journey.

Improving the employee experience: If left unchecked, bad employee experiences can sap productivity and even drive talent away from an organization. But XEM can help to spot performance issues early on, preventing a flood of helpdesk tickets, and streamline the patching experience with automation and self-service, to reduce downtime.

These benefits are already helping to transform the way enterprises handle endpoint management and security. According to IDC, 50% of organizations are “very familiar and operational” in terms of deploying XEM, and nearly 60% would consider buying such a product. That bodes well for the future. As Forrester senior analyst Andrew Hewitt explains, patch management, endpoint visibility, and compliance may not be particularly sexy topics in computing today. But they remain critical challenges. With XEM, organizations are one step closer to solving them.

Learn how Tanium XEM can help you reduce cost, complexity and risk

Digital Transformation
https://www.cio.com/article/642972/converged-endpoint-management-reduce-cost-complexity-and-risk.html
From details to big picture: how to improve security effectiveness Thu, 22 Jun 2023 14:52:04 +0000

Benjamin Franklin once wrote: “For the want of a nail, the shoe was lost; for the want of a shoe the horse was lost; and for the want of a horse the rider was lost, being overtaken and slain by the enemy, all for the want of care about a horseshoe nail.” It’s a saying with a history that goes back centuries, and it points out how small details can lead to big consequences.

In IT security, we face a similar problem. There are so many interlocking parts in today’s IT infrastructure that it’s hard to keep track of all the assets, applications and systems that are in place. At the same time, the tide of new software vulnerabilities released each month can threaten to overwhelm even the best organised security team.

Not all vulnerabilities are created equal

There is, however, an approach that can solve this problem: rather than reacting to every single issue or new vulnerability that comes in, we can look for the ones that really matter.

In our TruRisk Research Report 2023 we analysed more than six billion scans and trillions of anonymised data points from across our customer base to build up a picture of what threats companies faced and why.

When you look at the total number of new vulnerabilities disclosed in 2022 – 25,228, according to the CVE list – you might feel nervous, but only 93 of those vulnerabilities were actually exploited by malware. Conversely, what might be a low-priority risk for one organisation may be a critical issue for another, based on the software they use and how they deploy it. By prioritising the issues most likely to affect our organisation, we can get ahead of potential risks. We can focus on the problems that represent real threats, rather than feeling overwhelmed.
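The triage logic described above can be sketched in a few lines. This is an illustrative example only: the field names (cve_id, cvss, exploited_in_wild, asset_criticality) are invented for the sketch, and real vulnerability-management feeds expose far richer schemas.

```python
# Illustrative sketch: rank vulnerabilities by real-world risk rather than
# raw severity alone. Known-exploited issues on business-critical assets
# float to the top; everything else is ordered by severity within its tier.
# All field names below are hypothetical.

def triage(vulns, max_items=5):
    """Return the highest-risk vulnerabilities, most urgent first."""
    def risk_key(v):
        return (
            v["exploited_in_wild"],   # exploited-in-the-wild issues first
            v["asset_criticality"],   # then business impact of the asset
            v["cvss"],                # then raw severity score
        )
    return sorted(vulns, key=risk_key, reverse=True)[:max_items]

backlog = [
    {"cve_id": "CVE-A", "cvss": 9.8, "exploited_in_wild": False, "asset_criticality": 1},
    {"cve_id": "CVE-B", "cvss": 7.5, "exploited_in_wild": True,  "asset_criticality": 3},
    {"cve_id": "CVE-C", "cvss": 8.1, "exploited_in_wild": True,  "asset_criticality": 2},
]

for v in triage(backlog):
    print(v["cve_id"])  # CVE-B, then CVE-C, then CVE-A
```

Note how CVE-A, despite the highest CVSS score, ranks last: it has no known exploitation, which mirrors the report’s point that only a small fraction of disclosed CVEs are ever weaponised.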

Automation makes the difference

Responding to all the thousands of issues that exist is hard, if not impossible, with manual effort alone. We have to automate around patching, so that issues get closed faster. According to our data, the difference is huge – automated patching is 36% faster compared to manual updates, and patches are deployed 45% more often.

With the time saved, IT security teams can focus on results rather than alerts or detections. Your team’s talent and skills can be put to better use concentrating on risk and preventing attacks before they take place, rather than feeling under constant pressure to catch up.

Your team needs assistance to prioritise the most severe vulnerabilities present in your mission-critical assets and resolve them before attackers can exploit them. Taking a risk-based approach allows you to quantify and prioritise your team’s efforts, and to communicate effectively with your executives and board. In effect, you know which nails to concentrate on, so that your organisation can run smoothly and securely.

Click here to download the 2023 Qualys TruRisk Threat Research Report to better understand your organisation’s cybersecurity needs.

Security Software
https://www.cio.com/article/642942/from-details-to-big-picture-how-to-improve-security-effectiveness.html
The 10 highest-paying industries for IT talent Thu, 22 Jun 2023 10:01:00 +0000

Technology has quickly become a top priority for businesses across every industry. So much so that IT roles are no longer just the purview of the IT department. Every business unit has a stake in the IT services, apps, networks, hardware, and software needed to meet business goals and objectives, and many of them are hiring their own technologists.

While Silicon Valley still pays top dollar for IT pros, the war for talent has moved beyond the technology industry, with other verticals vying for talented IT workers who have the skills to enable digital transformation, process improvement, change management, and the development of apps and services. And as the demand for tech talent grows in industries beyond tech, salaries are on the rise in fields such as consulting, finance, hospitality, and more.

Here are the 10 industries with the highest tech salaries, and how much they’ve increased in value since 2021, according to the 2023 Dice Tech Salary Report.

1. Consulting

In the consulting industry, technology has become an important tool for making decisions, designing solutions, improving processes, and providing insights on optimizing business strategy. You’ll find plenty of openings for IT consultants, a role that helps organizations identify technology solutions and strategies for improving their hardware, software, networks, and other IT infrastructure. Consulting firms are increasingly turning to tech talent to help build in-house platforms, according to the report from Dice. There’s demand for skills such as cybersecurity, cloud, IT project management, UX/UI design, change management, and business analysis.

Average salary: US$131,995

Increase since 2021: 0%

2. Healthcare

Technology is paramount for the healthcare industry, including the medical, pharmaceutical, and biotechnology sectors. Technology has evolved at a rapid pace in healthcare settings, spiking demand for talented IT and tech professionals. It’s an industry that handles critical, private, and sensitive data, so there’s a consistent demand for cybersecurity and data professionals. But you’ll also find a high demand for software engineers, data analysts, business analysts, data scientists, systems administrators, and help desk technicians. It’s an industry that can also offer some stability to tech workers, since there will always be a need for people to help keep a vital industry running.

Average salary: US$129,118

Increase since 2021: +3.4%

3. Finance

The demand for tech workers in the finance industry has only continued to grow as financial services have moved online. Even internally, finance companies such as Discover have focused on building IT and tech training platforms to upskill workers to help meet the rapidly growing need for talent. There’s a high demand for software engineers, data engineers, business analysts, and data scientists, as finance companies move to build in-house tools and services for customers. There’s also a push for digital transformation in the industry, with companies looking to integrate new and emerging technology while modernizing legacy finance tech.

Average salary: US$128,571

Increase since 2021: 0%

4. Software

The software industry is a natural fit for IT jobs, seeing as it’s an industry that fully relies on technology. There is always a demand for knowledgeable IT pros who can help organizations design, develop, implement, and maintain software products and services. There’s a broad range of roles that fall under the software industry, the most obvious ones being software developer and engineer. But you’ll also find demand for quality assurance, DevOps, technical support, and software sales engineers. There’s also a need for project managers, product managers, cybersecurity professionals, data scientists, database administrators, and software architects.

Average salary: US$124,071

Increase since 2021: 0%

5. Aerospace and defense

It likely comes as no surprise that there’s a high demand for engineers in the aerospace and defense industry including avionics, systems, AI, software, network, quality assurance, robotics, radio frequency (RF), simulation, flight test, and manufacturing engineers. But you’ll also find demand for other technology roles such as cybersecurity analyst, project manager, aerospace technologist, geospatial analyst, communications specialists, software tester, technical writer, and data analyst. It’s an industry that consistently needs skilled IT and tech workers with the expertise to develop, design, maintain, and implement complex aerospace systems, while ensuring they remain safe and secure.

Average salary: US$121,560

Increase since 2021: +2.9%

6. Consumer products 

The consumer products industry has seen massive growth in IT salaries, rising just over 14% since 2021, buoyed by a high demand for talent. Knowledge areas especially vital to the consumer products industry include e-commerce, digital marketing, supply chain management, and mobile app development. Other relevant roles include security professionals, project managers, UX/UI designers, product managers, data analysts, and business analysts. There’s a demand for skills around product optimization, customer service management, tracking digital and marketing trends, demand forecasting, data-driven decision making, improving efficiency, lowering costs, and navigating supply-chain management.

Average salary: US$121,052

Increase since 2021: +14.4%

7. Entertainment

At first thought, you might not think of the entertainment industry as having a high demand for IT roles, but there’s plenty of need for skilled tech professionals. Technology is a cornerstone to developing movies, video games, live events, music, and television shows. There’s typically a high demand for multimedia developers, video game developers, virtual reality developers, and production technologists. There’s also a growing need for streaming platform engineers, now that streaming services dominate for TV and movies. Other vital roles include project manager, security specialist, web developer, data analyst, and systems administrator.

Average salary: US$119,921

Increase since 2021: +0.1%

8. Utilities/Energy

You’ll find demand for some unique job titles in the utilities and energy industries, including SCADA engineer, renewable energy engineer, smart metering specialist, grid modernization specialist, energy data analyst, energy-efficiency consultant, GIS specialist, and energy storage engineer. There’s also demand for more typical IT roles such as project manager, data scientist, cybersecurity professional, RPA developer, IoT engineer, asset management specialist, data center manager, and more. The industry needs highly skilled IT pros with the knowledge to navigate complex and technical systems and networks.

Average salary: US$118,498

Increase since 2021: +3.6%

9. Telecommunications

IT jobs are a natural fit for the telecommunications industry, given that it is fully dependent on technology. You’ll find demand for nearly every IT job you can think of, including project manager, software developer, systems analyst, network engineer, security specialist, data analyst, radio frequency (RF) engineer, cloud engineer, and consultants. There’s a broad range of job roles available, and you’ll be hard pressed to find an IT job title that doesn’t fall under the umbrella of telecommunications. And for an industry that has relied on technology since the start, there has still been a nearly 7% increase in the average tech salary since 2021, suggesting the demand only continues to grow.

Average salary: US$115,940

Increase since 2021: +6.8%

10. Insurance

As insurance companies turn to digital services, the industry has seen growth in demand for IT workers who can help build, deploy, and maintain internal and external apps and services. In order to meet the growing demand for tech talent, companies such as Progressive have invested in internal upskilling bootcamp programs to help close the skills gap. Similar to the healthcare industry, the insurance industry deals with a high volume of often sensitive and confidential data, so there’s typically a demand for cybersecurity and data workers, in addition to developers and engineers. And as more insurance companies develop client-facing apps and services, there’s a need for UX/UI designers, developers, and engineers.

Average salary: US$114,522

Increase since 2021: 0%

Careers, IT Jobs, Salaries
https://www.cio.com/article/481457/the-10-highest-paying-industries-for-it-talent.html
IT execs’ doctorate research helps drive digital success Thu, 22 Jun 2023 10:00:00 +0000

According to Statista, $1.5T was spent on digital transformation initiatives globally in 2021, and that number is only continuing to grow. Yet research from BCG shows that 70% of digital initiatives fail, which translates to more than a trillion dollars in failure.

Why are digital transformation initiatives failing at such a high rate, and how do we set our companies up for success?

For an episode of the Tech Whisperers podcast, I had a chance to explore these questions in depth with John Hill, chief digital information officer at MSC Industrial Supply; Susan Nakashima, a recently retired IT executive and now adjunct professor at Pepperdine University; and Michael Seals, chief digital officer and SVP of strategy at Hussmann. Each of these leaders recently completed doctorate degrees and conducted research related to these challenges.

After the show, we spent some more time unpacking their research and leadership philosophies, focusing on the changing role of CIOs and the opportunity they have to lead, guide, and facilitate a new discussion at the digital C-suite table. What follows is that conversation, edited for length and clarity.

All were quick to emphasize, as Seals said, “gratitude and appreciation for the people reading this and listening to the podcast, because so many contributed to our knowledge. It was a team effort.” Hill observed that more than 160 CIOs contributed to his research, while Nakashima’s work was supported by the contributions of 400 survey participants, including 92 team leaders and their employees, representing three industries (utilities, entertainment, and nonprofit) across four regions of the country.

Dan Roberts: John, through your research, you created the term organizational digital agility (ODA), which includes three components: slack, alignment, and speed. In the podcast we talked about slack. How do you define alignment, and why is it important?

John Hill: Alignment reflects an organization’s ability to understand the relative importance of every initiative and operation against the others. Those that possess more effective methods of prioritizing initiatives across the organization have a lot of advantages over those that do not. Most importantly, they can resolve conflicts that come up when there are competing initiatives.

A lot of organizations will prioritize by saying, for example, here are our top five initiatives. Oftentimes, though, those aren’t in ordinal order. Associates can be assigned to those initiatives and a number of other projects and operations. If they don’t have that alignment process, when a conflict comes up, they might be working on the wrong thing. As a result, a more important initiative might fall behind.

A second issue is that, when assumptions behind initiatives change midyear, organizations that have that alignment process in place can reassess all the impacts and reprioritize everything. They’re able to be much more nimble during the year. That relates to a third dynamic: when new initiatives are proposed midyear, the organization knows how they fit within the current portfolio. And more importantly, they can provide clarity to the entire organization. Organizations that don’t do this well tend to just add the new initiative, and then there’s a lack of clarity as to what needs to be done when. If leaders have clarity on prioritizing organizational initiatives, it really allows the organization to react with a much higher level of speed and precision.

Roberts: Can you elaborate on the role of the CIO and how it relates to alignment?

Hill: The CIO is an orchestrator. They’re at the center, because there aren’t many initiatives that don’t require some form of technology, so if the CIO understands the importance of alignment, they have the ability to orchestrate that.

There are a few things they can do. First, make sure they have an ordinal ranking. It can’t be, ‘This is my top.’ It’s got to be one, two down to 50, 60, whatever the number is. Second, they need to understand true capacity in the organization. There are lots of tools out there, but probably the simplest is an enterprise Kanban board. Third, they can look to reduce the use of shared resources as much as possible. Smaller companies are going to have a challenge on that, but larger companies can try to create end-to-end teams that are made up of everything that a product owner needs to prioritize what they need to get done.
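Hill’s first two points can be made concrete with a small sketch. This is purely illustrative, with invented initiative names and resource labels: the key ideas are that every initiative carries a unique ordinal rank (1 through n, never a tied “top five”), so any conflict over a shared resource resolves deterministically.

```python
# Illustrative sketch of an ordinal initiative ranking. Every initiative
# has a unique rank, so when two initiatives compete for the same shared
# resource, the conflict resolves itself. All names below are invented.

initiatives = [
    {"name": "ERP upgrade",       "rank": 1, "needs": {"data-eng"}},
    {"name": "Customer portal",   "rank": 2, "needs": {"data-eng", "qa"}},
    {"name": "Warehouse refresh", "rank": 3, "needs": {"qa"}},
]

def resolve_conflict(a, b):
    """With a strict ordinal ranking, the lower rank number always wins."""
    return a if a["rank"] < b["rank"] else b

def next_assignment(resource, portfolio):
    """Highest-priority initiative that needs the given shared resource."""
    candidates = [i for i in portfolio if resource in i["needs"]]
    return min(candidates, key=lambda i: i["rank"]) if candidates else None

print(next_assignment("data-eng", initiatives)["name"])  # ERP upgrade
```

In practice the "needs" sets would come from a capacity view such as an enterprise Kanban board; the point of the sketch is only that a strict total ordering removes ambiguity when associates are pulled in two directions.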

Roberts: Mike, can you talk about the changing role of the CIO and the shift from working to get a seat at the table to having responsibility for getting the C-suite to the digital table?

Michael Seals: In my mind it’s not a digital table; it’s a digital restaurant. Digital conversations are occurring simultaneously, so the CIO has to be able to network across the organization. I recently changed my role to be more strategically focused and brought in a new CIO, Erin Williams, who has been a very powerful voice and addition to the Hussmann leadership team.

However, it’s a big job and can’t just be the CIO. There are a lot of technology leaders across the organization. To this end, we created a role for a digital architect to help us define the future state of our digital environment, drive our data initiatives and, most importantly, ensure all these initiatives integrate. Without this digital “manifold,” we would not be able to capture the real value of our digital initiatives. To fill this role, we elevated Kesava Annadorai, one of our bright, innovative thinkers. 

Quite frankly, IT is not the sole place for digital strategies anymore. They’re popping up across the organization. So, for the C-suite, it’s about how we bring all these things together and create that alignment across the organization that John talked about. That’s why it was so important for us to develop an enterprise governance model to help coordinate all these different digital activities.

When it comes to the digital table, I think the C-suite is fully engaged now. They see the threats, and they’re gaining awareness about the technology opportunity related to it. They’re also realizing that their strategies are so dependent upon technology that you can’t separate the two. The top-level leadership team must be part of that conversation.

Roberts: I’ve observed that the best leaders embody what I call the 7Cs of great leaders: They have the Customer at the center, they build great Culture, they Cultivate their people, they lead with Courage, they don’t manage Change but bring people along the journey, and they’re Collaborators and Communicators. Susan, can you speak to this and what stands out for you?

Susan Nakashima: I absolutely agree with the 7 Cs. The job is getting more complex, but I also believe it’s becoming an even more exciting time to be a leader in IT. We talk about the three-legged stool — people, process, and technology — and while I believe all three are critical, my heart’s really focused on the people component.

I think it’s important to get to know each member of our team, personally and professionally, understand what motivates them, and channel that energy for their success as well as the organization’s. I believe leveraging employees’ abilities and preparing them for the next step up the career ladder is really a privilege and something I find intrinsically rewarding.

Roberts: Turning to metrics, Mike, performance management is a key component of your framework. What are successful digital leaders measuring that others are not, and what’s the result?

Seals: Digital transformation is unique compared to other major enterprise-level initiatives like TQM and Lean Management. In those previous waves of management theory, there was always a single kind of purpose around them. Digital transformation is contextual. It’s tied to customer value and competitive advantage. Like a fingerprint, it’s unique to each organization. I spent a lot of time talking to a lot of really smart people on this topic, and nobody can say there is a single measure of digital transformation success.

My research settled on a two-part measurement: One, is it making the company stronger versus the competition — that’s your competitive advantage — and secondly, are we creating more customer value? Those are the two outcome variables that were built into my research. Ultimately, though, each organization is going to have a different metric that they measure. What’s most important, from what my research identified, is that the organization is measuring it and articulating it. Then the organization can rally behind it.

Roberts: John, let’s talk about metrics in terms of speed, which is another component of your ODA framework. Why is speed so hard to measure and what can leaders do to get their arms around it?

Hill: Given the rate of innovation and change in technology, speed is critical for organizations that want to capture competitive advantage faster than their peers. Yet finding a proxy for organizational speed is difficult. In my research, I couldn’t really find one. Still, we intuitively know that the longer a project takes, the greater the chance it goes off the rails. To study this phenomenon a little better, I used the size of the project as a proxy.

What I found is that the organizations that are better at chunking down their projects into smaller pieces have a better likelihood of beating any environmental factors that might impact their execution. Leaders will then ask me, ‘Well, what can I do besides Agile?’ One, put processes in place that address the speed at which decisions need to be made on issues that affect digitization efforts. Two, track the time it takes to make decisions. If a team says they have to go through three steering committees in order to get something resolved, that’s probably not the definition of speed. Third, going back to my other point about alignment, there are inevitably going to be conflicts that pure alignment alone won’t resolve. My suggestion is to put in place an organization-wide, cross-functional steering committee that is set up to resolve those conflicts immediately.

Roberts: Susan, your research focused on psychological safety and its impact on winning the hearts and minds of our teams. What is the definition of psychological safety and how does that contribute to driving speed and agility in our teams?

Nakashima: Amy Edmondson, who developed the concept of team psychological safety, described it as a shared belief by team members that the team is safe for interpersonal risk-taking. She went on to say that when individual contributors are selected to be team leaders based on their technical proficiency, they may not have the interpersonal skills they need to foster these open dialogues that are just so important.

That’s certainly what I’ve seen and continue to see, which is first-level technical leaders struggling with the human side of their leadership role. What I found is that, without establishing a psychological safety net, leaders are creating environments where employees are unwilling or afraid to share their ideas and expertise. Certainly, that’s what came about as part of my research. But with effective transformational leadership training, leaders are creating a safe environment, allowing their staff to truly bring forth their best thinking in our fast-paced and competitive industry.

Roberts: How are each of you leveraging your dissertation research in your day jobs?

Hill: At MSC, my peers and I took our list of strategic projects and made an ordinal list so we could give clarity to associates. Within the teams, we’re trying to drive people to understand that portfolio-based approach to sprints — you can’t compare a new feature against other things. That then helps drive the slack conversation. It’s much easier for me to get people to think about that portfolio and assigning different pieces to sprints versus saying, reserve time for slack.

The last thing is trying to create those end-to-end pods for the product teams. I’ve got a number of product teams directly in my organization, so we’re incubating it there first. Those teams will have everything they need to deliver — from business analyst, engineers, data analysts, data engineers, QA — so ultimately, there’s not a multi-step approach where we’re ready to go but the data team or the security team is not ready to go. These pods will have everything they need to be able to execute at a speed we expect.

Nakashima: I was privileged to be asked to present my research to the members of Innovate at UCLA at the fall CXO Exchange in December. Teaching a digital innovation class at Pepperdine University has also been an honor, as well as mentoring some of my students. I’m also pursuing opportunities to share my research as part of technical leadership training programs, so it’s a very exciting time.

Seals: To me, the research is interesting, but the real value for the audience is in the discussion about intellectual curiosity and intentional learning. The success of the CIO role is as much about understanding organizational theory as it is about technology. You have said it many times — the CIO role is unique in that it has a complete view of the enterprise. Understanding how organizations work is necessary; understanding why they work in the way they do is a differentiator for a CIO. That is the common theme across all our research — trying to understand the ‘why.’

Tune in to the Tech Whisperers podcast for more insights from these technology executives and takeaways from their doctoral research.

Digital Transformation, IT Leadership
https://www.cio.com/article/482176/it-execs-doctorate-research-helps-drive-digital-success.html
How AI is reshaping demand for IT skills and talent Thu, 22 Jun 2023 10:00:00 +0000

AI is quickly becoming an essential part of daily work. It’s already being used to help improve operational processes, strengthen customer service, measure employee experience, and bolster cybersecurity efforts, among other applications. And with AI deepening its presence in daily life, as more people turn to AI bot services, such as ChatGPT, to answer questions and get help with tasks, its presence in the workplace will only accelerate.

Much of the discussion around AI in the workplace has been about the jobs it could replace. It’s also sparked conversations around ethics, compliance, and governance issues, with many companies taking a cautious approach to adopting AI technologies and IT leaders debating the best path forward.

While the full promise of AI is still uncertain, its early impact on the workplace can’t be ignored. It’s clear that AI will make its mark on every industry in the coming years, and it’s already creating a shift in the skills employers are looking for. AI has also sparked renewed interest in long-held IT skills, while creating entirely new roles and skills companies will need to adopt to successfully embrace AI.

Emerging AI jobs and skills

The rise of AI in the workplace has created demand for new and emerging roles in IT and beyond. Chief among these are roles such as prompt engineers, AI compliance specialists, and AI product managers, according to Jim Chilton, CTO of Cengage Group.

Other emerging roles include AI data annotators, legal professionals specializing in AI regulation, AI ethics advisors, and content moderators to track potential disinformation around AI, says Robert Kim, CTO at Presidio.

Organizations are also seeking more established IT skills such as predictive analytics, natural language processing, deep learning, and machine learning, says Mike Hendrickson, VP of tech and dev products at Skillsoft. In addition to these skills, he says he’s also seen an uptick in demand for skills around large language models, ChatGPT, and similar generative AI bots.

AI has also created a demand for new C-suite roles “focused purely on leveraging generative AI throughout all aspects of business—from internal ways of working to external AI-powered product solutions for customers,” says Chilton.

“Those who embrace the technology and understand how to use it to accelerate and improve their work will be rewarded, while those that don’t will be left behind,” says Chilton. “Ultimately, the profitability barrier between those who embrace AI and those who don’t will determine the longevity of those businesses, or even those industries.”

Agile skills will put you a step ahead

Agile might not be the first skill you think of when it comes to AI, but companies that have already embraced agile workflows and mindsets will be in the best position to integrate AI tools and solutions. These organizations will be better prepared to accommodate the rapid change associated with AI, making it easier to adopt new technology as it emerges.

Organizations with an agile and DevOps mentality, where you’re continually deploying, redeploying, and testing, are already well accustomed to the process of releasing new processes, services, or products, and then getting feedback and continuously improving on it, says Hendrickson. This mentality will make it easier for those companies to quickly embrace and deploy AI tools and solutions compared to companies with slower processes, legacy technology, or roadblocks to deployment.  

There’s also a growing need for domain and organizational knowledge associated with AI, as it’s vital to have a deep understanding of organizational needs in order to determine which AI technologies will be best suited to a given application. “Those that have agile in their organization are going to be able to harness that domain expertise and domain knowledge much better,” Hendrickson says.

A stronger focus on security

AI also opens new doors for security threats and compliance issues that organizations must be prepared to address.

“On the technical side, I see security as hugely important,” says Hendrickson. “A lot of companies say, ‘We’re not letting people touch ChatGPT yet, we’re just not allowing it—it’s blocked.’” But end-users’ propensity for finding ways to improve their work processes will no doubt lead to greater levels of shadow IT around such emerging technologies, and thus, security implications will eventually need to be tackled beyond simply trying to hold back the tide.

Moreover, Hendrickson points to the fact that just a few years ago, discussions around machine learning centered around its ability to break encryption, and with quantum machine learning on the horizon, that concern has only increased. As companies navigate AI in the workplace, they’re going to need skilled professionals who can identify potential risks and pinpoint possible solutions.

There are also increased complexities around “managing the infrastructure and platforms that provide resources to power applications, and to store and access data,” says Kim. Organizations will need people capable of employing automation to help with securing, provisioning, and orchestrating these modern distributed platforms.

Soft skills persist

Technical skills are changing faster than ever—to the point where it’s likely that what students learn in their first year of a CS degree could be obsolete soon after they graduate. AI will only accelerate the pace of technology, and even automate some of the hard skills IT professionals have to offer, which means soft skills will only become more important.

“The half-life for hard skills or technical skills is getting shorter as technology rapidly changes,” says Chilton. “Just a few years ago there was a big push to have everyone learn to code. While we still need people who can code, the growth of low-code or no-code platforms now reduces the need for coding skills. Skills that are more enduring tend to be those such as the ability to think critically, problem-solve, communicate effectively, and collaborate with others.”

With AI, there’s also the opportunity for organizations to decrease mundane, tedious, and administrative tasks, says Kim. This will free up workers to focus on projects that require more brainpower and place a stronger emphasis on time management, team collaboration, and leadership to ensure success.

Demand for workers invested in continuous learning and development will also continue. Going into tech, workers make an “implicit commitment to themselves that they’ll continuously learn and improve because tech changes so quickly,” says Hendrickson. Companies will be even more motivated to hire tech workers who demonstrate a passionate commitment to learning new skills and maintaining a finger on the pulse of emerging technologies.

An eye on upskilling

As with most things IT, demand for AI skills will outpace the talent market, so companies will need to turn inward and identify opportunities for training.

To address this, Hendrickson says Skillsoft has created teams around individuals with AI backgrounds, tasking them with upskilling others in the organization. Such approaches to building talent from within provide a huge benefit, he argues, as they emphasize the importance of domain and organizational knowledge.

“You want to upskill the people in your organization because they already have the knowledge of the potential products or any benefits,” he says. Rather than hiring from competitors or outside the organization, “take the talent you have and upskill them into the right roles,” he adds. You’ll not only gain the skills you need to advance with AI adoption, but you’ll retain that expertise, and domain and organizational knowledge that’s so vital to digital transformation.

Another area that will benefit from upskilling is AI ethics. Having employees with strong domain expertise and organizational knowledge who can keep an eye on ethical questions that arise surrounding AI will be crucial. Hendrickson calls these folks the “humans in the loop,” as they provide the human checks and balances to monitor the veracity and value of generative AI.

Hendrickson gives the example of using Bard and ChatGPT to write code to scrape a website, and using one AI to check the other AI’s work. The final programs didn’t work, yet both AI bots claimed the programming was correct. In this case, a human eye was necessary to identify the mistakes made by both bots. Ultimately, results from generative AI are not solid enough to be relied on without having humans involved to fact check.
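
The sanity check Hendrickson calls for can be as simple as running AI-generated code against a page whose contents you already know. The sketch below is a hypothetical illustration of that idea, not the code from his anecdote: the sample HTML and expected links are made up, and it uses only Python’s standard library.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href attribute from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A page whose links we already know, so a human can verify the scraper.
SAMPLE_PAGE = """
<html><body>
  <a href="/docs">Docs</a>
  <a href="/pricing">Pricing</a>
  <p>No link here.</p>
  <a href="/contact">Contact</a>
</body></html>
"""

def scrape_links(html: str) -> list[str]:
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

links = scrape_links(SAMPLE_PAGE)

# The human-in-the-loop step: compare against what we know is on the page.
# An AI reviewer that merely "claims" the code is correct skips this check.
assert links == ["/docs", "/pricing", "/contact"], links
print(links)
```

The assertion is the point: a bot can claim the scraper works, but only a check against known ground truth proves it.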

“Even if we’re programming with a bot, the human is the one who’s going to hopefully make the final choice,” says Hendrickson.

Add AI sanity checks to your list of future must-have skills.

Artificial Intelligence, CIO, Emerging Technology, Generative AI, IT Leadership, IT Skills
https://www.cio.com/article/641589/how-ai-is-reshaping-demand-for-it-skills-and-talent.html
Generative AI won’t automate your way to business model innovation Wed, 21 Jun 2023 20:27:01 +0000

Generative AI is changing the world of work, with AI-powered workflows now slated to streamline customer service, employee experience, IT, and other fields. If we just slap the letters “GPT” onto our efforts, everything will be right on track, right?

Nope.

Integrating artificial intelligence into business has spawned enterprise-wide automation. One report estimates that 4,000 positions were eliminated by AI in May alone. I get it. Restructuring and automating are necessary parts of business survival. But as legendary Apple designer Jony Ive once advised Airbnb co-founder and CEO Brian Chesky as the company mulled cuts, “You’re not going to cut your way to innovation.”

In 2022, companies were still reeling from the rapid digital transformation efforts to survive the pandemic. Then ChatGPT came along and changed everything again. Generative AI is already starting to power the day-to-day tools we use, inspire a new wave of intelligent applications, and even reimagine the world of enterprise software and the world of IT. Not since the iPhone have we witnessed a technology change the course of human behavior and the imagination of people everywhere practically overnight. Future-proofing work now becomes a mandate and an opportunity to innovate.

This is a time when businesses are required to do so much more with less. At the same time, the tech sector is facing a severe skills shortage, with The Financial Times reporting that, despite all the headlines about layoffs, managers are having a harder, not easier, time finding the talent they need.

To reskill their employees and hire new tech talent, business leaders will first need to understand what’s changing and why. New research published jointly by Pearson and ServiceNow found that AI is already affecting the tech skills needed for tomorrow’s work. The study’s data shows that as automation eliminates repetitive tasks, the pendulum will swing toward the distinctly human skills of communication, creativity, and analytical thinking. The more we design AI to do the work where it excels, the less humans will have to behave like machines.

But not even a ChatGPT super prompt will make progress or transformation easier. A response generated by AI won’t solve the real business challenges facing executives across the organization, either. A common but critical challenge I hear from CIOs, CTOs, and CDOs every day is that they have a difficult time helping the C-suite understand that IT is the very architecture for the future of business, not a cost center. How do you convince decision-makers to collaborate on linking IT strategy with business strategy?

Technology won’t solve (all) your problems

When avant-garde artist, composer, musician, and film director Laurie Anderson was named artist-in-residence at Australian Institute for Machine Learning (AIML), she mused about the role of AI in creative problem-solving. She recalled one of her favorite quotes by, of all people, her meditation teacher: “If you think technology will solve your problems, you don’t understand technology — and you don’t understand your problems.”

Her point is that AI or generative AI isn’t a silver bullet. She compared AI to the purpose of art, which made me think differently about the role then of AI and creativity in business transformation.

“When people say the purpose of art is to make the world a better place I always think: better for who?” Anderson said in the same Wired article. “If I had to use one word to describe art it would be freedom,” she continued. “I’m curious about whether this freedom can be translated or facilitated by AI in a meaningful way.”

The same can be said for digital and business transformation. If work and technology are to serve the purpose of making businesses better in this digital renaissance, the question is, better for who? And what does better look like? What makes it more meaningful?

Like automation, the prompts most of us are experimenting with are rooted in what we know. Thus, the problems we’re trying to solve are based on how we see them today. With a more open mind, creativity, and human ingenuity, we can reframe our problems and also pursue previously unforeseen opportunities.

It’s really up to you to define what the future of business looks like, how it works, and where it can go. That’s the thing about the future: it hasn’t happened yet.

In an era of AI-first business transformation, AI becomes a force multiplier for growth, not just a tool for automation and elimination. The more successful companies will augment and empower employees and reimagine roles with artificial intelligence to outperform everyone else.

This means that shaping the future of business starts with a new blueprint. Many things have to be created to support an organizational construct that’s being necessitated in real time. Evolution in a time of Digital Darwinism also means that many things we do today are outmoded or obsolete and must be shed in order to survive and thrive.

Reinventing ourselves — with AI

What we need is to rethink our very understanding of how businesses operate and how technology transforms the organization for a new future. Technology leaders must now redefine their roles, beyond information technology, ITSM, and ITOM, to support dedicated imperatives that simultaneously drive business innovation.

Just think, in less than 10 years, organizations were pushed to evolve from industrial-era operations to become digital-first, data-first, and now AI-first companies. Where will your business be in 10 years? Future-proofing the workforce begins with understanding the effect AI will have on the emerging skills employees need.

Here are a few suggestions that will help you fly over silos in order to identify targeted opportunities aimed at aligning technology investments and topline goals:

  1. Understand C-Suite perspectives: Start by gaining a deep understanding of the concerns and objectives of the decision-makers. Top of mind for business leaders when it comes to AI are governance and security. Understanding their concerns will help you tailor your approach to address their specific pain points and priorities.
  2. Speak their language: Frame your arguments in terms of business outcomes and financial benefits. Focus on how technology can directly impact revenue growth, customer experience, operational efficiency, and competitive advantage. Avoid technical jargon and use concrete examples and case studies.
  3. Align with business goals: Clearly articulate how IT initiatives can directly support the broader business objectives of the company and help gain competitive advantages. Identify specific areas where technology can enable growth, such as customer and employee experience, optimized operations, supply chain optimization, and data-driven decision-making.
  4. Create a joint business + IT roadmap: Develop a clear roadmap in sync with the company’s strategic objectives, focused on business impact. Show the short-term and long-term benefits, along with the associated costs and risks. Break down the plan into manageable phases and highlight quick wins that can demonstrate early success and generate momentum.
  5. Quantify the value: Use data and metrics to demonstrate the potential return on investment (ROI) of IT initiatives. Also spotlight the other side of ROI (return on ignorance). What’s the opportunity cost of not doing these things? Estimate the financial impact of improved efficiency, increased sales, reduced costs, or enhanced customer satisfaction.
  6. Foster collaboration: Emphasize the importance of collaboration between IT and other business functions. Demonstrate how these collaborations accelerate business objectives. Highlight that technology is an enabler for innovation and that a cross-functional approach can lead to better solutions. Encourage open dialogue, seek input from decision-makers, and involve them in the decision-making process to gain their ownership and support.
  7. Continuous communication: Maintain ongoing communication with the decision-makers to keep them informed about the progress, milestones, and outcomes of IT initiatives. Provide regular updates, reports, and presentations that highlight the value created by technology investments.
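
Step 5 above, quantifying the value, often comes down to simple arithmetic presented clearly: classic ROI on one side, the cost of inaction (the "return on ignorance") on the other. The sketch below uses made-up figures purely to show the calculation, not data from this article.

```python
def roi(gain: float, cost: float) -> float:
    """Classic return on investment: net gain as a fraction of cost."""
    return (gain - cost) / cost

# Hypothetical initiative: $400k spent on automation, $700k in
# efficiency gains and recovered revenue over the same period.
cost, gain = 400_000, 700_000
print(f"ROI: {roi(gain, cost):.0%}")  # prints: ROI: 75%

# The "return on ignorance": what doing nothing would have cost.
# Assume the status quo bleeds $25k/month in manual rework.
months = 12
opportunity_cost = 25_000 * months
print(f"Cost of inaction over a year: ${opportunity_cost:,}")
```

Presenting both numbers side by side frames the IT initiative as revenue protection, not just spend, which is exactly the framing decision-makers respond to.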

Starting over

Technology leadership augmented with AI will unite stakeholders who have traditionally been separated by functional specialization and technical limitations, using proven strategies for change management and silo-unifying technologies. To earn the attention of business decision-makers and link IT strategy with business strategy, technology leaders must connect the role and value of digital transformation in driving business transformation.

Henry Ford observed in 1922, “Many people are busy trying to find better ways of doing things that should not have to be done at all. There is no progress in merely finding a better way to do a useless thing.”

More than ever, companies simply cannot afford to waste resources at a time when bringing together digital and business transformation is the only way to keep up with the quickening pace of innovation. And you can’t automate your way out of outdated processes.

Artificial Intelligence, Business, Generative AI
https://www.cio.com/article/642826/generative-ai-wont-automate-your-way-to-business-model-innovation.html
How to Craft a Cloud Experience Without Busting the IT Budget Wed, 21 Jun 2023 14:15:01 +0000

Today’s technology leaders grapple with a paradox.

They must do more with less while facilitating the work required to transform the business. That requires investing in digital capabilities that lead to desired business outcomes.

Data suggests IT leaders will spend despite a challenging macroeconomic environment that includes inflation, snarls in the supply chain and other financial pressures. Fifty-two percent of enterprises expect to increase spending on IT products and services in 2023, according to an ESG survey of senior IT professionals.[1]

Investing in IT without busting the budget is no mean feat, even for the most well-heeled organizations. But with costs soaring worldwide, it’s incumbent upon IT departments to focus on solutions that accelerate the business while creating cost efficiencies.


The Public Cloud’s Greatest Gift is Still OpEx


Some relief may be found in the playbook created by the public cloud market. In addition to rapid innovation, the public cloud helps startups and stable businesses alike scale up while navigating short-term financial challenges.

But like Father Time, the law of diminishing returns remains undefeated. As has been widely reported, the public cloud can also cost more than originally anticipated on a long enough timeline—and produce execution pitfalls and unwelcome regulatory surprises.

Yet the public cloud’s flexible financial model remains attractive to enterprises leery of large CapEx investments. Budget-conscious organizations are increasingly turning to pay-per-use subscriptions billed as OpEx.

Leveraging such a consumption-based model, IT departments can reduce overprovisioning by 42% and support costs by up to 70%, as well as realize a 65% reduction in unplanned downtime events, according to IDC research commissioned by Dell.[2]
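
Applied to a baseline, those IDC percentages translate into concrete numbers. The baseline figures below are hypothetical, chosen only to show the arithmetic; the reduction percentages are the ones cited above.

```python
# IDC-reported improvements from a consumption-based model (cited above):
OVERPROVISION_REDUCTION = 0.42   # 42% less overprovisioned capacity
SUPPORT_COST_REDUCTION = 0.70    # up to 70% lower support costs
DOWNTIME_REDUCTION = 0.65        # 65% fewer unplanned downtime events

# Hypothetical baseline for a mid-size IT estate (made-up numbers):
overprovisioned_tb = 200          # capacity bought but sitting idle, in TB
annual_support_cost = 500_000     # dollars per year
downtime_events_per_year = 20

remaining_capacity = overprovisioned_tb * (1 - OVERPROVISION_REDUCTION)
remaining_support = annual_support_cost * (1 - SUPPORT_COST_REDUCTION)
remaining_downtime = downtime_events_per_year * (1 - DOWNTIME_REDUCTION)

print(f"Idle capacity:   {remaining_capacity:.0f} TB")
print(f"Support cost:    ${remaining_support:,.0f}")
print(f"Downtime events: {remaining_downtime:.0f}")
```

Even with modest assumptions, the support-cost line alone frees hundreds of thousands of dollars a year, which is the budget argument the article is making.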

Such consumption-based models, paired with flexible infrastructure, provide a cloud experience without the headaches associated with data locality and security rules. And such solutions appeal to organizations seeking to spread out payments while continuing to run and grow their business.


The Business Cases for Cost-based Consumption


Organizations that have embraced the shift attest to the benefits of flexible, pay-per-use infrastructure. For instance, switching to a consumption-based model while replacing aging infrastructure has paid dividends for engineering and services firm NG Bailey.

In addition to greater cost control, the switch reduced the U.K. company’s IT restore time for critical business systems from 8 hours to just 30 minutes and decreased support calls related to infrastructure by 75%, freeing up IT staff to focus on other business priorities.

Such efficiencies dovetail with market research, which found that a consumption-based model can make organizations 38% more efficient overall, thanks to reductions in time decommissioning and retiring hardware, automated patching and other administrative blocking and tackling, IDC said.

Moreover, at a time when organizations are pushing for greater sustainability across their operations, the move helped NG Bailey cut its datacenter footprint in half—resulting in lower power consumption to support the company’s net-zero sustainability goals.

“For NG Bailey, it’s been transformational to know our IT costs and only pay for what we use,” said Stephen Firth, infrastructure manager for NG Bailey. “As a result, we can certainly control our costs more now. We know how much we’re paying and there are no hidden charges.”

NS Solutions, a group company of Nippon Steel Corporation, has leveraged the pay-per-use model to provide a managed cloud experience for customers who cannot migrate to the public cloud due to requirements for security and closer proximity to their data, as well as the need for low latency.

The move has helped NS Solutions cut the time to provision infrastructure from four months to two months and manage infrastructure more efficiently while reducing cost exposure from short-term cancellations.

In replacing traditional procurement with a consumption-based model, IT leaders surveyed by IDC spent on average $1.5 million per year less to run equivalent workload and application environments.

NS Solutions also removed the burden of IT management and operational tasks for its customers, freeing them up to attend to other business priorities. The company manages the entire IT environment from a single console and executes updates remotely—similar to how public clouds operate.


The Bottom Line

With cost and flexible computing benefits such as these, why isn’t everyone moving to a consumption-based operating model for IT?

Data suggests the shift is catching on, as 61% of organizations worldwide are interested in migrating to consumption-based models for IT investments, according to IDC.

With Dell APEX Flex on Demand, organizations can pick hardware and software configurations while paying only for what they use, with a single billing rate that helps them accurately predict future costs.

In a bear economy, more IT leaders will seek consistent cloud experiences that let them focus on supporting business stakeholders with digital solutions rather than focusing on managing infrastructure.

These IT systems must be tuned for optimal performance, scalability, agility and control as leaders innovate to deliver optimal outcomes—and business value.

Learn more about our portfolio of cloud experiences delivering simplicity, agility and control as-a-Service on-demand: Dell Technologies APEX.


[1] 2023 Technology Spending Intentions Survey, ESG, Nov. 2022
[2] The Business Value of Dell Technologies APEX as-a-Service Solutions, IDC, August 2021
Cloud Computing
https://www.cio.com/article/642691/how-to-craft-a-cloud-experience-without-busting-the-it-budget.html
ChatGPT is not your AI strategy Wed, 21 Jun 2023 14:13:28 +0000

Since its launch in November 2022, ChatGPT, together with Google Bard and other large language models (LLMs), has been the subject of articles in the most prestigious publications and on broadcast television, accumulated millions of posts and discussions worldwide, and sparked an overnight pivot in sales and investment strategy for many of the world’s largest organizations.

Employees, shareholders, customers, and partners are looking to organizational leaders to answer the questions: What is your AI strategy? What is your ChatGPT strategy? What does this mean for your workers?

This is a pivotal leadership moment. The approaches that worked for creating a digital strategy and a data strategy won’t work this time around, given the deeper questions raised by this technology together with the media attention it has received.

ChatGPT is a powerful tool, and within the context of the market imagined as a chessboard, it is like a pawn, capable of being promoted to one of the most powerful pieces on the board, but only if orchestrated together with the rest of the pieces.

An LLM is only one piece on the board

Understanding the capabilities of LLMs as one piece on the board is necessary to set a strategy for the future of the organization, and it anchors on the question of authority.

In layman’s terms, these language models take prompts such as “Create an AI strategy” and provide answers based on massive amounts of data that, at first glance, are surprisingly cogent.

At second glance, however, they distill information that already exists and recast it based on what it “seems” like the answer should be. They have no authority in and of themselves to tell you the actual answer.

If a researcher published a paper based on years of technical research, and a student with no technical experience summarized it in five bullet points, the summary might be an accurate rewording of the underlying paper, but the student would not know whether it was accurate, nor be able to answer follow-up questions without going back and quoting something else from the research that seemed like it might answer them.

The image for this article is a great example. It was generated by DALL·E 2 based on this prompt: “A photo of an ornately carved pewter chess set on a chess board in front of a window at sunrise.” The generated image does seem like a chess set on a chess board, but any human who has ever learned how to play chess – no expertise required – can instantly recognize that there should not be three kings on the board.

Practical applications where LLMs can be applied retain human authority, such as systems in which experts can interact with archived institutional knowledge. For example, if a network engineer could describe a particular file she knew existed but for which she had forgotten the name and location, an LLM could help provide much more precise recommendations than previous systems.
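
Even without an LLM, the retrieval half of that scenario can be sketched with plain bag-of-words cosine similarity; an LLM layered on top would add fuzzier matching and natural-language answers. The file names and descriptions below are invented for illustration, not drawn from any real archive.

```python
import math
from collections import Counter

def bag(text: str) -> Counter:
    """Turn text into a bag-of-words vector (word -> count)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical archive of institutional knowledge: file -> description.
ARCHIVE = {
    "network_setup.md": "steps to configure the corporate switches and vlans",
    "expense_policy.pdf": "company policy for travel expenses and reimbursement",
    "backup_runbook.txt": "runbook for nightly database backup and restore",
}

def find_file(description: str) -> str:
    """Return the archived file whose description best matches the query."""
    query = bag(description)
    return max(ARCHIVE, key=lambda f: cosine(query, bag(ARCHIVE[f])))

print(find_file("the document describing how to configure vlans on our switches"))
# -> network_setup.md
```

The engineer remains the authority: the system only ranks candidates, and she confirms whether the top hit is actually the file she meant.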

The key ingredient to the successful application of these models is that humans remain the authority on whether something is accurate and true, with LLMs serving as accelerants for experts to navigate and generate information.

The rest of the pieces

LLMs are only one type of piece on the board, alongside deep learning, reinforcement learning, autonomous artificial intelligence, machine teaching, sentiment analysis, and so on.

Ironically, many of the other pieces on the board have more readily available and practical applications than LLMs despite the fact that fewer people are familiar with them.

For example, some companies have developed autonomous artificial intelligence systems to control machines where there was no historical data. To account for a lack of historical data, simulations were made of the environment and of the machine, paired with curricula created by the humans who operated the machine, and deep reinforcement learning was leveraged for the system to create its own data through simulated experience of what to do and what not to do to successfully control that machine.
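
A toy version of that idea, learning a control policy purely from simulated experience rather than historical data, can be shown with tabular Q-learning. Everything here (the five-position "machine," the reward, the hyperparameters) is invented to illustrate the technique, not any specific production system, which would use deep reinforcement learning on a far richer simulation.

```python
import random

random.seed(0)

N_STATES, GOAL = 5, 4          # machine positions 0..4; reaching 4 is "success"
ACTIONS = (-1, +1)             # nudge the machine down or up
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.2

# Q-table: estimated future reward for each (state, action) pair.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state: int, action: int) -> tuple[int, float]:
    """Simulated machine: move within bounds, reward 1.0 only at the goal."""
    nxt = max(0, min(N_STATES - 1, state + action))
    return nxt, (1.0 if nxt == GOAL else 0.0)

for _ in range(2000):                      # episodes of simulated experience
    state = 0
    while state != GOAL:
        if random.random() < EPSILON:      # explore: try something new
            action = random.choice(ACTIONS)
        else:                              # exploit current knowledge
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt, reward = step(state, action)
        best_next = max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = nxt

# Greedy rollout with the learned policy: it should walk straight to the goal.
state, path = 0, [0]
while state != GOAL and len(path) < 10:
    state, _ = step(state, max(ACTIONS, key=lambda a: Q[(state, a)]))
    path.append(state)
print(path)
```

No historical dataset exists here: the agent generates its own data by trial and error inside the simulator, which is exactly the pattern the paragraph describes.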

Another powerful piece on the board is the application of artificial intelligence in real time to streaming data, moving organizations away from applying algorithms in nightly or weekly batches, or even manual jobs, and toward intelligence and learning applied in the moment.

These kinds of applications have strong economic potential, but because they cannot be accessed by anyone at home on a laptop or phone, they are not as well-known, and leaders are at risk of missing the signal of near-term value within the noise.

Autonomous, real-time, and generative AI all have valuable applications, and the most compelling can be found in combining them for exponential value. For example, when a customer calls a customer support center, real-time AI can analyze the customer’s voice for sentiment and transcribe their speech to text, which, up until recently, has then been used to perform searches and recommendations of knowledge articles to assist the customer care agent to resolve the customer concern within a matter of minutes.

The addition of generative AI to this picture means the transcribed customer speech can be leveraged as prompts to infer intent and generate more precise recommended responses to customer challenges, in seconds. Human authority can be maintained by embedding the underlying knowledge article(s) below the generated text for the customer care agent to validate generated responses.
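
Pieced together, that flow is a pipeline: transcript in, sentiment and intent extracted, and a suggested reply out with its source article attached for the agent to validate. The sketch below stubs every AI component with crude keyword rules; in a real system those stubs would be speech-to-text, sentiment, and LLM services, and every name and knowledge-base entry here is invented.

```python
# Hypothetical knowledge base: intent -> (supporting article, summary).
KNOWLEDGE = {
    "billing": ("KB-101: Disputed charges", "Refunds post within 5 business days."),
    "outage": ("KB-202: Service status", "Check the status page before troubleshooting."),
}

def sentiment(transcript: str) -> str:
    """Stub for a sentiment service: crude keyword rule."""
    angry = ("angry", "frustrated", "unacceptable")
    return "negative" if any(w in transcript.lower() for w in angry) else "neutral"

def infer_intent(transcript: str) -> str:
    """Stub for LLM intent inference: keyword match against known intents."""
    t = transcript.lower()
    return "billing" if ("charge" in t or "bill" in t) else "outage"

def respond(transcript: str) -> dict:
    """Generate a recommendation with its source attached for agent review."""
    intent = infer_intent(transcript)
    article, summary = KNOWLEDGE[intent]
    return {
        "sentiment": sentiment(transcript),
        "intent": intent,
        "suggested_reply": f"{summary} (based on {article})",
        "source": article,   # the agent validates against this, as described above
    }

result = respond("I'm frustrated, I was charged twice this month!")
print(result["sentiment"], result["intent"])   # negative billing
print(result["suggested_reply"])
```

The `source` field is the design point: the generated reply always arrives alongside the article that grounds it, so human authority over the answer is preserved.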

Amid the sea of change, with AI pieces receiving varying degrees of investment and recognition, the leaders who create the most value for their customers and organizations will be those who can see the entire board and understand the value of each piece without losing sight of the broader strategy in favor of a quick tactic.

Strategy can’t precede vision

The answer to the question of an AI strategy that makes the most of all the pieces on the board starts with vision. What is the envisioned future of the organization? What is the envisioned and desired future of the market?

The inevitable answer that comes to mind for many is to research trends or to gather data. What does Gartner or IDC say is the future?

These resources and practices are valuable and have their place, but the responsibility of setting the vision for the future of the organization cannot be outsourced, and it should not be a reaction to a hypothetical trend envisioned by someone else based on investments other organizations are making.

Leaders must start with the hard but essential question of what future they want to create for their people, their partners, and their customers, and then work backward to the present as the starting point. This process clarifies what investments must be made to create that future, with LLMs and other technologies serving not as the basis of strategy, but as powerful tools making the strategy possible.

Learn more about DataStax here.

About Brian Evergreen

DataStax

Brian Evergreen advises Fortune 500 executives on artificial intelligence strategy. He’s the author of the book Autonomous Transformation: Creating a More Human Future in the Era of Artificial Intelligence, and the founder of The Profitable Good Company, a leadership advisory that partners with and equips leaders to create a more human future in the era of AI.

Artificial Intelligence, Machine Learning
https://www.cio.com/article/642694/chatgpt-is-not-your-ai-strategy.html 642694
Using business technology to help Ukrainians in need Wed, 21 Jun 2023 14:01:10 +0000

War has come to your home. You’re forced to leave all you know and travel to a foreign land. You need food, water, clothing, and other life essentials right now. 

But you’re not sure where to turn in the new land. And even if you’ve heard about distribution centers, there could be challenges ahead, including the need to visit multiple centers and stand in long queues for those essentials—wasting time you don’t have.

And then there are the organizations trying to help. They’re also facing challenges such as identifying and reaching those in need and knowing what to provide, which can lead to wasting donated items, especially perishable goods. 

Those are just some of the challenges that the millions of displaced Ukrainians and the non-governmental organizations (NGOs) committed to supporting those refugees have faced since the war began.

A free solution on the go

EY, the multinational professional services enterprise, has created an application that provides assistance in the event of disasters and conflicts worldwide. Known as the Emergency Response Application (ERA), the free mobile solution was built using the SAP Business Technology Platform and acts as a conduit to match people with resources.

Making the connection 

Before the application’s deployment, there was no collective database of NGOs working in Poland. That made it difficult, as mentioned earlier, for refugees to locate what they needed and for NGOs to deliver. 

With EY ERA, NGOs enter data into the application about their products and services, including food, hygiene items, and clothes currently available in their warehouses, as well as the locations of their distribution centers. Based on that information, a consolidated list of NGOs has been created, which has been automatically translated into Ukrainian and can be sorted based on the application users’ (the refugees’) location.
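The location-sorted list the article describes amounts to a nearest-center search over the NGO data. A minimal sketch in Python follows; the coordinates, NGO names, and stock lists are invented for illustration (EY has not published ERA's implementation), with great-circle distance standing in for whatever routing Google Maps provides.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical distribution-center records, as NGOs might enter them.
centers = [
    {"ngo": "NGO A", "city": "Warsaw", "lat": 52.23, "lon": 21.01,
     "stock": ["food", "hygiene"]},
    {"ngo": "NGO B", "city": "Krakow", "lat": 50.06, "lon": 19.94,
     "stock": ["clothes"]},
    {"ngo": "NGO C", "city": "Lublin", "lat": 51.25, "lon": 22.57,
     "stock": ["food"]},
]

def nearest_with(item, user_lat, user_lon):
    """Centers currently stocking the requested item, nearest first."""
    matches = [c for c in centers if item in c["stock"]]
    return sorted(matches,
                  key=lambda c: haversine_km(user_lat, user_lon,
                                             c["lat"], c["lon"]))

# A user near Lublin looking for food:
ranked = nearest_with("food", 51.2, 22.5)
```

Filtering on current stock before sorting is what lets a refugee avoid a trip to a center that has run out of what they need.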

Help at their fingertips

EY ERA has an intuitive interface that makes the application simple to use. Refugees can quickly and easily access the application on their cell phones, sort by NGO in their locations, and use Google Maps to get directions to the local distribution centers. 

By knowing what each center offers at any given time, where the center is located, and its hours of operation, the refugees can pinpoint their most convenient and appropriate donation area to obtain the material or assistance they need. They can do that before they head out, saving them precious time and helping them to avoid the frustration of leaving a center empty-handed. 

“Thank you for the application. I was able to find centers that offer all that I need during such a difficult time for my family and me. It is easy to use and helpful, especially the map locator. You don’t have to waste time visiting all the centers,” says Kateryna Karpenko, an EY ERA user.

EY ERA also helps NGOs by providing a vehicle to get the word out about their products and services and keeping track of warehouse stock—what’s in demand—as it’s being distributed.

Going viral

To promote the application, a social media and web campaign was created in Ukrainian and Polish, generating more than 1.8 million views. The first phase of the application was launched in Poland in April 2022, in less than six weeks. Currently, 114 distribution centers are using EY ERA.

 “The war in Ukraine changed everything,” says Axel Janz, director at EY Technology and the leader of the EY ERA project. “Almost every NGO in the countries bordering Ukraine has expanded or modified its profile to effectively support those affected by the war. EY ERA helps NGOs and refugees to distribute material support more efficiently. In the beginning, we launched it in Poland. In the next period, we plan to extend its operation to other countries.”

Based on its success and its purpose of uplifting people in need, EY won a 2023 SAP Innovation Award for the EY Emergency Response Application – Contributing to the humanitarian aid for Ukraine. The SAP Innovation Awards are celebrating their 10th anniversary in 2023. To learn more about EY ERA, see EY's Innovation Awards pitch deck and LinkedIn article.

Digital Transformation
https://www.cio.com/article/641736/using-business-technology-to-help-ukrainians-in-need.html 641736
7 key questions CIOs need to answer before committing to generative AI Wed, 21 Jun 2023 10:00:00 +0000

Some companies use generative AI to write code and some use it to create marketing text or fuel chatbots. And then there are others, like SmileDirectClub, that create images to answer the question of how to better serve their customers.

SmileDirectClub, the Nashville-based teledentistry company, uses generative AI to create teeth. Or, more specifically, to help people understand how their teeth can be corrected.

“We have a platform called the SmileMaker platform,” says CIO Justin Skinner. “We take a photo of your teeth with your phone and we generate a 3D model representation and we can project with AI what a straightening plan would look like, how long it would take, and what it would look like when we’re done.”

Existing generative AI platforms like OpenAI’s ChatGPT, Google Bard, or Stable Diffusion aren’t trained on 3D images of teeth. Not that any of these were even available when SmileDirectClub started.

SmileDirectClub built its own generative AI, using its own data set, on its own servers, in compliance with HIPAA, GDPR, and other regulations.

The company started the project three years ago with an external partner. Then, when that didn't work, it hired its own team to build the proprietary models it needed.

“There’s nothing like this out there to the level of accuracy we need,” Skinner says. “Teeth are very tricky. There aren’t a lot of distinguishing marks, so getting an accurate 3D model from your phone is a difficult task.”

The first generation of the tool went live in November last year in Australia, and in May this year in the US, and around 100,000 people have used it so far. The next release will include a photorealistic projection of what the new teeth will look like.

Today, the tool only offers a draft treatment plan for customers, Skinner says. They still need to see a dentist or use an impression kit at home for the high-definition impression. This may also change in the future as the technology improves.

But that’s not the only way SmileDirectClub looks to take advantage of generative AI.

“We’re exploring—for cost reduction and efficiency reasons—leveraging tools like ChatGPT and Bard, and we look forward to playing around with Microsoft Copilot,” Skinner says.

His company isn’t alone.

According to a recent poll of senior executives conducted by The Harris Poll on behalf of Insight Enterprises, 39% of companies have already established policies or strategies around generative AI and 42% are in the process of developing them. Another 17% plan to, but haven’t started yet. Only 1% of companies have no plans to develop plans for generative AI.

In addition to how SmileDirectClub answers the critical question about customer care, here are seven others that CIOs need to answer that can help them formulate generative AI strategies or policies.

Where is the business value?

According to the Harris Poll, 72% of executives say they plan to adopt generative AI technologies in the next three years in order to improve employee productivity. And 66% say they plan to use it to improve customer service. In addition, 53% say it will help them with research and development, and 50% with automating software development or testing.

And that’s just the tip of the iceberg as far as enterprise use cases of generative AI are concerned—and it’s changing quickly.

CIOs have to work hard to stay on top of developments, says Skinner. More importantly, CIOs have to understand how the possibilities of generative AI apply specifically to their business.

“That’s the first question,” he says. “Do I really understand these things? And do I deeply understand how to apply it to my business to get value?”

Given the fast pace of change, understanding generative AI means experimenting with it—and doing so at scale.

That’s the approach that Insight Enterprises is taking. The Tempe-based solutions integrator currently has 10,000 employees using generative AI tools and sharing their experiences so the company can figure out the good as well as the bad.

“It’s one of the largest deployments of generative AI that I know of,” says David McCurdy, Insight’s chief enterprise architect and CTO. “I’m on a mission to understand what the model does well and what the model doesn’t do well.”

The novelty of generative AI might be cool, he says, but it isn’t particularly useful.

“But we sat down and fed it contracts and asked it nuanced questions about them: where are the liabilities, where are the risks,” he says. “This is real meat and bones, tearing the contract apart, and it was 100% effective. This will be a use case all over the world.”

Another employee, a warehouse worker, came up with the idea of using generative AI to help him write scripts for SAP.

“He didn’t have to open a ticket or ask anyone how to do it,” McCurdy says. “That’s the kind of stuff I’m after, and it’s incredible.”

The number one question every CIO should ask themselves is how their company plans to use generative AI over the next one or two years, he says. “The ones who say it’s not on the table, that’s a bad mistake,” he adds. “Some people feel they’re going to wait and see but they’re going to lose productivity. Their boards of directors, their CEOs are going to ask, ‘Why are other companies loving this tech? Why are we not?'”

But finding opportunities where generative AI can provide business value at the level of accuracy it’s capable of delivering today is just one small part of the picture.

What is our deployment strategy?

Companies looking to get into the generative AI game have a wide variety of ways to do it.

They can fine-tune and run their own models, for example. Every week, new open source models become available, each more capable than the last. And data and AI vendors are offering commercial alternatives that can run on premises or in private clouds.

Then, traditional SaaS vendors like Salesforce and, of course, Microsoft and Google, are embedding generative AI into all their services. These models will be customized for specific business use cases and maintained by vendors who already know how to manage privacy and risk.

Finally, there are the public models, like ChatGPT, which smaller companies can access directly via their public-facing interfaces, and larger companies can use via secured private clouds. Insight, for example, runs OpenAI's GPT-3.5 Turbo and GPT-4 hosted in a private Azure cloud.

Another option for companies with very particular requirements but no interest in training their own models is to use something like ChatGPT and then give it access to company data via a vector database.

“The value is using existing models and staging your own data beside it,” McCurdy says. “That’s really where innovation and productivity are going to be.”

This is functionally equivalent to pasting documents into ChatGPT for it to analyze before asking your questions, except that the documents don't have to be pasted in every time. Insight, for example, has taken all the white papers it's ever written, along with its interview transcripts, and loaded them into a vector database for the generative AI to refer to.
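The "stage your own data beside the model" pattern is typically built from two pieces: an embedding-based retrieval step over company documents, and a prompt that carries the retrieved text to the model. The sketch below is illustrative only: a toy bag-of-words similarity stands in for a real embedding model and vector database, and the documents are invented.

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding' -- a real system would call an
    embedding model and store the vectors in a vector database."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, docs):
    """Stage retrieved company documents beside the question, so the
    model answers from them without re-pasting them every time."""
    context = "\n---\n".join(retrieve(query, docs, k=2))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical corpus of white papers and transcripts.
docs = [
    "White paper: cloud migration cut our storage costs by 30 percent.",
    "Interview transcript: the CTO discussed generative AI governance.",
    "White paper: warehouse automation and robotics deployment guide.",
]
prompt = build_prompt("What did the cloud migration white paper find?", docs)
```

The resulting prompt would then be sent to a hosted model such as the privately deployed GPT instances the article mentions; only the retrieval layer needs to know about the company's data.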

Can we keep our data, customers, and employees safe?

According to a May PricewaterhouseCoopers report, nearly all business leaders say their company is prioritizing at least one initiative related to AI systems in the near term.

But only 35% of executives say their company will focus on improving the governance of AI systems over the next 12 months, and only 32% of risk professionals say they’re now involved in the planning and strategy stage of applications of generative AI.

A similar survey of senior executives, released by KPMG in April, showed that only 6% of organizations have a dedicated team in place to evaluate the risk of generative AI and implement risk mitigation strategies.

And only 5% have a mature responsible AI governance program in place, though 19% are working on one and nearly half say they plan to create one.

This is particularly important for companies using external generative AI platforms rather than building their own from scratch.

For example, SmileDirectClub’s Skinner is also looking at platforms like ChatGPT for the potential productivity benefits, but is worried about the data and privacy risks.

“It’s important to understand how the data is protected before jumping in head first,” he says.

The company is about to launch an internal communication and education campaign to help employees understand what’s going on, and the benefits and limitations of generative AI.

“You have to make sure you’re setting up security policies in your company and that your team members know what the policies are,” he says. “Right now, our policy is that you can’t upload customer data to these platforms.”

The company is also waiting to see what enterprise-grade options will come online.

“Microsoft Copilot, because of integration with Office 365, will probably be leveraged first at scale,” he says.

According to Matt Barrington, emerging technologies leader at Ernst & Young Americas, about half of the companies he talks to are worried enough about the potential risks that they're taking a full-stop approach to ChatGPT and similar platforms.

“Until we can understand it, we’re blocking it,” he says.

The other half are looking to see how they can build the right framework to train and enable people.

“You have to be cautious but you have to enable,” he says.

Plus, even at the 50% who've put the brakes on ChatGPT, their people still use it, he adds. "The train has left the station," he says. "The power of this tool is so big that it's hard to control. It's like the early days of cloud computing."

How do we guard against bias?

Dealing with bias is hard enough with traditional machine learning systems, where a company is working with a clearly defined data set. With large foundational models, however, like those used for code, text, or image generation, this training data set might be completely unknown. In addition, the ways the models learn are extremely opaque—even the researchers who developed them don’t fully understand yet how it all happens. This is something that regulators in particular are very concerned about.

“The European Union is leading the way,” says EY’s Barrington. “They’ve got an AI Act they’re proposing, and OpenAI’s Sam Altman is calling for hard-core regulations. There’s a lot yet to come.”

And Altman’s not the only one. According to a June Boston Consulting Group survey of nearly 13,000 business leaders, managers, and frontline employees, 79% support AI regulation.

The higher the sensitivity of the data a company collects, the more cautious companies have to be, he says.

“We’re optimistic about the impact AI will have on business, but equally cautious about having a responsible and ethical implementation,” he says. “One of the things we’ll heavily lean in on is the responsible use of AI.”

If a company takes the lead in learning how to not only leverage generative AI effectively, but also to ensure accuracy, control, and responsible use, it will have a leg up, he says, even as the technology and regulations continue to change.

This is why transcription company Rev is taking its time before adding generative AI to the suite of tools it offers.

The company, which has been in business for nearly 12 years, started out by offering human-powered transcription services and has gradually added AI tools to augment its human workers.

Now the company is exploring the use of generative AI to automatically create meeting summaries.

“We’re taking a little bit of time to do due diligence and make sure these things work the way we want them to work,” says Migüel Jetté, Rev’s head of R&D and AI.

Summaries aren’t as risky as other applications of generative AI, he adds. “It’s a well-defined problem space and it’s easy to make sure the model behaves. It’s not a completely open-ended thing like generating any kind of image from a prompt, but you still need guardrails.”

That includes making sure the model is fair, unbiased, explainable, accountable, and complies with privacy requirements, he says.

“We also have pretty rigorous alpha testing with a few of our biggest users to make sure our product is behaving the way we anticipated,” he says. “The use that we have right now is quite constrained, to the point where I’m not too worried about the generative model misbehaving.”

Who can we partner with?

For most companies, the most effective way to deploy generative AI will be by relying on trusted partners, says Forrester Research analyst Michele Goetz.

“That’s the easiest way,” she says. “It’s built in.”

It will probably be at least three years before companies start rolling out their own generative AI capabilities, she says. Until then, companies will be playing around with the technology in safe zones, experimenting, while relying on existing vendor partners for immediate deployments.

But enterprises will still have to do their due diligence, she says.

“The vendors say they’re running the AI as a service and it’s walled off,” she says. “But it still might be training the model, and there might still be knowledge and intellectual property going to the foundational model.”

For example, if an employee uploads a sensitive document for proofreading, and the AI is then trained on that interaction, it might then learn the content of that document, and use that knowledge to answer questions from users at other companies, leaking the sensitive information.

There are also other questions that CIOs might want to ask of their vendors, she says, like where the original training data comes from, and how it's validated and governed. Also, how is the model updated, and how are the data sources managed over time?

“CIOs have to trust that the vendor is doing the right thing,” she says. “And this is why you have a lot of organizations that are not yet ready to allow the newer generative AI into their organizations in areas that they can’t control effectively.” That is particularly the case in heavily-regulated areas, she says.

How much will it cost?

The costs of embedded AI are relatively straightforward. Enterprise software companies adding generative AI to their tool sets—companies like Microsoft, Google, Adobe, and Salesforce—make the pricing relatively clear. However, when companies start building their own generative AI, the situation gets a lot more complicated.

In all the excitement about generative AI, companies can sometimes lose track of the fact that large language models can have very high compute requirements.

“People want to get going and see results but haven’t thought through the implications of doing it at scale,” says Ruben Schaubroeck, senior partner at McKinsey & Company. “They don’t want to use public ChatGPT because of privacy, security, and other reasons. And they want to use their own data and make it queryable by ChatGPT-like interfaces. And we’re seeing organizations develop large language models on their own data.”

Meanwhile, smaller language models are quickly emerging and evolving. “The pace of change is massive here,” says Schaubroeck. Companies are starting to run proofs of concept, but there isn’t as much talk yet about total cost of ownership, he says. “That’s a question we don’t hear a lot but you shouldn’t be naive about it.”

Is your data infrastructure ready for generative AI?

Embedded generative AI is easy for companies to deploy because the vendor is adding the AI right next to the data it needs to function.

For example, Adobe is adding generative AI fill to Photoshop, and the source image it needs to work with is right there. When Google adds generative AI to Gmail, or Microsoft adds it to Office 365, all the documents needed will be readily available. However, more complex enterprise deployments require a solid data foundation, and that’s something that many companies are still working toward.

“A lot of companies are still not ready,” says Nick Amabile, CEO at DAS42, a data and analytics consulting firm. Data has to be centralized and optimized for AI applications, he says. For example, a company might have data spread between different back-end systems, and getting the most value out of AI will require pulling in and correlating that data.

“The big advantage of AI is that it’s able to analyze or synthesize data at a scale humans aren’t capable of,” he says.

When it comes to AI, data is fuel, confirms Sreekanth Menon, VP and global leader for AI/ML services at Genpact.

That makes it more urgent than ever to ready the enterprise for AI, with the right data, cleansed data, tools, data governance, and guardrails, he says. CIOs should also be asking, he adds, "Is my current data pipeline enough for my generative AI to be successful?"

That’s just the start of what it’s going to take to get an enterprise ready for generative AI, he says. For example, companies will want to make sure that their generative AI is explainable, transparent, and ethical. That will require observability platforms, he says, and these platforms are only starting to appear for large language models.

These platforms need to be able to track not just the accuracy of results, but also cost, latency, transparency, bias, and safety and prompt monitoring. Then, models typically need consistent oversight to make sure they’re not decaying over time.
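The tracking requirements listed above (latency, cost, prompt monitoring, and so on) boil down to instrumenting every model call. As a minimal sketch of what an LLM observability layer records, with the model call stubbed and a made-up per-token price:

```python
import time

def call_model(prompt):
    """Stub standing in for a real LLM API call."""
    return {"text": f"echo: {prompt}", "tokens": len(prompt.split())}

METRICS = []  # an observability platform would aggregate these records

def observed_call(prompt, usd_per_token=0.00002):
    """Wrap each model call to record latency, token usage, and
    estimated cost -- the raw material monitoring dashboards are built on."""
    start = time.perf_counter()
    out = call_model(prompt)
    METRICS.append({
        "latency_s": time.perf_counter() - start,
        "tokens": out["tokens"],
        "est_cost_usd": out["tokens"] * usd_per_token,
        "prompt_chars": len(prompt),  # hook for prompt monitoring
    })
    return out["text"]

reply = observed_call("summarize this contract clause")
```

Accuracy, bias, and safety metrics require more than a wrapper (labeled evaluations, human review), but the same per-call record is where they attach.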

“Right now, you need to be putting guardrails and guiding principles in place,” he says. Then companies can start incubating generative AIs and, once they reach maturity, democratize them to the entire enterprise.

Artificial Intelligence, Business IT Alignment, CIO, Data and Information Security, Generative AI, ICT Partners, Infrastructure Management, IT Leadership, IT Strategy
https://www.cio.com/article/482235/7-key-questions-cios-need-to-answer-before-committing-to-generative-ai.html 482235
Sysco’s recipe for growth centers on IT Wed, 21 Jun 2023 10:00:00 +0000

When Tom Peck joined Sysco during the peak of the COVID-19 pandemic, his major goal was ensuring the survival of the world’s largest food service delivery company and helping its thousands of customers stay afloat.

The Houston-based multinational was still delivering food supplies to sparsely populated buildings, cafeterias, airports, and nursing homes across the US—and helping its customers “reinvent” their businesses with curbside check-in, touchless menus, and QR codes for menus.

“We were one of the most impacted industries in the pandemic economy,” says Peck, who joined Sysco as EVP and chief information and digital officer in December 2020. “The pandemic forced us to review our company and the entire industry.”

While the company was well into its cloud journey when the pandemic hit, such a seismic event for a food distributor called for a major overhaul of its strategic vision, R&D plans, and digital transformation, Peck says.

The blueprint, called 'Recipe for Growth,' was announced in May 2021, roughly a year after Sysco appointed Kevin Hourican, a former top executive at CVS Health and CVS Pharmacy, as CEO.

Recipe for Growth, for which Sysco has earned a 2023 CIO 100 Award for innovation and IT leadership, is based on applying B2C principles to Sysco’s B2B business, and calls for the company to grow 1.5 times the size of the entire industry—estimated to be valued at $330 billion in the US alone, Peck says.

“Surviving the pandemic wasn’t enough,” Peck adds. “We needed to transform ourselves … and grow faster than our competitors and faster than our markets required. The Recipe for Growth has everything to do with how we run the business—the cloud and the underlying technology, how we deliver software and all the fundamental foundational capabilities that underpinned our strategy.”

Sysco’s key ingredient: IT

At its core, Recipe for Growth “relies heavily on Sysco being a great technology shop, getting rid of technical debt, migration to the cloud, delivering microservices and using artificial intelligence,” Peck says.

Having been very acquisitive over the years, Sysco found itself burdened with many on-premises data centers and legacy applications. To modernize, it had to migrate and rewrite many applications for the cloud to gain efficiencies, speed production, and reduce tech debt, says Peck. He insists that Sysco uses, and will continue to use, all three major public cloud providers to support the scope of its business and the diversity of its expansive customers' needs.

Aside from the cloud, the recipe's main ingredient is a complex, homegrown e-commerce system called Sysco Shop, which enables the application of B2C principles, in particular personalization and customization, to a global B2B business. Peck says this is delivered via an analytics strategy centered on the company's homegrown data warehouse, its Amperity customer data platform, and Salesforce CRM, as well as Tableau for sales analytics and Tealium, which generates user clickstream analytics.

Like most companies, Sysco traditionally ran its B2B e-commerce business in a bulk reordering fashion.  But the ability to employ the agility and flexibility of the cloud, combined with personalization microservices for each customer, has been very good for business.

“We’ve been able to deliver in a more agile way, and every two weeks [roll out] new capabilities that are much more consumer-like; you don’t just transact, like reordering,” the CIDO says, adding the combination of analytics and e-commerce personalization tools, such as product recommendations, tools to manage inventory, curated menus, and loyalty programs, is expanding its value to enterprise customers. “We’re seeing bigger carts because we’re upselling and cross-selling products and making recommendations. It’s that combined with our investments and sales tools that are driving a lot of growth.”

Adding AI to the IT mix

Sysco’s programmers and data scientists used a range of tools, including JavaScript, Kafka, and Python, to build the company’s homegrown e-commerce and data warehousing platforms, and the company has deployed Blue Prism robotic process automation at its many distribution centers.

Sysco, which sits between the food suppliers and large customers, uses SaaS platforms when possible, but its core technology stack is homegrown—and the IT team will build on that with emerging tools such as AI. "The base engine for the e-commerce and data warehouse is all custom code, but we use best-of-breed boutique solutions surrounding the core for everything else," Peck says.

Using analytics from Salesforce and Tealium, as well as historical ordering data from each customer, Sysco’s goal is to continue making custom recommendations, offer more self-service tools and, with AI, a more refined product mix recommendation. Sysco currently uses AI to detect anomalies in purchasing habits and to determine its customers’ propensity to buy new products.

Sysco has also been implementing machine learning to help “smooth inventory forecasts by predicting customer behavior, inventory levels, and pricing,” Peck says.
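The two techniques the article attributes to Sysco, smoothing inventory forecasts and flagging anomalies in purchasing habits, can be illustrated with simple statistics. This is a toy sketch, not Sysco's method: a moving average stands in for a real demand-forecasting model, a z-score threshold stands in for a real anomaly detector, and the order figures are invented.

```python
from statistics import mean, stdev

# Hypothetical weekly case orders for one SKU from one customer.
orders = [100, 98, 103, 101, 99, 180, 102]

def smoothed_forecast(history, window=4):
    """Naive moving-average forecast for the next period."""
    return mean(history[-window:])

def anomalies(history, threshold=2.0):
    """Flag orders more than `threshold` standard deviations
    from the mean -- a crude stand-in for an anomaly detector."""
    mu, sigma = mean(history), stdev(history)
    return [x for x in history if abs(x - mu) > threshold * sigma]

forecast = smoothed_forecast(orders)  # pulled upward by the spike
spikes = anomalies(orders)            # the 180-case week stands out
```

A production system would model seasonality, pricing, and customer behavior jointly, but the underlying idea, predicting the expected order and reacting to deviations from it, is the same.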

Integrating advanced AI into its RPA-heavy operations, along with edge computing, represents a huge opportunity the company is now exploring, Peck says. Deploying large language models (LLMs) will also allow Sysco to use a much greater abundance of data in the cloud to curate menus based on trends and to detect evolving purchasing behaviors.

“The next logical step is AI,” Peck says. “Machine learning was about comparing a lot of inputs. Large language models allow you to gobble up more data and scan through much more information, whether it’s in another cloud or on premise. We would be able to search and get feedback on restaurant or social trends about recipes and food and surf that back to us or to our customers.”

Catalyzing change

As complex as it is to write AI algorithms, technology is the easy part of Sysco’s next-generation Recipe for Growth, the CIDO says.

There are several challenges to implementing such advanced technology, namely, how to handle change management and how fast to scale, but the benefits far outweigh the challenges, Peck adds.

To that end, Sysco is teaching its sales teams about the benefits and time savings in eliminating mundane paperwork and tasks, and allowing them to pursue new leads and grow their business.

Peck says moves such as these are all about winning the hearts and minds of employees and getting them to embrace AI’s ability to suggest more predictive types of sales calls and order suggestions.

“Your sales teams may view it as a threat, but in reality, it’s not a threat,” he says. “It’s enabling them to spend more time on the customer relationship and nurture new business when they’re spending less time doing research and looking at pricing.” 

If implemented correctly, AI’s benefits are numerous for all employees and the company overall, he maintains.

“It helps us to service our customers better, having exact fill rates, so our trucks show up on time and it helps our company to focus,” he says. “Sometimes companies that want to digitally transform tend to be too broad and want to do everything. We’re very laser focused about key things we’re trying to do and it’s a rallying cry for the company—a morale booster. It helps us feel like we’re part of something special.” 

The breadth and depth of Sysco’s CIDO is well known in industry circles. Last month Peck received the annual MIT Leadership Award at the MIT CIO Symposium in Cambridge, Mass.

“While all of our finalists were doing exceptional work, Tom stood out for his deep knowledge of, and connection with, the needs of the business,” says George Westerman, a senior lecturer at MIT Sloan School of Management, who sits on the awards committee. “It showed in the way he talked, the topics he found important, and the results he and his team helped to drive.”

CIO, Digital Transformation, IT Leadership, IT Strategy]]>
https://www.cio.com/article/641578/syscos-recipe-for-growth-centers-on-it.html 641578
How Data is Changing the Media & Entertainment Industry Wed, 21 Jun 2023 00:47:14 +0000

In the media and entertainment business, success means engaging viewers and creating “stickiness.” That happens when you understand viewer preferences and how audiences interact with and consume content. The key is making informed decisions from what can be massive amounts of data, managed effectively.

Nearly every business in this industry collects massive amounts of digital information: gaming, cable, and streaming companies maintain detailed usage information on millions of subscribers; online advertising brokers make trillions of decisions daily based on customer clicks; and content providers live or die by meeting the entertainment and information needs of their customers.

Traditionally, media and entertainment companies used data only for basic operational reporting. Today, data is defining the industry. Companies now combine and analyze historical and real-time operational data to personalize content, unlock hidden insights for new revenue streams, improve customer engagement, and create interactions in which customers are almost immersed in the content.

  • A plan for every customer

Today’s communications, media, and entertainment companies provide multiple services. Internet service providers (ISPs), for example, have the unenviable task of delivering multiple services that few customers understand well. When customers call, it’s usually because they have a problem, and figuring out the solution may require jumping through several technical hoops.

ISPs collect a lot of data about how customers use their services. That information could be used for root cause analysis, said Anthony Behan, Managing Director of Communications, Media & Entertainment at Cloudera.

“By proactively monitoring the network, ISPs can learn how many devices are connected, how big the downloads are, and how much streaming data is being used,” he said. That can make it easier for them to quickly pinpoint the source of a problem and have a solution ready, he added.
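To make this concrete, here is a minimal sketch of the kind of root-cause triage described above: classify each telemetry sample against simple thresholds and report the most common issue. The record fields, thresholds, and issue labels are all hypothetical illustrations, not any ISP's actual schema.

```python
# Hypothetical sketch: aggregating per-customer usage telemetry to narrow
# down the likely cause of a reported problem. Field names and thresholds
# are illustrative only.
from collections import Counter

def likely_root_cause(records, bandwidth_cap_mbps=100):
    """Classify each telemetry record, then return the most common issue."""
    issues = Counter()
    for r in records:
        if r["devices_connected"] > 25:
            issues["too_many_devices"] += 1
        elif r["streaming_mbps"] + r["download_mbps"] > bandwidth_cap_mbps:
            issues["bandwidth_saturated"] += 1
        elif r["signal_strength_dbm"] < -70:
            issues["weak_wifi_signal"] += 1
        else:
            issues["no_obvious_issue"] += 1
    return issues.most_common(1)[0][0]

sample = [
    {"devices_connected": 8, "streaming_mbps": 80, "download_mbps": 35,
     "signal_strength_dbm": -50},
    {"devices_connected": 6, "streaming_mbps": 90, "download_mbps": 20,
     "signal_strength_dbm": -55},
    {"devices_connected": 5, "streaming_mbps": 40, "download_mbps": 10,
     "signal_strength_dbm": -60},
]
print(likely_root_cause(sample))  # bandwidth_saturated
```

In practice the classification would be far richer (and likely model-driven), but the shape is the same: a stream of usage samples in, a ranked hypothesis about the problem's source out.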

ISPs could even use the data they collect to proactively contact each customer with a custom plan or offer based on that person’s internet usage, one that saves them money or improves service quality.

  • Protection earns trust

With data breaches, phishing attacks, and ransomware on the rise, ISPs could further enhance customer loyalty by monitoring customer equipment to proactively identify and block threats. Peace of mind then becomes a selling point.

Communication carriers and internet providers could also use the bounty of information they gather from customer equipment to conduct root cause analysis and identify scenarios in which an outage is likely. They are beginning to proactively troubleshoot problems before the customer notices them, or to alert customers to the need for an upgrade or repair, thus avoiding a call to the contact center.

Creative data use also opens up opportunities to improve and personalize customer engagement for subscription TV and video-on-demand services. Combining usage information with anonymized demographics could create customized options for certain customer categories or even individual households.

“If they know there are children in the home, they can make sure certain types of shows aren’t available unless parents request them,” Behan said. They can also allocate bandwidth to make sure that slowdowns and freezes aren’t a problem during peak viewing hours in a household or for specific events. The need for real-time insights and monitoring is critical to customer satisfaction.

Finally, responsible data governance practices can keep service and content providers compliant with constantly changing rules around customer privacy. Creating and promoting responsible practices for data protection not only keeps media and entertainment firms out of the crosshairs of regulators and the media but reassures customers that they’re a business that deserves their trust.

Think big: How can creative data use change the rules in your business? Visit Cloudera to learn more.

Business Intelligence, Data Management]]>
https://www.cio.com/article/641831/how-data-is-changing-the-media-entertainment-industry.html 641831
iomart: Making the cloud straightforward Tue, 20 Jun 2023 20:05:17 +0000

Founded in 1998, iomart began providing cloud services as the new millennium arrived. In the quarter of a century since, the company has grown into one of the U.K.’s most successful and trusted providers of cloud services and solutions.

Today, the Glasgow-based firm has customers in both the public and private sectors, including businesses in virtually every industry. More than 400 employees and an extensive team of cloud experts work from six offices located within the U.K. and the 13 data centers iomart owns and operates throughout the country.

We recently caught up with Aaron Tebbutt, strategic vendor alliance manager, to learn more about iomart, get his thoughts on what it means for the company to be among the prestigious providers to have earned the VMware Cloud Verified distinction, and learn what he sees as the pivotal issues shaping the cloud.

“We like to make the cloud straightforward for our customers,” says Tebbutt. “That means making it clear which problems our technology services and solutions will help an organization solve. Our cloud experts have extensive experience working at every point in the cloud transition roadmap, so we have the knowledge needed to help decide what cloud strategy is best and then work closely with them to execute that vision.”

The company offers private cloud, including a dedicated enhanced cloud, public cloud and seamless integrations with the major hyperscalers, and extensive hybrid-cloud options. Some of its many additional services include Infrastructure-as-a-Service, Backup-as-a-Service, Disaster Recovery-as-a-Service, and a full portfolio designed to address enterprises’ security needs. These include robust, fully-encrypted networking solutions, an enhanced security operations center, and technologies like endpoint detection and response.

Notably, iomart also works closely with its strong ecosystem of partners. In addition to VMware, these include Microsoft, Barracuda, Cohesity, and Dell among others. Such collaborations ensure that enterprises have access to the technologies, software-defined capabilities, and hardware they require on-premises and in the cloud.

“We are very committed to strengthening our partnerships with industry leaders like VMware,” adds Tebbutt. “It ensures we can offer our customers the very latest innovations. Honors like receiving the VMware Cloud Verified distinction are also reassuring to our customers. It shows we know our stuff, and can support and deploy solutions that reflect the very highest standards of our strategic partners.”

Not surprisingly given the many hundreds of organizations that rely on iomart for their success in the cloud, Tebbutt also has visibility into what is motivating enterprises to go to the cloud, and where they need the most help.

“It is difficult for many organizations to decide where to start their cloud journey,” he says. “Our teams work closely with them to develop a clear timeline that plans exactly what will happen and when. It is important to remember there is no one-size-fits-all cloud product. It’s about finding the right approach to address the challenges and demands any individual organization faces.”

Tebbutt also notes that while there are important nuances to consider for specific industries and each organization is unique, several core challenges are particularly common. They include:

  • Data security: Organizations are handling more data than ever before. Identifying what is sensitive or deserving of special treatment, then storing it appropriately and cost-effectively – both things the cloud excels at – is essential, particularly in heavily regulated sectors.
  • Enabling remote workers: Remote work is essential in many industries, particularly now when it can be so difficult to acquire and retain valued employees. Many companies’ existing cloud assets and strategies are not up to the task.
  • Getting the most out of the cloud: The cloud radically reduces enterprises’ capital expenditures, but there is great pressure on businesses to wring every last drop of value out of their investments. Organizations want to fully optimize their existing cloud assets, ensure those assets achieve their full potential to impact the business, and generate a strong return on investment. New cloud investments must also directly address the business challenges the organization faces.

And the future of the cloud?  Tebbutt sees it enabling business disruption rather than technology disruption.

“We will see a dramatic move to hybrid cloud as organizations address data protection and sovereignty requirements while still taking advantage of hyperscale systems that offer infinite-scale compute and storage capabilities. With this, data will become a currency and a differentiator that enables organizations to make data-driven decisions with confidence, measure and predict customer behaviors, and innovate at an unprecedented pace. Having the tools and horsepower to store, control, manipulate, and draw insight from this wealth of data will be the future of cloud even as data becomes increasingly distributed and unfathomable in size.”

Learn more about iomart and its partnership with VMware here.

Cloud Computing]]>
https://www.cio.com/article/641820/iomart-making-the-cloud-straightforward.html 641820
Enabling a sovereign cloud using a multicloud foundation: Technology executive considerations Tue, 20 Jun 2023 19:51:51 +0000

The adoption of multiple clouds by European business and public agencies continues to increase due to the need for competitive differentiation and growth through speed, quality, and the delivery of great customer experiences. To achieve these goals, IT and business executives must manage challenges across data governance, security, and compliance to protect sensitive customer, citizen, and country data using privacy, access, and security controls.  For further details, check out the IDC report on sovereign cloud here.

Data has become both a business and national asset. The ability of enterprises and governments to control data and run workloads while operating within legal jurisdiction, complying with multi-jurisdictional regulations, and protecting against unauthorized access requires a critical set of sovereign capabilities which are essential for customer trust and business growth. 

Given this transformational journey, sovereign clouds should be included as part of a multicloud strategy. Using common sovereign tenets and principles is becoming increasingly necessary while at the same time supporting capabilities that deliver efficiency, reduce complexity, and enable standardization. This approach provides a foundation from which IT and business teams can ensure that the necessary solutions are in place to control, secure, and store data in compliance with relevant regional, national, and (where applicable) international laws and guidelines. A multicloud architecture can provide layers to meet local and national regulations, and thereby give organizations greater choices and flexibility across multiple sovereign cloud environments. Fundamentally, a multicloud approach to sovereign cloud is about unlocking and supporting emerging data economies with as little complexity and uncertainty as possible. This approach empowers enterprises to focus more on serving their stakeholders through innovation and growth. Additionally, a multicloud approach to sovereign cloud enables legacy application and back-end infrastructure modernization.

Technology executives must understand that establishing a sovereign cloud is complex and difficult, especially without assistance from a partner or vendor with deep expertise. There are various complex dimensions spanning data security and data protection, understanding regulations and their impact on technology needs, and the complexity of driving standardization and controls across multiple clouds.  In addition, data classification for a sovereign cloud is essential for its success.  This complexity requires technology leaders to build expertise with strategic partners who have the depth and bench strength to deploy a sovereign cloud.  As part of a sovereign cloud foundation, multicloud tools enable organizations to tailor infrastructure to their specific needs, and respond in an agile way to data privacy, security, and geopolitical disruptions.

When it comes to vendor and partner support, customers should not expect to create a sovereign cloud on their own, given the complexity and expertise required. Consider, for example, sovereign cloud deployments built with large global partners such as VMware and Broadcom. VMware’s technology has been a critical foundation for driving innovation and scale for governments and public agencies in Europe for many years. With a set of tools that can position customers to work across multiple clouds, VMware can enable the critical foundational requirements of a sovereign cloud. Once its pending acquisition by Broadcom closes, VMware will be supported by Broadcom’s lengthy track record of significant R&D investments, an innovation-focused culture, and commitment to customers. Broadcom’s acquisition of VMware creates opportunities for the new, combined organization to offer customers a more complete set of sovereign cloud capabilities, which could help accelerate digital transformation across Europe while furthering the needs and objectives of sovereign clouds.

Digging deeper into multicloud technology capabilities, enterprises must consider how to manage the necessary controls, security, and data transparency required for a sovereign cloud. Without the right technology foundation that empowers these capabilities, the successful deployment of a sovereign cloud is simply not possible. Additional key areas customers must consider when enabling a sovereign cloud include:

  • Basing the technology architecture on a resilient and scalable design that takes advantage of process automation across application, service, and operational tasks and capabilities
  • Focusing on data and security policies that deliver layers of digital protection and sovereignty across the software development pipeline and service operations
  • Enabling processes that empower jurisdictional controls and the ability to adjust to geopolitical dynamics, so business and IT teams can manage and control confidential data via advanced methods and practices
  • Enabling an organization to adopt country-specific regulatory, compliance, and data requirements (regardless of the underlying cloud platforms) with data control points and reporting mechanisms

Multicloud solutions like those offered by VMware provide European enterprises and the public sector with a flexible, consistent digital foundation to build, run, manage, connect, and protect their most important and complex workloads. Once Broadcom completes its pending acquisition of VMware, the combined company can make new and significant R&D investments, develop a stronger and broader set of innovations, and foster larger professional service partnerships focused on multicloud capabilities to power and enable sovereign cloud.

Learn more about sovereign cloud in IDC’s Market Perspective: Considering a Sovereign Cloud? A Blueprint for IT and Business Executives

About Stephen Elliot:

IDC

Stephen Elliot is Group Vice President at IDC, responsible for P&L and team management for multiple programs spanning IT operations, observability, AIOps, ITSM, DevOps, automation, virtualization, multicloud management, FinOps, endpoint management, log analytics, container management, DaS, cloud-native management, and software-defined compute. Mr. Elliot advises senior IT, business, and investment executives globally on creating strategies and operational tactics that drive the execution of digital transformation and business growth.

Multi Cloud]]>
https://www.cio.com/article/641808/enabling-a-sovereign-cloud-using-a-multicloud-foundation-technology-executive-considerations.html 641808
Minimizing the negative impact of IT through design and circularity Tue, 20 Jun 2023 19:24:00 +0000

In a previous blog, I described the three areas of product development and operation that HPE Aruba Networking focuses on when designing our products for IT efficiency and sustainable operations: how products are made, how they work, and how they are used.

But what about the product lifecycle itself?

With sustainability now a growing business imperative, product lifecycle has become a hot topic. Of keen interest is understanding what is being done to extend the lifecycle of deployed products, and how best to plan for a network transition fiscally and responsibly.

Product lifecycle begins with design. Innovating for resiliency and efficiency leads to longer lifecycles. Since our early days, HPE Aruba Networking has been purposefully building products to deliver secure, high-performance connectivity in challenging environments as a general practice, and we’ve invested in hardware and software advancements, such as AI automation and unified management and control, that deliver maximum operational efficiency without compromise.

On average, our products are designed for a minimum useful life of 10 years—5 years of in-market availability and 5 years of hardware warranty and software support. Exact timeframes vary by product, but a recent portfolio analysis revealed that many of our products have had lifespans of 14 years or more. That said, businesses typically refresh products on a 3-to-5-year cycle.

While new features that accelerate business agility are highly attractive, our experience suggests that the decision to refresh a network often has more to do with a customer’s desire to transition depreciated assets off their books rather than to replace products that are no longer usable. So how do customers minimize the negative impact of their IT investments while continuing to optimize their balance sheets and drive business growth?

Shifting the Lifecycle Paradigm

Acquiring network assets has traditionally involved making large CapEx investments at the time of purchase, after which assets depreciate over time. Once completely depreciated, those assets remain on the company books at their salvage value and require ongoing maintenance and lifecycle management until they are dispositioned.
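The traditional model above can be sketched with a simple straight-line depreciation schedule, in which an asset's book value falls by a fixed amount each year until it reaches salvage value. The dollar figures here are hypothetical, purely for illustration.

```python
# Illustrative sketch of the traditional CapEx model: straight-line
# depreciation of a network asset down to its salvage value.
def straight_line_schedule(cost, salvage, useful_life_years):
    """Return the asset's book value at the end of each year (year 0 = purchase)."""
    annual = (cost - salvage) / useful_life_years
    return [round(cost - annual * year, 2) for year in range(useful_life_years + 1)]

# A hypothetical $500,000 purchase with a $20,000 salvage value over 5 years:
print(straight_line_schedule(500_000, 20_000, 5))
# [500000.0, 404000.0, 308000.0, 212000.0, 116000.0, 20000.0]
```

Once the schedule bottoms out at salvage value, the asset still sits on the books and still needs maintenance, which is exactly the burden the as-a-service model described next is meant to remove.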

An increasingly attractive alternative to this model is to consume network infrastructure-as-a-service. Not only does this remove the burden of large upfront investments, but it also simplifies accounting activities, extends product lifecycles through circularity, relieves IT teams from managing the end-of-life transition of older equipment, and enables customers to sustainably migrate to new technologies to support their business trajectory.

HPE GreenLake for Aruba offers customers flexible network-as-a-service (NaaS) options that deliver these benefits. In addition to purchasing simplicity and proactive architecture management, our NaaS solutions also offer lifecycle circularity and responsible end-of-life disposition.

The circular economy is created by extending the life of assets. It is our aim to renew any product we take back for reuse as certified pre-owned equipment rather than disassembling it for recycling and destruction. Many assets within a customer’s network still have market value despite being fully depreciated. HPE’s Asset Upcycling program returns value to our customers for legacy IT assets that can create investment capacity to help accelerate their network transitions and fund new projects.

Customers that take advantage of HPE GreenLake for Aruba networking services can also leverage HPE’s Accelerated Migration Service, which shifts IT assets to a flexible usage model to help accelerate deployments without disruption and free up cash by recovering the value of equipment still in use.

The value gained is not trivial. To put it in perspective, over the last three years, HPE Financial Services returned approximately $1.3 million per day back to our customers with HPE Asset Upcycling and HPE Accelerated Migration services. That’s over $1.1 billion in the last 3 years directly back to customer budgets.

Most importantly, particularly for customers compelled to share environmental impact reports with stakeholders, HPE is the only company within our industry that can provide a custom Circular Economy Report that summarizes Scope 3 emissions and provides details on materials returned, method of disposition, associated power savings, and year-over-year comparisons when available.

With global reach and one of the largest technology renewal operations in the world, in 2021, HPE processed more than 3 million assets weighing close to 27 million pounds. This amounted to approximately 85% of the material being renewed for a second life, with the remaining 15% responsibly recycled with a minimum amount of material directed to landfill.

By extending the useful life of tech assets through design and circularity, HPE aims to help our customers maximize the value of the equipment they acquire while minimizing the environmental impact of their IT investments.

Being a Force for Good

We believe technology’s greatest promise lies in its potential for positive change, and our team is passionate about being a force for good. April is HPE volunteer month, and to reinforce our commitment to supporting HPE’s sustainability goals, the HPE Aruba Networking division launched Earth Day challenges designed to empower our employees with a better understanding of HPE’s sustainability posture and to accelerate sustainability initiatives in our customers’ environments, our workplaces, and our communities. I am incredibly pleased and humbled by the tremendous enthusiasm and level of participation, as it is a great sign of things to come.

This blog was published on blogs.arubanetworks.com on April 19, 2023.

Additional resources

Sustainability at HPE Aruba Networking

Sustainability with HPE GreenLake for Aruba

Aruba Sustainability Through HPEFS

To learn more, visit us here

Zero Trust]]>
https://www.cio.com/article/641809/minimizing-the-negative-impact-of-it-through-design-and-circularity.html 641809
3 ways to advance sustainability in high performance computing Tue, 20 Jun 2023 17:39:18 +0000

Finding the answer to the world’s most pressing issues rests on one crucial capability: high performance computing (HPC). With HPC, complex questions that have puzzled humankind for centuries are being unraveled at record speeds–such as unlocking mysteries of the universe, finding cures for diseases, sequencing DNA, and mitigating the impacts of climate change. 

The supercomputers that power HPC, however, require more and more energy to operate. For instance, the power consumption of the world’s fastest supercomputer rose from 7.9MW in 2012 to 29.9MW in 2022. It’s no wonder that–per a recent study commissioned by Dell Technologies, Intel and NVIDIA–HPC operators have elevated the importance of sustainability to the number two priority, even surpassing price. 
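As a quick sanity check on those numbers, the implied compound annual growth rate in top-supercomputer power draw can be computed directly (the figures are taken from the paragraph above):

```python
# Implied compound annual growth in the fastest supercomputer's power
# consumption, from 7.9 MW in 2012 to 29.9 MW in 2022 (10 years).
growth = (29.9 / 7.9) ** (1 / 10) - 1
print(f"{growth:.1%}")  # roughly 14% per year
```

A sustained ~14% annual increase in power draw is what makes efficiency, consolidation, and renewable sourcing the levers that follow.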

We applaud and support the efforts of HPC operators to improve sustainability. That means we must collectively and continuously work to manage HPC’s power requirements in areas where we can have a measurable impact. There are three ways to do this:

  1. Maximize hardware energy efficiency 
  2. Consolidate infrastructure
  3. Use renewable energy sources

Deploying Energy-Efficient Hardware

The Hyperion Research study found three geo-specific motivations for the heightened prioritization of sustainability in HPC deployments, motivations that shape region-specific conversations about sustainability:

  • The Asia-Pacific region is driven by environmental, social, and governance (ESG) goals
  • In Europe, organizations are concerned about rising, and thus increasingly unmanageable, energy costs
  • North America is motivated to do more with less

Universally, 60% of respondents in the study said sustainability initiatives will impact their on-premises HPC budgets. Interestingly, half of those respondents said sustainability initiatives will reduce on-premises purchases; the other half indicated that sustainability initiatives will increase them.

For respondents anticipating an HPC budget reduction, the goal is to reallocate dollars to sustainability initiatives. Dollars are moving toward purchasing new energy-efficient hardware, devoting resources to optimization efforts, or changing where HPC workloads are run.

Dell Technologies, NVIDIA, and Intel are supporting these organizational efforts. For instance, Dell Technologies, a committed steward of sustainability, has worked to decrease energy intensity across its entire portfolio, achieving a 76% reduction since 2013.

Consolidating Infrastructure

Beyond the individual hardware components, designing and deploying HPC infrastructure is a sophisticated undertaking. Ensuring infrastructure is consolidated and set up for optimal performance is a key lever to accelerating sustainability initiatives. At Durham University, this principle is fundamental to providing HPC resources. Since 2001, Durham University’s tier 1 national supercomputing facility has supported scientists and researchers around the world and is using HPC to build a digital simulation of the universe, starting with the Big Bang.

During an infrastructure upgrade, Dell Technologies helped Durham University optimize and modernize its infrastructure to achieve performance enhancements and improved sustainability. The results include 18X faster data backups, 72% less power, and a reduction of 60 tons of CO2 per year.

Using Renewable Energy Sources

In addition to reducing the energy intensity of hardware and consolidating HPC infrastructure, sustainability initiatives are also advancing with the use of renewable energy sources. A great example is the partnership between atNorth and BNP Paribas. When it came time to update its infrastructure, BNP Paribas, a leading bank in the European Union, turned to atNorth and Dell Technologies to help it expand responsibly and build a “future-proofed” HPC infrastructure.

BNP Paribas moved a portion of its data center operations to atNorth’s facility in Iceland, an economical, energy-efficient data center operating 100% on renewable energy. As a result, BNP Paribas lowered its total cost of ownership (TCO) with 50% less energy and 85% less CO2 output. In addition, BNP Paribas increased power efficiency at higher compute density, achieving future-proofed, more environmentally responsible HPC resources.

“By using only renewable energy sources and decreasing our carbon footprint by 85%, BNP Paribas is realizing its dual mission to reduce its environmental impact and better serve our customers,” shared Ricardo Jantarada, Global Head of Telecom & Datacenter at BNP Paribas. 

Advancing Environmentally Sustainable HPC

HPC holds great promise to help solve some of humankind’s most pressing, complex problems. At the same time, deploying HPC sustainably has risen to a top priority, as shown by research with organizations around the world. Advancing sustainable HPC deployments requires action in three major areas: hardware energy efficiency, infrastructure consolidation, and renewable energy sources. Starting today helps accelerate sustainability initiatives to create a better tomorrow for all.

Want to learn more about HPC’s amazing possibilities? View The Seven Wonders of the HPC World here.

Learn more about Dell Technologies’ environmentally sustainable HPC solutions here

______________

Sponsored by Dell Technologies, Intel and NVIDIA, the Hyperion Research survey had worldwide coverage with a strong focus on international sites and sustainability questions relating to HPC, artificial intelligence (AI), quantum computing, and the results of using HPC. Data was collected via direct interviews and surveys of HPC and Cloud Service Provider (CSP) data center managers, directors, or leads as well as scientists, researchers, and engineers. 

IT Leadership]]>
https://www.cio.com/article/641797/3-ways-to-advance-sustainability-in-high-performance-computing.html 641797
Start with digital documents to make your workplace more accessible Tue, 20 Jun 2023 16:07:53 +0000

In today’s rapidly evolving work and customer landscape, accessibility is a crucial consideration in ensuring employees and customers can fully participate in the experiences brands provide – and, more generally, it is part of being a responsible corporate citizen.

However, a recent Adobe survey found that only about half of brands are investing in making experiences more accessible for customers (51%) and employees (48%). Companies that don’t make their experiences more accessible forgo a myriad of benefits, and they face some very real consequences: the same survey found 85% of consumers say they will decrease spending if companies do not make their customer experiences accessible, and 29% say they’ll refuse to spend any money at all.

There are nearly endless initiatives technology leaders can undertake to make both the workplace and their customers’ experiences more accessible. One place every organization can start is the digital documents their employees and customers use every day.

Design with broad accessibility in mind

Improving the digital experience begins with design considerations that meet the needs of all customers and employees, including those with disabilities. Regardless of age, ability, or experience, everyone reads differently and can benefit from a more tailored reading experience. Adobe formed The Readability Consortium with the University of Central Florida (UCF), Readability Matters, and Google to help make digital reading and reading comprehension more equitable for people across the globe, combining cognitive research, open-source tools, and user testing across a wide swath of age groups and ability levels.

As part of this work, Adobe research scientists recently published two papers that provide user-centered, inclusive design recommendations based on user and reading tests with populations with and without dyslexia. 

The research reveals that all readers benefit from alternative reading formats and custom interfaces, rather than a one-size-fits-all format. Between character, word, and line spacing, preferences vary from person to person.

Technology leaders who prioritize tools that provide both flexibility and support for a myriad of reading styles and abilities will create an environment where stakeholders can participate fully and thus both receive value and give value back to the brand.

Bring AI to PDFs

About 30 years ago, Adobe invented the PDF format and then opened it to the world. Today, our experts put the total number of PDFs in circulation at a conservative 3 trillion. What started as a way to preserve formatting and intent has developed into one of the foundational formats and enablers of digital transformation. Ensuring that the many PDFs in your organization are accessible to the broadest possible range of employees and customers presents an opportunity to increase both productivity and the quality of your relationships with your most important stakeholders.

Basic accessibility tools like screen readers and tagging technology have existed for years, but bridging the digital accessibility gap has remained a manual, time-consuming process requiring extensive training and experience. At the same time, government regulations like the European Accessibility Act are expanding and user expectations keep rising, a pressing concern for companies that want to compete globally.

While this may feel daunting, artificial intelligence now provides a way for technology leaders to successfully tackle what have been expensive, difficult – and sometimes even impossible – accessibility initiatives.

Adobe has continued to innovate PDF accessibility for decades, introducing Acrobat features such as PDF Read Out Loud and readability features in Sensei AI-powered Liquid Mode for Acrobat Reader Mobile. Recently, we introduced the Adobe PDF Accessibility Auto-Tag API to help companies automate the process of making digital documents more accessible. The API also leverages the power of Adobe Sensei AI to improve PDF accessibility for customers who use screen readers.

The new API automates and scales the process of tagging content structure and reading order inside PDFs—from long-form text to mixed-content documents across different languages—helping individuals with disabilities like blindness, low vision, and dyslexia who use assistive technologies navigate a PDF. It also enables developers to apply the API to large backlogs of existing PDFs, saving time and budget while complying with the latest accessibility regulations.
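A team working through such a backlog might wrap the tagging call in a small batch driver that records per-document success or failure. In this sketch, `tag_fn` stands in for the real API call (for example, an HTTP request to the tagging service), and the report structure is a hypothetical illustration, not Adobe’s SDK:

```python
# Illustrative batch driver for auto-tagging a PDF backlog.
# tag_fn stands in for a real tagging API call; the report format is
# a hypothetical sketch, not part of Adobe's SDK.

def autotag_backlog(documents, tag_fn):
    """Apply tag_fn to each (name, data) pair and collect a per-document report."""
    report = {}
    for name, data in documents:
        try:
            report[name] = {"status": "tagged", "structure": tag_fn(data)}
        except Exception as exc:
            report[name] = {"status": "failed", "error": str(exc)}
    return report

# Stub standing in for the remote service, used only for demonstration.
def fake_tag(data):
    if not data:
        raise ValueError("empty document")
    return ["H1", "P", "Table"]

report = autotag_backlog([("a.pdf", b"%PDF-1.7"), ("b.pdf", b"")], fake_tag)
```

Capturing failures instead of aborting matters at backlog scale: a few malformed files shouldn’t stop the other thousands from being tagged.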

Early adopters are already getting value from the API, with results showing up to a 100% improvement in the accessibility and usability of a PDF for people with disabilities, and the ability to tag up to 100 pages in less than a minute. For example, a global financial firm automated 70-80% of its process for making slide decks accessible – a process that initially took more than 9 hours per deck.

Bridging the gap

Today, technology leaders have many tools available to help make their customer and employee experiences more accessible and enjoyable. From how technology is designed to how information is consumed, if tech leaders step up they can play a starring role in creating an environment where all employees and customers can participate fully and without barriers.

For more information about how to make PDFs more accessible, read Adobe’s checklist for PDF accessibility.

Employee Experience

https://www.cio.com/article/641788/start-with-digital-documents-to-make-your-workplace-more-accessible.html
Unbiased third-party testing is critical for network security Tue, 20 Jun 2023 14:11:26 +0000

Today, CIO and CISO teams are tasked with multiple business-critical initiatives like securing and connecting work-from-anywhere employees, moving applications to the edge or the cloud, and securing operational technology (OT) and IT environments. At the same time, the threat landscape continues to evolve and cyber risk is escalating for all organizations. Cybercriminals are finding new ways to weaponize technologies at scale to cause more disruption and destruction. And they’re spending more time on reconnaissance to evade detection, intelligence, and controls.

As cyber risk continues to escalate, CIOs and CISOs need to be just as nimble and methodical as their adversaries.

Determining how to provide adaptive and comprehensive protection against today’s evolving threat landscape is complex. Cybersecurity products like next-generation firewalls, single-vendor secure access service edge (SASE), and Zero Trust Network Access (ZTNA) are the best way to protect enterprise data and employees. But with so many vendors to choose from, as well as layers of marketing hype, footnoted claims, and qualified conditions, it’s not surprising that people get confused about selecting the right cybersecurity solutions for their business.

Choosing a solution is challenging enough, but then after it’s deployed, if the product doesn’t meet the promised claims, it leads to trust issues and frustration. And when you think you’ve purchased a proven and reputable security solution and it doesn’t deliver, the results can be catastrophic.

The good news is that there are objective sources of information that can help organizations make more informed purchasing decisions. Third-party testing and validation can help CIOs find security products that do what they say they do and meet the specific infrastructure needs of their organization.

Third-party testing and validation

Unbiased, third-party testing involves evaluation by qualified, independent researchers with data-driven guidance to help organizations select effective security across a broad spectrum of solutions. Because organizations often don’t have the time or resources to do in-depth testing on their own, third-party testing gives them objective data to make informed decisions about the products they need to protect their critical assets. 

Common cybersecurity product testing issues

Cybersecurity products and services are specific to the needs of an organization’s rapidly changing environment, and testing often doesn’t properly cover new and emerging issues. Even worse, some technology testing firms still allow vendors to manipulate their methodologies to skew the test results in their favor. Because industry tests often lack standardized measurement criteria, the results can vary wildly. It’s impossible to accurately compare solutions from different vendors when the tests don’t have the same parameters. 

Why third-party tests are different

Legitimate third-party testing companies are disincentivized from inflating their results because their professional reputations are directly tied to the quantifiable reliability of the tests they conduct. And because third-party testing companies aren’t influenced by vendors, their testing may expose weaknesses in a solution that the vendor wants to obscure.

Independent testing is also the only way for customers to accurately cross-compare solutions from different vendors because the testing measures performance across the same environmental and security challenges for an “apples-to-apples” comparison. 

With a good independent third-party test, organizations can qualify products not only in the context of their networks but also against the rapid changes in the threat landscape. 
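The “apples-to-apples” idea can be made concrete: score every product against the same fixed attack set, so the denominators match and block rates are directly comparable. The attack IDs and per-vendor results below are invented purely for illustration:

```python
# Illustrative apples-to-apples scoring: every vendor is measured against
# the SAME attack set, so effectiveness figures are directly comparable.
# All attack IDs and results here are invented for illustration.

def effectiveness(blocked_by_vendor, attack_set):
    """Return each vendor's block rate over one shared attack set."""
    total = len(attack_set)
    return {
        vendor: len(blocked & attack_set) / total
        for vendor, blocked in blocked_by_vendor.items()
    }

attacks = {"cve-1", "cve-2", "cve-3", "cve-4"}
results = effectiveness(
    {"vendor_a": {"cve-1", "cve-2", "cve-3"},
     "vendor_b": {"cve-1", "cve-2", "cve-3", "cve-4"}},
    attacks,
)
```

When two vendors are scored against different attack sets – as happens with vendor-influenced methodologies – the resulting percentages simply cannot be compared this way.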

Selecting a third-party testing company

Not all third-party testing companies are created equal. They each measure different criteria or have different objectives. Some are granular and others are broad. The testing company you select needs to ensure its tests measure the criteria most critical to your organization. Be sure to select a third-party testing company that is open about its methodologies and replicates your organization’s environment and challenges as closely as possible.

Unbiased, ethical testing

A few organizations perform comparative testing and reporting on how different products measure up under real-world conditions. CyberRatings.org, for example, has stepped in to conduct ethical testing without vendor influence and manipulation. In the wake of the closing of the independent testing organization NSS Labs in 2020, CyberRatings.org is also now the custodian of previous NSS Labs results.

The type of competitive benchmarking, certification, and validation performed by companies like CyberRatings.org provides open and transparent industry information that levels the playing field. Unbiased testing is critical to the health and future of the cybersecurity and networking markets not only because it provides clarity to customers but also because of the value it drives for the companies whose products are tested. 

Testing that’s free from meddling can help incentivize vendors to release the best possible cybersecurity solutions, and for customers, a vendor’s lack of participation can serve as an important red flag. 

A commitment to independent testing and validation

For years, Fortinet has been committed to independent testing and validation. Rigorous and reputable outside evaluation is critical to raising the bar for the security industry as a whole and helps ensure that our customers can make informed buying decisions.

To that end, we participated in the latest CyberRatings.org test for Enterprise Firewall, and the Fortinet FortiGate 600F next-generation firewall received CyberRatings.org’s “Recommended” rating. FortiGate earned the highest AAA score in the threat prevention, SSL/TLS functionality, stability and reliability, and routing and access control testing categories, with a 99.88% security effectiveness rating. These results highlight the effectiveness of the solution’s artificial intelligence, machine learning, and threat intelligence capabilities and underscore the fact that FortiGate has the industry’s highest return on investment (ROI).

Learn more about the Fortinet FortiGate or download the full CyberRatings.org 2023 Enterprise Firewall report to read the results.

Security

https://www.cio.com/article/642695/unbiased-third-party-testing-is-critical-for-network-security.html
Getting ahead of cyberattacks with a DevSecOps approach to web application security Tue, 20 Jun 2023 13:42:34 +0000

Web applications are foundational to a company’s business and brand identity, yet they are highly vulnerable to digital attacks and cybercriminals. As such, it’s vital to have a robust and forward-leaning approach to web application security. With an estimated market size of $30 billion by 2030, the term “application security” takes on numerous forms, but one area of heightened relevance in today’s world is the DevSecOps space.

While the idea of building security into software development has roots going back decades, the adoption of DevSecOps across the IT and infosec landscape has become much more prominent as the world has become more interconnected and “app-focused.” According to GitLab’s 2023 Global DevSecOps Report, 56% of organizations report using DevOps or DevSecOps methodologies, up roughly 10% from 2022, citing improved security, higher developer velocity, cost and time savings, and better collaboration.

What is DevSecOps?

DevSecOps describes the integration of security practices into the DevOps and application development processes. DevSecOps seeks to build security into applications, not just around them. DevOps is a methodology that focuses on collaboration between development and operations teams to create, test, and deploy software quickly and efficiently. By integrating security practices into the DevOps process, DevSecOps aims to ensure that security is an integral part of the software development life cycle (SDLC).

Benefits of DevSecOps

Identify vulnerabilities early: DevSecOps processes help to identify security vulnerabilities early in the software development process. GitLab’s report found that 71% of security professionals reported that at least a quarter of all security vulnerabilities are being spotted by developers, up from 53% in 2022, by incorporating this approach.

Grow budget and reputation: By integrating security testing into the development cycle, developers can identify and fix security issues before they become costly and damage the brand. According to IBM, a single data breach costs an average US business $9.4 million. Because modern applications draw from a wide array of open-source and commercial tools and libraries with varying degrees of vulnerability (published and unpublished) – as the high-profile Apache Struts, Spring4Shell, and Log4j exploits showed – it’s critical that a well-defined security process be implemented in the SDLC to avoid supply-chain compromise.
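One low-cost step toward such a process is checking pinned dependencies against a known-vulnerability list on every build. The advisory data below is an invented stand-in for a real feed such as OSV or the GitHub Advisory Database, and real scanners match version *ranges* rather than exact pins:

```python
# Illustrative supply-chain check: flag pinned dependencies whose exact
# version appears in an advisory list. Real tooling matches version
# ranges; this sketch matches exact pins, and the advisory data is invented.

def find_vulnerable(pinned, advisories):
    """Return {package: version} for pins that appear in the advisories."""
    return {
        pkg: ver
        for pkg, ver in pinned.items()
        if ver in advisories.get(pkg, set())
    }

flagged = find_vulnerable(
    {"log4j-core": "2.14.1", "spring-beans": "5.3.20"},
    {"log4j-core": {"2.14.0", "2.14.1"},   # Log4Shell-era releases
     "spring-beans": {"5.3.17"}},          # Spring4Shell-era release
)
```

Wired into CI as a failing check, even a crude gate like this surfaces a vulnerable pin before it ships, rather than after an incident.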

Release faster with confidence: By making security a default part of the DevOps process, teams can ensure that security is not overlooked or forgotten in the rush to deliver software quickly. Traditionally, application testing was implemented during the last phases of development, before being sent to security teams. If an application did not meet quality standards, did not function properly, or otherwise failed to meet requirements, it would be sent back into development for additional changes. This caused significant bottlenecks in the SDLC and was not conducive to DevOps methodologies, which emphasize development velocity. 

By integrating security testing into the development cycle and working closely with the development teams, often other bugs and defects that may impact the quality of the software can be found. Nearly 74% of security professionals said their organizations have either shifted security into the earlier stages of development or plan to in the next three years. 

Implementing DevSecOps

Building an effective security program around software development in an organization is often less about the specific tools that are used and more about culture and process. Selecting amongst various Static and Dynamic Application Security Testing (SAST/DAST) tools is typically the purview of the DevSecOps team, just as development teams typically control their CI/CD and IDE tooling. 

While it’s important to choose the right tools that will deliver the most benefit, it’s critical to ensure that the right processes are set up to ensure collaboration and compliance. Friction can occur where traditional infosec teams operate solely with a “red team” mindset that relies on scanning or discovery alone to call out problems. DevSecOps teams, however, should be invested in mitigation as well and should assist with remediating their findings. Not only does this help break down team silos by fostering better collaboration, but understanding the mitigation efforts also means that the infosec or DevSecOps teams better understand the impact of their findings.

As an example, an automated scan may produce a result that shows a vulnerability in a particular piece of code or software package. But if the security team doesn’t have the proper context about how and where the code or package is used, it limits their ability to help with remediation, and adds to a developer’s workload – plus slows dev teams’ velocity. Efficient workflows come when one team can identify system weaknesses, launch test attacks, conduct vulnerability scans, and implement a stronger defense system. Effectively, one team can play the red and blue team role, gaining buy-in from the development team while allowing the DevSecOps teams to ship code faster while still adhering to the proper security protocols.

Other best practices of DevSecOps include incorporating threat modeling into the process. Popular threat models and kill chains that have demonstrated effectiveness over time include the STRIDE framework and the MITRE ATT&CK matrix. In the web application space, a cloud- or CDN-delivered advanced Web Application & API Protection (WAAP) solution, such as Edgio’s, enables organizations to perform virtual patching for back-end systems that have underlying vulnerabilities or that may take time to fix or upgrade.

For organizations that are new to embracing DevSecOps in their processes, starting small with a pilot project is often the best approach. While the multitude of automated tools and scanners are effective at identifying potential vulnerabilities, having similar automated methods of tracking and closing issues and providing measurability is equally important in reducing overhead and friction with development teams.
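Measurability can start small: a finding tracker that computes the closure rate and mean time to remediate gives a pilot project concrete numbers to report. The record format below is hypothetical:

```python
# Illustrative remediation metrics over scanner findings.
# Each finding is an (opened_day, closed_day) pair in days since the pilot
# started; closed_day is None while the finding is open. Format is hypothetical.

def remediation_metrics(findings):
    """Return closure rate and mean time-to-remediate (days) over findings."""
    if not findings:
        return {"closure_rate": 0.0, "mttr_days": None}
    closed = [(o, c) for o, c in findings if c is not None]
    mttr = sum(c - o for o, c in closed) / len(closed) if closed else None
    return {"closure_rate": len(closed) / len(findings), "mttr_days": mttr}

metrics = remediation_metrics([(0, 3), (1, 5), (2, None), (4, 6)])
```

Tracking these two numbers over the pilot’s lifetime shows whether the DevSecOps process is actually reducing friction, not just producing more findings.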

Wrapping up 

DevSecOps is a valuable approach to identifying vulnerabilities early, releasing faster with confidence, and improving overall code quality. Effective implementation of DevSecOps requires the selection of appropriate tools, the establishment of a collaborative culture and compliance processes, and the incorporation of threat modeling. As organizations increasingly prioritize security in their software development, DevSecOps will continue to play an important role in ensuring the integrity and safety of software applications.

Edgio, a web application and API platform, makes it easy to build effective security into modern web applications, innovate faster and mitigate risks with unified alert management. Talk to an expert to implement DevSecOps into your business today.

Software Development

https://www.cio.com/article/641732/getting-ahead-of-cyberattacks-with-a-devsecops-approach-to-web-application-security.html