Why Swarm Intelligence is a Smart Solution for Data Privacy: Q&A With Hewlett Packard Enterprise’s Mark Potter


“Swarm learning has powerful applications across autonomous vehicles, hospitals, and banks for which data privacy at the source is a requirement.”

Hewlett Packard Enterprise’s (HPE) chief technology officer, Mark Potter, shares his views on how swarm intelligence provides the framework for data privacy. He discusses the role of AI and blockchain in improving data privacy, and more.

At HPE, Potter is responsible for designing and developing business and data privacy strategies. He offers critical insights on blockchain and other emerging technologies for swarm learning. In this exclusive with Toolbox, Potter talks about the factors that drive technological innovation in organizations.

Key Takeaways From This Tech Talk Interview on Swarm Intelligence:

  • Top tips for companies to use swarm intelligence for data privacy and security
  • Best practices on swarm learning and memory-driven computing
  • Insights on how developers can improve their innovation skills for swarm learning

Here’s the Edited Transcript of the Interview with Potter and His Views on Swarm Intelligence:

Mark, to set the stage, tell us about your career path so far and what your role at Hewlett Packard Enterprise entails.

I currently serve as the Chief Technology Officer for HPE and the Director of Hewlett Packard Labs, which is the company’s advanced research organization. In this role, I work closely with the business and sales teams to set strategy, help incubate new businesses, and accelerate the company’s innovation agenda.
Prior to my current role, I served as the general manager over key business areas at Hewlett Packard for more than a decade, including the server and infrastructure software businesses.

Can you explain what swarm intelligence is and how engineers can learn from it to ensure data privacy?

Today’s approach to machine learning from new data collected by IoT and edge devices has a couple of significant limitations. The data is collected at the edge and selectively sent back to the hybrid cloud or data center, where machine learning analysis takes place. The first major limitation is the bandwidth required to move the data.

The second is that data privacy might not even allow the movement of the data to the cloud. A common example is autonomous vehicles attempting to send data back to the data center but grappling with thin bandwidth and intermittent connectivity. There’s also a very real issue around biased data creating biased models.

To help solve the problem, we are exploring new methods of enabling decentralized machine learning at the edge. With HPE’s swarm learning approach, we envision gathering edge devices to learn as a swarm on their own, cutting out the necessity for a central cloud-based coordinator for the model training process.

Privacy compliance is assured because in this model the data doesn’t move to the cloud for training; the training comes to the data at the edge. From a privacy standpoint, only incremental or group learnings are shared, not individual data. This technique also allows very diverse training data sets to be assembled easily, which allows us to reduce or eliminate bias in the final model.
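The idea of the training coming to the data can be sketched with a toy federated-style averaging loop. This is an illustrative sketch, not HPE’s actual swarm learning implementation: two nodes fit a shared linear model on their own private data and exchange only updated weights, never raw records. All function and variable names here are hypothetical.

```python
# Minimal sketch of swarm-style learning: each node trains on local,
# private data and shares only parameter updates, never raw records.
# The model is a toy one-parameter linear fit, y = w * x.

def local_step(w, data, lr=0.01):
    """One gradient step on this node's private (x, y) pairs."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def merge(weights):
    """Peers combine incremental learnings by simple averaging."""
    return sum(weights) / len(weights)

# Two nodes hold different private datasets, both drawn from y = 3x.
node_a = [(1.0, 3.0), (2.0, 6.0)]
node_b = [(3.0, 9.0), (4.0, 12.0)]

w = 0.0
for _ in range(200):
    # Each node refines the shared model on its own data...
    w_a = local_step(w, node_a)
    w_b = local_step(w, node_b)
    # ...and only the updated weights leave the node.
    w = merge([w_a, w_b])

print(round(w, 2))  # converges toward 3.0
```

Because only `w_a` and `w_b` cross node boundaries, the private datasets never leave their owners, yet the merged model benefits from both.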

How do AI and blockchain improve data privacy? What is the best framework for data engineers to follow?

Since a foundational component of our approach is the lack of central control, a private blockchain is used to coordinate the swarm and to ensure the continuous integrity of the model. Only incremental learnings are shared amongst members of the swarm, rather than individual data, so data transmission requirements are significantly reduced while data privacy is preserved. Only finalized models are sent back to the cloud or data center.
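To illustrate how a private ledger can guard model integrity without a central coordinator, here is a minimal hash-chained log. It is a sketch only, standing in for the private blockchain described above; every name in it is hypothetical rather than part of HPE’s actual stack.

```python
import hashlib
import json

# Toy hash-chained ledger: each block commits to the previous block
# and to a digest of that round's merged model, so any swarm member
# can verify that the model history has not been tampered with.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_round(chain, round_no, model_digest):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"round": round_no, "model": model_digest, "prev": prev})

def verify(chain):
    """Recompute each link; any altered block breaks the chain."""
    for i in range(1, len(chain)):
        if chain[i]["prev"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
for r in range(3):
    # In a real swarm, members would merge incremental learnings here
    # and record a digest of the resulting model.
    digest = hashlib.sha256(f"merged-model-round-{r}".encode()).hexdigest()
    append_round(chain, r, digest)

print(verify(chain))          # True: untampered history
chain[1]["model"] = "forged"  # a member altering a past round...
print(verify(chain))          # ...breaks the chain: False
```

The point is the property, not the mechanism: because each round’s digest is chained, no single member can silently rewrite what the swarm agreed on.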

Swarm learning has powerful applications across autonomous vehicles, hospitals and banks for which data privacy at the source is a requirement.

For example, imagine two hospitals. One sees a lot of tuberculosis cases; the other sees a lot of pneumonia patients but fewer with tuberculosis. Both use AI to improve diagnostic accuracy from radiography data, so their models will be biased toward the illness they see most often, yet they are forbidden to share data because legal barriers prevent the exchange of such information.

Swarm learning offers a solution: the data never has to leave the hospital, and the local models in each hospital share only the key attributes needed to improve patient outcomes for both conditions.

What are the 3 top tips for organizations to use blockchain for data privacy and security?

Developers in all industries, especially financial services, are now exploring modifications to applications based on blockchain technology to create a faster, more secure, and more transparent way to process and manage data. However, we have found that generic infrastructure and public clouds alone might not be the best environment for developing and testing new applications. Without proper testing, enterprises cannot confidently deploy new applications, especially for use with sensitive data such as financial and healthcare information. I recommend that companies consider the following approaches when evaluating blockchain or any other emerging technology:

  • Current blockchain platforms are unproven for mission-critical applications, but that will soon change. In the meantime, companies should think through potential use cases and start experimenting.
  • For companies considering blockchain applications, it’s crucial to establish a proof of value, a working model that can help validate the business value. A proof of value will also inform the discussions about data sovereignty and other requirements, which will drive decisions about infrastructure in the future.
  • Because blockchain is an emerging technology, organizations often express concerns about internal blockchain knowledge and developer talent. Before committing to a blockchain implementation, organizations should determine the skills required to develop smart contracts or distributed applications.

Learn More: How the Role of the CTO Influences Digital Transformation: Q&A With Deloitte’s Bill Briggs

You’ve said, “We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society.” What are the 3 takeaways for developers from this insight?

Memory-Driven Computing puts data at the center of future computer architectures and unlocks innovation with faster and more efficient solutions to accelerate insights from that data. Memory-Driven Computing brings together new applications for photonics, memory, CPUs, the next generation of workload accelerators, and software to create not just a new machine, but a whole new architecture for data-optimized computing. The discoveries that Memory-Driven Computing will make possible are profound, and we want to bring programmers on that journey with us.

This new architecture offers developers:

  1. Enormous performance boosts for certain workloads by the reduction or elimination of data-flow bottlenecks.
  2. Ability to easily tap into Composable Infrastructure that is efficient and workload-optimized.
  3. Simpler application development: removing layers of complexity, giving access to dramatically larger pools of memory, and unlocking the performance of the right workload accelerators for the task at hand.

For people who are interested in exploring Memory-Driven Computing and large pools of persistent memory for themselves, we have built a developer toolkit.

To simplify the developer experience, we are exposing familiar programming environments like Linux and Portable Operating System Interface APIs with programming languages like C/C++ and Java, making the performance advantages of massive memory on fabrics available quickly for developers.

We also have our cloud-based Memory-Driven Computing Sandbox, where customers’ advanced development teams can perform full-throttle experiments to address their most intractable problems.

Learn More: Cloud Security Basics: AWS, Google Cloud, IBM Cloud, Microsoft Azure & Oracle

What are the top 3 factors that can drive technological innovation in organizations? How can our readers work to improve their innovation skills?

For the last half century and more, Hewlett Packard Labs has been charged with fueling engineering innovation. Labs focuses on placing long-term bets that will propel us forward. As the leader of that organization, I’ve made a few observations about innovation over the years:
  • Innovation should be big and aspirational. At Labs, we aim to solve problems others deem to be out of reach, continually testing ourselves in the process.
  • Innovation should have a higher purpose. One must have a core belief in the journey undertaken; at Labs, that mission-based belief is our true north, a direction anchored by diverse ideas and inputs as we follow a serpentine path to the truth. Every day is about learning and taking a new step forward.
  • We know this path to innovation, or a breakthrough, isn’t always predictable. Some say Fleming discovered penicillin by mistake, or that Arthur Fry’s Post-It Note was a lucky invention. Neither is true. Fleming spent thousands of hours filling hundreds of petri dishes with experiments before unearthing the wonder of penicillin. And the Post-It Note was the byproduct of Fry’s countless attempts to uncover the impossible. Many times, innovations are the unexpected byproduct of our explorations.

Memory-Driven Computing falls into the latter category. As we built bigger and bigger computers, we were paralyzed by the law of diminishing returns. The same architecture has been in place for decades, and we needed a change to optimize for a world where everything is connected, intelligence is everywhere, and data growth is exploding. So, we learned from but ultimately broke free of 70 years’ worth of incrementalism, and applied the lessons learned and insights gleaned to ignite a new era of open innovation that places data at the heart of the next computer architecture.

Neha: Thank you, Mark, for sharing your invaluable insights on what swarm intelligence is. We hope to talk to you again soon.

About Mark Potter:

Mark Potter is the Chief Technology Officer (CTO) for Hewlett Packard Enterprise and the Director of Hewlett Packard Labs, the company’s advanced research organization. Prior to his current role, Potter served as the general manager over key business areas at HP for over a decade. Potter currently serves on several company and non-profit boards.

About Hewlett Packard Enterprise:

Hewlett Packard Enterprise is a global technology leader focused on developing intelligent solutions that allow customers to capture, analyze and act upon data seamlessly from edge to cloud. HPE enables customers to accelerate business outcomes by driving new business models, creating new customer and employee experiences, and increasing operational efficiency today and into the future.

About Tech Talk:

Tech Talk is a Toolbox Interview Series with notable CTOs from around the world. Join us to share your insights and research on where technology and data are heading in the future. This interview series focuses on integrated solutions, research and best practices in the day-to-day work of the tech world.

Would you like to share your thoughts about the future of swarm intelligence? Find us on Twitter, Facebook, and LinkedIn. We’d love to hear from you!