Utilities and Big Data

Big Data Jolts Utility Providers: Why Harnessing Big Data Will Help Companies Surge

The following article originally appeared in POWERGRID International, September 11, 2014.

Big Data is everywhere, and utility providers are creating more of it every single minute of the day. Delivering power is all about efficiency, which is why the energy industry is among those that have fully embraced the idea that Big Data can unlock never-before-known insights into how energy is distributed and consumed.

Applying analytics to Big Data is important, but in the rush to parse and dissect mountains of information, many organizations skip right over a practical but important step in the process. They fail to address the big problem created by Big Data: there is simply too much.

As the old saying goes, “you can’t have too much of a good thing,” but that’s just not true with data. Too much data is a drain on many important resources and necessitates a thoughtful Data Volume Management strategy to ensure a high level of system performance.

The Importance of Data Volume Management

Businesses without a Data Volume Management plan typically see three problems take hold. The first is the strain that existing processes put on the Information Technology budget: data grows quickly – after all, it is being created by every aspect of the operation all the time – and processing speeds grind to a halt, leaving users frustrated with how long it takes to search for data and run standard reports. The second is that as companies adopt innovative, data-intensive initiatives, such as smart grid programs, new and complex operational issues emerge that the company must learn to manage. The third is the time it takes for IT administrators to complete backups and restore instances: as data grows, systems get bigger and require more maintenance. Energy is a 24×7 business, and every minute a system is locked for maintenance has a direct impact on day-to-day operations.

These issues become even more problematic as organizations look to migrate to powerful in-memory systems, such as SAP® HANA. In-memory systems can process more data at greater speed than ever before to quickly return new analytics, but without a Data Volume Management strategy, their cost can grow so large that it becomes difficult for businesses to see a Return on Investment in a reasonable time. There is, however, a practical roadmap that energy providers and utility companies can use to evaluate their data strategy, whether they plan to continue on a legacy system or invest in an exciting platform such as SAP® HANA.

Determining Data’s Value

When implementing a data strategy, it’s important to remember that not all data is created equal. Some data yields a high level of business value while other data yields very little. It’s important that the cost of storing data is on par with the value it provides to the operation.

Mapping data to short-term and long-term business goals and legal requirements will clarify which data sets carry high value and which carry low value. Such an assessment will also determine where data originates, how it grows, how frequently it is used, and how it relates to other types of information. The end result is a blueprint that separates the most important data from static, less valuable information.
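
As a rough illustration of what such a blueprint might look like in practice, the sketch below scores a few hypothetical data sets by goal relevance, access frequency, and retention requirements. All names and thresholds are invented for this example; real classifications would come from the assessment described above.

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    accesses_per_month: int  # how often the business reads this data
    supports_goal: bool      # mapped to a short- or long-term business goal
    retention_years: int     # legal/compliance retention requirement

def business_value(ds: DataSet) -> str:
    """Assign an illustrative value class to a single data set."""
    if ds.supports_goal and ds.accesses_per_month > 50:
        return "high value"
    if ds.retention_years > 0:
        return "low value (retained for compliance)"
    return "low value"

# Build the blueprint: a simple map from data set to value class.
blueprint = {
    ds.name: business_value(ds)
    for ds in [
        DataSet("current_meter_reads", 400, True, 7),
        DataSet("closed_service_orders", 3, False, 10),
        DataSet("legacy_report_extracts", 0, False, 0),
    ]
}
print(blueprint)
```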

Storage Architecture Matters

Once the data blueprint is defined, a data storage architecture can be put in place to move data from online, to nearline, and then to archive storage as it matures and decreases in value to the business. Data deemed to have lower value – such as data that is infrequently accessed or stored only for compliance purposes – does not need to be kept in the online system. It can be relocated to a lower-cost storage tier through archiving, and users can still easily search for and retrieve it through the ERP system when needed. Eventually, when data reaches its end of life and has met its legal and compliance requirements, it can be purged.

More valuable, static data can be moved to nearline storage. Nearline storage allows for high-speed access to information while keeping the Total Cost of Ownership in line with budget projections. Only the most current, most valuable data needs to remain in the live ERP system.
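
One simple way to implement this movement is an age-based residence policy, sketched below. The thresholds and the routing function are hypothetical; in practice they would be derived from the data blueprint and the organization’s retention rules rather than fixed ages.

```python
from datetime import date

# Illustrative residence thresholds (in days); real values would come
# from the data blueprint and legal retention requirements.
NEARLINE_AFTER = 2 * 365   # static but still queried: move to nearline
ARCHIVE_AFTER = 5 * 365    # rarely accessed: move to low-cost archive
PURGE_AFTER = 10 * 365     # retention satisfied: eligible for deletion

def storage_tier(created: date, today: date) -> str:
    """Route a record to a storage tier based on its age."""
    age_days = (today - created).days
    if age_days >= PURGE_AFTER:
        return "purge"
    if age_days >= ARCHIVE_AFTER:
        return "archive"
    if age_days >= NEARLINE_AFTER:
        return "nearline"
    return "online"

print(storage_tier(date(2009, 3, 15), date(2014, 9, 11)))  # -> "archive"
```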

Benefits of Archiving – Case Study

This multi-tiered data storage model was recently used by one of the largest electric utility companies in the United States to improve its Return on Investment in technology as it migrated to next-generation SAP® HANA ECC and BW systems. The utility serves nearly 14 million people spread across 50,000 square miles of territory.

The company was intrigued by the promise of SAP® HANA to offer faster reporting, faster data loading times, reduced maintenance costs, and decreased development costs compared to its existing SAP systems. The ultimate goal was to gain analytical capabilities it could not implement on its current platform. However, the company was concerned about keeping the system’s Total Cost of Ownership in check. Specifically, the cost of HANA’s high-performance, “pay as you grow” in-memory storage meant the company had to develop a plan to control its database growth. It opted to partner with Dolphin to architect a solution that included data archiving and PBS Software’s nearline storage.

The end result was a full 50 percent reduction in database growth, as well as the ability to seamlessly execute queries that retrieve information from both HANA and nearline storage as appropriate. The calculated time to Return on Investment was reduced from more than 15 years to an impressive two and a half years.
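
To see how archiving shortens the payback period, consider the back-of-the-envelope calculation below. Every figure except the 50 percent growth reduction is invented for illustration; the article does not disclose the utility’s actual costs or database sizes.

```python
# All cost figures below are hypothetical; only the 50 percent growth
# reduction comes from the case study.
db_size_tb = 20.0             # assumed current in-memory footprint
annual_growth_tb = 4.0        # assumed uncontrolled yearly growth
cost_per_tb_year = 120_000.0  # assumed yearly cost of one TB in memory
project_cost = 2_750_000.0    # assumed one-time archiving project cost

growth_reduction = 0.5   # the 50 percent figure from the case study
archived_share = 0.4     # assumed share of existing data moved off HANA
nearline_discount = 0.9  # assumed saving per TB moved to nearline

annual_savings = (
    annual_growth_tb * growth_reduction * cost_per_tb_year  # avoided growth
    + db_size_tb * archived_share * cost_per_tb_year * nearline_discount
)
payback_years = project_cost / annual_savings
print(f"Payback period: {payback_years:.1f} years")  # ~2.5 with these inputs
```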

Continual Benchmarking

Once a data volume management strategy is in place, continual benchmarking ensures that data growth is managed, the system performs as expected, and IT costs remain low. Too frequently, organizations – both in and out of the utility industry – implement a data volume management strategy but fail to put a process in place to monitor and manage data growth over time. The initial success of mapping, filtering, and sorting the existing information – and watching processing speeds jump as terabytes of storage are freed up – creates a sense of accomplishment. However, strategies need to be regularly evaluated, updated, and improved as business tactics change and new analytics are introduced.

Benchmarks for data volume management should support the needs of a wide range of users, from the IT Manager to the CFO and CEO. High-level executives need to understand the impact that Big Data is having on the business today and will have in the future, so they can formulate effective business plans.

The IT team, on the other hand, needs data that will help it keep data volume growth under control, adjust budgets, and take corrective action before problems arise. This requirement for continuous monitoring underscores the need for a real-time Information Lifecycle Management dashboard that benchmarks data volume management initiatives not only against internal goals for processing speeds and reporting times, but also against the performance of other businesses. With this information at hand, companies can understand how Big Data is affecting the business over time and make an informed business case for continued investment in new processes and technology, as the small monitoring sketch below suggests.

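A very small version of such a monitoring check is sketched here: it projects growth from recent database size snapshots and warns before budgeted capacity is exhausted. The snapshot values, budget, and horizon are invented for illustration and are not from the article.

```python
from datetime import date

# Monthly database size snapshots in TB (illustrative values).
snapshots = {
    date(2014, 5, 1): 18.2,
    date(2014, 6, 1): 18.9,
    date(2014, 7, 1): 19.7,
    date(2014, 8, 1): 20.6,
}
budgeted_capacity_tb = 24.0
horizon_months = 6

sizes = [snapshots[d] for d in sorted(snapshots)]
# Average growth per month over the observed window.
monthly_growth = (sizes[-1] - sizes[0]) / (len(sizes) - 1)
projected = sizes[-1] + monthly_growth * horizon_months

if projected > budgeted_capacity_tb:
    print(f"Warning: projected {projected:.1f} TB exceeds the "
          f"{budgeted_capacity_tb} TB budget within {horizon_months} months")
```
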
Make Archiving a Priority

There is more data being created today than at any other time in history, and the cost of storing this information can outpace IT budgets and cause performance bottlenecks. Advancements in enterprise computing technology, with SAP® HANA leading the pack, represent a monumental leap forward in how fast data can be processed and analyzed, but this new technology does not eliminate the need to properly manage data – it only increases that need. As 2015 approaches, all companies – even those not yet ready to migrate to a next-generation platform – can benefit from developing a data volume management plan that makes peak performance and lower IT costs top priorities.

About the author:

Dr. Werner Hopf is CEO of Dolphin Enterprise Solutions Corp. He has more than 20 years of experience in the information technology industry, including 15 focused on SAP technologies. Dr. Hopf is a well-known expert in data volume management and data archiving strategies and solutions. To learn more, visit www.dolphin-corp.com.