Data Security Archives | HealthTech Magazines

Ransomware Preparedness for Healthcare: Enhancing Resilience Amid Growing Threats (13 Oct 2023)


By Dilip Nath, AVP & Deputy CIO, SUNY Downstate Health Sciences University

Ransomware attacks have become a significant problem in the healthcare sector. These criminal operations are a formidable adversary that demands a concerted response from healthcare organizations. Recent high-profile cyber-attacks on prominent healthcare institutions have underscored the critical need to tighten cyber security across the industry. Proactive effort is required because ransomware attacks not only endanger patient data but also considerably raise the risk to patient care.

Preparing for Ransomware Attacks

In view of the increasing threat of ransomware, healthcare organizations must take a proactive approach to preparation. Thorough risk assessments are a critical component of this planning and form the core of any ransomware mitigation strategy. At this stage, organizations carefully identify system flaws and analyze the risks associated with them. By accurately identifying potential vulnerabilities in their cyber security architecture, organizations can build a plan for effective mitigation measures.

Developing a robust response strategy is also critical to ransomware preparedness. It serves as a compass that guides a healthcare institution through the turbulent waters of an attack, laying out precise, logical steps to take in the event of an incident. This action plan includes procedures for isolating infected systems, an efficient method for reporting to law enforcement, and an effective approach to communicating with patients and staff. The importance of a well-organized response plan is impossible to overstate: it ensures a coordinated and effective response when time is of the essence.

Furthermore, the value of the human element in cyber security cannot be overstated. Employee training initiatives are a key priority for healthcare organizations to equip their first line of defense. These courses give healthcare personnel the knowledge and skills to recognize ransomware threats. Employees are trained to identify phishing emails, which are frequently used as entry points for ransomware attacks, and to report any suspicious activity immediately.

Last but not least, proactive security deployment is critical for mitigating ransomware attacks. This requires reliable technologies such as firewalls, antivirus software, and intrusion detection systems. These layers of defense improve detection and mitigation, making it more difficult for attackers to infiltrate the system.

Responding to Ransomware Attacks

In the unfortunate event of a ransomware attack, a rapid and well-planned response is critical to reducing damage and regaining control.

Isolating Infected Systems

The first line of defense is to isolate infected systems as quickly as possible. This precaution prevents ransomware from spreading throughout the network. By isolating the compromised systems, healthcare institutions can limit the attack’s reach and prevent further data compromise. Isolation is the first step in regaining control of the situation.

Collaborating with Law Enforcement

Cooperation with law enforcement is critical when responding to ransomware attacks. Their expertise and resources aid in investigating the incident, mitigating its impact, and tracking the cyber criminals responsible, which strengthens the overall response, supports the pursuit of justice, and helps prevent future attacks.

Effective Communication with Stakeholders

Managing the aftermath of a ransomware attack requires open communication and fast information sharing. Personnel and patients must be informed as soon as possible about the incident’s impact on data security and medical services. Maintaining confidence, managing expectations, and ensuring a coordinated response all help reduce the overall impact.

Data Restoration from Backups

Reliable data backup and recovery are critical for mitigating the effects of ransomware. They enable data restoration so that care can continue and long-term impacts are minimized. Up-to-date backups serve as a safety net, allowing recovery without giving in to attackers’ demands and, ultimately, resuming normal operations.
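As a concrete illustration of the kind of safety net backups provide, the minimal sketch below verifies a set of backup files against a stored manifest of SHA-256 digests before a restore is attempted. The file names, manifest format, and paths are hypothetical and are not drawn from any specific organization’s tooling.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backups(backup_dir: str, manifest: dict[str, str]) -> list[str]:
    """Return backup files whose current digest no longer matches the manifest."""
    mismatches = []
    for name, expected in manifest.items():
        candidate = Path(backup_dir) / name
        if not candidate.exists() or sha256_of(candidate) != expected:
            mismatches.append(name)
    return mismatches

# Hypothetical usage: the manifest would be written at backup time and stored off-site.
# print(verify_backups("/backups/2023-10-01", {"ehr_dump.sql.gz": "ab12..."}))
```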

Balancing Innovation and Security

While technologies such as generative AI and data modernization hold immense potential, it is critical that cyber security is not compromised in the process. Given the rapid pace of technological advancement, the healthcare industry cannot afford to be complacent about security. Finding this balance matters because it allows healthcare organizations to adopt emerging technology while ensuring the availability, confidentiality, and integrity of critical data and services.

In this day and age, healthcare must emphasize effective cyber security. This requires cutting-edge technology, thorough risk analysis, and stringent standards. Improving cyber security while embracing technology makes it feasible to leverage innovation without accepting unnecessary risk. This balance protects patient information, maintains trust, and ensures the continuity of healthcare services.


Strategies for Ransomware Preparedness

A comprehensive strategy for ransomware preparation includes several critical tactics:

  • Rigorous Risk Assessment: Identifying vulnerabilities and threats via rigorous assessments is the cornerstone of resilience.
  • Effective Response Planning: A detailed response plan prepares the organization for a ransomware event and should be developed, tested, and maintained regularly.
  • Employee Education: Ongoing training programs enable employees to proactively spot and resolve hazards.
  • Robust Security Infrastructure: Investing in cutting-edge security practices and technology builds strong protection against cyber threats.
  • Patient-Centric Approach: Maintaining patient trust and resolving patient concerns about data security are crucial in the healthcare sector.
  • Continuous Improvement: Frequently reviewing protection, detection, response, and recovery capabilities keeps defenses improving over time.

Healthcare facilities are increasingly vulnerable to ransomware attacks, a major problem that must be addressed immediately. Proactive action must be taken to build defenses against these threats in order to protect patient care and public health. By conducting thorough risk assessments, developing specific response plans, educating employees, implementing advanced security measures, addressing patient concerns, and embracing technology while fortifying cyber security, healthcare organizations can successfully prepare for and respond to ransomware threats. These strategies are critical to ensuring that the healthcare industry remains a reliable guardian of patients’ well-being in the face of evolving cyber threats.

The Case for the Healthcare Data Scientist (27 Jun 2023)


By Chris Kelly, Associate CMIO for Data and Analytics, MultiCare Health System

No one in healthcare will forget March 2020, staring down the worst pandemic in living memory. Society shut down. People started dying at unheard-of rates in Italy, and shortly afterward in New York City. MultiCare Health System, an 11-hospital system in the Pacific Northwest, near where the first US cases were reported, needed to know what to expect.

The data science team at MultiCare addressed the problem by modeling case increases in the communities we serve, first with exponential growth models, but within two weeks, we realized logistic (S-shaped) growth models fit the data better. We were not hit hard in that first wave, and we have continued to model Covid cases across our system through subsequent waves, giving advance notice of a surge and providing our leaders with data-driven insights.
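The kind of model switch described above can be sketched in a few lines. The example below fits a logistic (S-shaped) curve to synthetic cumulative case counts with SciPy; the data, parameter values, and starting guesses are purely illustrative and do not reproduce MultiCare’s actual model.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic (S-shaped) growth: K = plateau, r = growth rate, t0 = midpoint."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Synthetic cumulative case counts, for illustration only.
rng = np.random.default_rng(0)
days = np.arange(0, 30)
observed = logistic(days, K=500, r=0.3, t0=15) + rng.normal(0, 10, size=days.size)

# Fit the three parameters; p0 gives rough starting guesses so the optimizer converges.
params, _ = curve_fit(logistic, days, observed, p0=[observed.max(), 0.2, days.mean()])
K_hat, r_hat, t0_hat = params
print(f"Projected plateau ~{K_hat:.0f} cases, inflection near day {t0_hat:.1f}")
```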

The future of healthcare is big data. But big data by itself just sits in an enterprise data warehouse and runs up storage fees. Healthcare data scientists are essential in moving beyond reports and static dashboards. Data needs to be turned into actionable intelligence for a healthcare system to benefit.

Healthcare data scientists can help an organization get the greatest return on their data infrastructure investment.

Endless Opportunity. The problems that can be addressed with data science are essentially endless. Will a patient be readmitted after discharge? Which patients will have a prolonged hospital course? Can we identify those patients who will develop sepsis earlier and start lifesaving treatment sooner? These problems are readily amenable to predictive modeling and are already commonly deployed in hospitals across the country.
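As a rough illustration of how such a prediction problem is typically framed, the sketch below trains a simple readmission classifier on synthetic data. The features, coefficients, and sample sizes are invented for demonstration and do not reflect any deployed hospital model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5000

# Synthetic features: age, prior admissions in the last year, length of stay (days).
X = np.column_stack([
    rng.normal(65, 15, n),   # age
    rng.poisson(1.0, n),     # prior admissions
    rng.gamma(2.0, 2.0, n),  # length of stay
])
# Synthetic label: readmission risk loosely tied to the features.
logit = -4 + 0.02 * X[:, 0] + 0.6 * X[:, 1] + 0.1 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out AUROC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```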

This is just the leading edge of what predictive modeling can do. Many systems are large enough to provide comprehensive datasets on a wide range of patients and conditions. Each disease can be approached using predictive modeling. For many common ailments, a patient’s journey can be mapped along a pathway, with each node in the pathway representing a decision point. It is not hard to imagine a future where dozens, even hundreds of pathways guide a patient’s care, with each downstream step in the journey modeled, and the optimal course of action presented for each individual.

Healthcare is undergoing a generational change as we move away from fee-for-service and towards shared savings and population-based care. Therefore, the need for accurate predictions will increase: who is most likely to be admitted to the hospital? Who will benefit from an intervention to keep them out of the emergency room? Should that intervention be an additional visit with their primary care doctor, transportation assistance to a specialist or another intervention, like a home health visit?

And, as great as the opportunities are in clinical care, as data becomes more and more available, improving a healthcare system’s operational processes may have just as much potential.

Business Analyst or Data Scientist? Do we really need data scientists? A healthcare system’s core competency will always be healthcare delivery; isn’t an analyst enough?

There are a number of steps involved in turning data into a true understanding of the problem. Querying data is often surprisingly challenging: the databases of some electronic medical records (EMR) are composed of more than 20,000 unique tables. Data needs to be aggregated, its quality assessed, and the results presented in a way end users can understand. Advanced analytics includes machine learning (ML), forecasting, cluster analysis, and hypothesis testing. These skills usually require an advanced degree, although not necessarily these days given the availability of online training. With these abilities, a data scientist can provide insight beyond what you can gather from a dashboard.

The Role of the Vendor. Can’t advanced analytics just be purchased from specialized companies? Certainly, the level of sophistication needed to develop deep learning algorithms is not something many healthcare systems will be able to support. Natural Language Processing (NLP) in particular has made a lot of progress in the last few years and will soon be pulling knowledge out of free text. Third party vendors will be helpful here, but fully realizing their potential will require people who understand both the algorithm and the use cases. 

In a larger sense, data science can help an organization develop insight long before it gets to the level of an RFP. Many people in healthcare have deep knowledge about esoteric fields. They may want to explore a hunch with genuine financial and clinical implications. Having access to a data science team, people who leaders know personally and can connect with to talk through a problem, can make all the difference.

Additionally, healthcare organizations need to develop the sophistication to evaluate a third party’s claims. Does a purchased model accurately predict what end users think it does? Even models published in peer-reviewed, academic journals cannot be assumed to be accurate on a specific organization’s population. A model needs to perform for an entire population, but also needs to be evaluated for bias on the vulnerable groups a system cares for.
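One way such an evaluation might look in practice is sketched below: compute the discrimination of a purchased risk score overall and within each subgroup of interest, then compare. The column names (vendor_score, readmitted, payer_class) are hypothetical placeholders, and a real validation would look at calibration and clinical impact as well.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score

def auroc_by_group(df: pd.DataFrame, score_col: str, label_col: str, group_col: str) -> pd.Series:
    """AUROC of a vendor risk score, overall and for each subgroup of interest."""
    overall = roc_auc_score(df[label_col], df[score_col])
    by_group = df.groupby(group_col).apply(
        lambda g: roc_auc_score(g[label_col], g[score_col])  # each group needs both outcomes
    )
    return pd.concat([pd.Series({"overall": overall}), by_group])

# Hypothetical columns: 'vendor_score' from the purchased model, 'readmitted' observed
# locally, and 'payer_class' (or language, race, etc.) as the subgroup to audit.
# print(auroc_by_group(local_validation_df, "vendor_score", "readmitted", "payer_class"))
```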

The Clinician Data Scientist? Two decades ago, it was hard to imagine that we would have doctors who specialize not just in patient care but in optimizing the use of the electronic medical record. Now, over two thousand doctors are board certified in clinical informatics, with hundreds more becoming certified every year. The value proposition for clinician data scientists may be even greater.

Big Data and Data Science are Essential to the Future of Healthcare. Data science is not an add-on, but a process to integrate into healthcare decision-making. As organizations make bigger and bigger investments in enterprise data warehouses and data aggregation platforms, healthcare data scientists are the best way to assure return on that investment.

Transitioning to Patient-owned Data (23 Jun 2023)


By Dustin Hufford, SVP & CIO, Cooper University Health Care

Healthcare’s primary problem is not the lack of data, but the lack of fidelity and usability of the available data. The challenge is collating, interpreting, and distilling data into a usable state and getting those data to the right person at the right time. Additionally, healthcare data is unwieldy and fractured, and it continues to fracture as more care channels and options emerge. New channels generate new data silos that make safe, effective care difficult. Compounding this issue, the costs of health insurance, care delivery, and medications continue to rise, placing adequate healthcare out of reach for many. As a result, the quality of care across the US continues to lag significantly behind that of other countries.


Also, the US healthcare system is experiencing an unprecedented period of change, brought on by industry pressures that make an already complex system more cumbersome and perilous.

These pressures include: 

  • Dramatic shifts in consumer expectations: Younger consumers are not satisfied with traditional healthcare, and consumers of all age groups are more willing than ever to try non-traditional services. Millennials and Gen Z, who make up 42% of the population and account for 21% of healthcare services, expect convenience, affordability, transparency, and quality, and are redefining how they engage in every stage of their care.

  • Fragmentation: The delivery of care through established service paths (e.g., doctors, clinics, and medical centers) must now compete with non-traditional service paths that represent emerging types of service delivery (e.g., walk-in or retail clinics, outpatient surgery hospitals, virtual health, on-demand services, in-home services, or digital therapeutics).

  • Increased regulatory pressures: The burden of new and existing laws regulating healthcare—such as HITECH, HIPAA, ACA, FDASIA, and MACRA/MIPS—affects providers by increasing their administrative load and by adding or increasing penalties for services that do not meet a set of prescribed quality, interoperability, and performance criteria. These burdens slow the delivery of care and reduce patient interface time with doctors and their clinical staff, alongside a host of other factors that can negatively impact care delivery, patient outcomes, and provider reimbursement.

  • Hyper-specialization as the knowledge about diseases accelerates: Medical research continues to reveal the complexity behind disease causes and treatments. As research unravels the genome, microbiome, and proteome, referred to as multiomics, to understand their role in health and wellness, physicians become more specialized to turn discoveries into better outcomes for patients.

Data is duplicated and conflicting due to issues with standards

Because most health record systems do not consolidate information, numerous patient- and provider-reported health records result in duplication, retention of outdated information, and room for error. Payer data also often inaccurately reflects patient care and the services provided because of the complicated nature of billing practices. Often, to ease workflow, patient services are billed using a short list of memorized codes or the first code that populates a search, resulting in loss of fidelity. As a result, providers do not uniformly have access to accurate reference records, which creates an overwhelming burden on providers trying to find the information needed to make recommendations.

Projects to enable interoperability are costly and time-consuming

Traditional data transformation and sharing methods are complex, and deduplicating the data with any precision is time-consuming and risky. Important changes in a patient’s record can take weeks or even months to surface because data integration does not happen in near real time. There are existing methods for sharing data more cleanly between like EMRs, but even then there are issues reconciling data due to differences in system setup (field x in system A holds blood pressure, whereas it is field y in system B).
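A toy sketch of that “field x versus field y” problem is shown below: each source system’s field names are mapped onto one shared schema before records are merged. The field names and mapping are invented for illustration; real reconciliation also involves units, code systems, and timing.

```python
# Map each source system's field names onto one shared schema before merging.
FIELD_MAP = {
    "system_a": {"bp_sys": "systolic_bp", "bp_dia": "diastolic_bp", "dob": "birth_date"},
    "system_b": {"sbp": "systolic_bp", "dbp": "diastolic_bp", "date_of_birth": "birth_date"},
}

def normalize(record: dict, source: str) -> dict:
    """Rename source-specific fields to the shared schema; drop anything unmapped."""
    mapping = FIELD_MAP[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

a = normalize({"bp_sys": 128, "bp_dia": 82, "dob": "1960-04-12"}, "system_a")
b = normalize({"sbp": 131, "dbp": 84, "date_of_birth": "1960-04-12"}, "system_b")
print(a, b)  # both records now use the same field names
```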

The patient is never in control of their data

Patients are the reason healthcare exists, and yet, they are rarely centered in their information or care. They have little control over their data and, in most cases, have no concept of how the data is used or where it is shared. Also, all too often, patient-provided information fails to be integrated into the patient’s record, thereby ignoring critical pieces of information. By ignoring the patient as a vital part of healthcare and its interoperability, data sharing and cleansing become complex, diminishing healthcare providers’ ability to make data-based diagnosis and treatment decisions.

What can and should be done?

We should strive for a single, golden record for every person on Earth that is updated in real-time as changes happen and allows for notification of significant events to be delivered to the right person at the right time. And that record should be owned by the individual, not the system.

There have been many barriers to this in the past, but the most significant challenge has been to uniquely identify each person and all of the entities and assets they interact with.

Luckily, technologies and tools now emerging on the market can systematically address these issues through AI and machine learning. Tasks once considered nearly impossible, like merging 20 medical records and distilling the information down to a single record, can now be done at scale, with the patient owning the overall outcome.
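A greatly simplified sketch of that merge step appears below: records believed to belong to the same person are grouped and collapsed, keeping the most recently updated value for each field. The matching rule here (normalized name plus birth date) is only a stand-in; production systems rely on probabilistic linkage and, as noted above, AI-assisted resolution.

```python
from collections import defaultdict

def merge_records(records: list[dict], match_key) -> list[dict]:
    """Group records the match function assigns to one person, then collapse each
    group, preferring the most recently updated value for every field."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)

    merged = []
    for recs in groups.values():
        recs.sort(key=lambda r: r.get("updated_at", ""))  # oldest first, newest wins
        combined = {}
        for rec in recs:
            combined.update({k: v for k, v in rec.items() if v not in (None, "")})
        merged.append(combined)
    return merged

# Hypothetical match on normalized name + birth date.
key = lambda r: (r.get("name", "").lower(), r.get("birth_date"))
print(merge_records(
    [{"name": "Ann Lee", "birth_date": "1970-01-02", "allergy": "penicillin", "updated_at": "2021-05-01"},
     {"name": "ann lee", "birth_date": "1970-01-02", "phone": "555-0100", "updated_at": "2023-02-11"}],
    key,
))
```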

One emerging company consolidates health and wellness data into a single, standardized record under secure control of the patient that facilitates seamless data exchange amongst healthcare and life science constituents. The technology leverages syntactic, structural, and semantic interoperability techniques in addition to patient-level interventions when AI cannot resolve the data cleansing automatically.

In addition to focusing on rich medical data, this company continuously fills gaps with real-time, real-world data from multiple sources (e.g., wearable and medical devices), in conjunction with social determinants of health (SDOH) and patient-reported information.

How do we get there?

While companies like this are working to solve the problem, there are cultural barriers in the US that need to be overcome. To truly achieve patient-owned interoperability, data hoarding and profiteering need to be addressed. Healthcare needs to be democratized for a more equitable landscape. Platforms that democratize health information shift people from passive to active participants in their own health outcomes. After all, it is ultimately the patient who bears the burden of adverse health outcomes, not the providers.

Data Governance enabling Transformation, Interoperability and Privacy (TIP) (6 Feb 2023)


By Doug Graham, Director of Enterprise Data Governance, Mercy Health

While introducing data governance to a provider onboarding summit, I asked the question, “What do you think of when you hear the term Data Governance?”

The response… “run the other way!”

By the end of the presentation, the teams were relating to the “purpose” of data governance…

  • It seeks to break down organizational barriers that lead to silos working independently and not “trusting” their peers from other business units when it comes to data.

  • It aims to bring the community of professionals who share common data subjects, like Providers and Locations, into a unified language that allows them to understand one another.

  • It introduces the potential to interoperate data between organizational boundaries when the terms used to describe that data are the same.

  • It facilitates understanding of the footprint of data throughout the organization and organizes it in ways that can be understood by those who are on the front line of maintaining that data.

  • It empowers the caregivers on the front line with the right tools to heal valuable data assets.

Transformation Strategy Alignment

Good Enterprise Business Architecture practices identify an organization’s strategic objectives alongside its core capabilities. Where capability gaps exist, the organization seeks to close them with process improvement efforts. Process improvement efforts invariably involve data. Therefore, to close business capability gaps, data governance must fit into the strategic plan.

Transformation transformation

The prioritized list of projects in the strategic plan can be organized into a new grouping of Data Governance Subject Areas for the organization. Health care can be organized by core subject areas (Domains): Providers, Consumers/Patients, Payers, Locations, Clinical Services, Finance, etc. A project’s focus will generally fall into one of these core domains, and the critical data that needs defining can often be broken down by project. Where common critical data is defined across multiple projects, unifying the definitions becomes imperative: define once, then reuse the definitions, independent of where they are “prioritized.” A new, separate but parallel body of work begins to take shape, referred to as the Data Governance Working Groups. These groups define the common data across organizational business areas in a collaborative setting. With projects set aside, they pre-define the data that becomes the scope of future projects. The beauty of the new process lies in the realization that, in defining the data, you come face to face with a decision on “where” it is to be maintained. This is where interoperability begins to register among business process stakeholders.


Staging the Interoperability Effect

Interoperability affects everything in an organization. When we interoperate, we serve patients according to our strengths and training. When we interoperate with data, we inform consumers according to reliable and “agreed upon” standards. Agreement does not happen in a vacuum; it is a very intentional effort that requires executive attention and focus and resolves down to operational imperatives. This is where the Data Governance Framework comes into play. Distilling an organization that spans multiple regions and departmental boundaries can seem a daunting task. The natural tendencies of leadership influence may prove to align rather well with the common data domain constructs for health care. Seeking out the leadership councils that drive business priorities and shifting their lens to a data domain focus may yield reasonable outcomes.

Now that we have organizational focus on data, we must consider some key technology enablers to accomplish this transformation. The core technology enablers for transformation through governance are the Glossary, Catalog, and Data Quality platforms. Glossaries align business meaning to the domain-based framework across departments and include data “classification.” Catalogs organize physical data constructs across organizational departments and can automate the classification agreed on by the domain authorities in the glossary. The third leg of this governance construct is the data quality connection, which extends the domain decisions regarding classification and reference data authorization and applies measures that report back to the domain authorities on progress toward standardization.
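As a minimal sketch of how these three enablers can connect, the example below ties a glossary term’s agreed meaning and classification to an automated data-quality measure whose results could be reported back to the domain authorities. The term, rule, and field values are illustrative and do not represent any particular platform’s API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class GlossaryTerm:
    name: str                            # business meaning agreed by the working group
    domain: str                          # e.g. Provider, Patient, Location
    classification: str                  # e.g. "PHI", "Confidential", "Public"
    quality_rule: Callable[[str], bool]  # measure reported back to the domain authority

NPI_TERM = GlossaryTerm(
    name="National Provider Identifier",
    domain="Provider",
    classification="Confidential",
    quality_rule=lambda value: value.isdigit() and len(value) == 10,
)

def measure(values: list[str], term: GlossaryTerm) -> float:
    """Share of values passing the term's quality rule, reported to the domain owners."""
    passed = sum(term.quality_rule(v) for v in values)
    return passed / len(values) if values else 1.0

print(f"{NPI_TERM.name}: {measure(['1234567890', '99', '0345678901'], NPI_TERM):.0%} pass")
```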

Privacy Play

The technical tooling that supports data governance implementation, coupled with a renewed awareness of business processes that can be standardized to drive value and strategic alignment, now presents new opportunities to address organizational data sprawl. Governance artifacts give common business meaning to terms, telling the tools what to look for and how “sensitive” those assets are when discovered. Once the catalog tool finds the assets, which are associated with the glossary where meaning and role accountability were determined by the data governance working group, it can report back to the data owners on who is consuming each data asset. This, in turn, enables policy to apply the appropriate data treatment rule. Business processes that were duplicated around data maintenance can now reorganize with confidence and trust in the resulting operational efficiency as master data and data quality practices begin to weave into the new data fabric.

While there are disparities in perception of the necessity and appropriate focus of data governance as a practice, delivering balanced value requires that it be holistic in purpose and integral to the corporate strategic plan, involving the people, processes, and technologies that transform organizational effectiveness.

Delivering Technology in a Rapidly Changing Ecosystem (7 Oct 2022)


By Pete D’Addio, Director, Enterprise Technology, Moffitt Cancer Center

Every day there is a new need for technology. New use cases and opportunities continue to challenge the status quo, if the status quo as we know it still exists. The iPhone, now celebrating its 15th anniversary, revolutionized the world, and the newest and youngest generation does not know life without these devices; their expectation is that information is delivered rapidly and instantly. Healthcare needs to deliver in this space as well.

In a business sector that has traditionally lagged on the transformation maturity scale, how do you pivot to quicker and more effective delivery of technology? Among the many things that came out of the pandemic, one important finding is that healthcare professionals’ understanding and awareness of technology has increased substantially. This leap forward in how technology is delivered now brings a crossroads: the maturity model must include more clinical involvement. What do nurses need? What do doctors need? What do patients need? How can clinicians deliver patient care most effectively? What is the true lexicon they work in now? Internally, we developed a council that brought stakeholders from different areas of the organization together to discuss foundational technology and the roadmap. The most important deliverable is to focus on those business needs and collaborate on the technologies and transformations that bring the most value. This feedback loop is essential to understanding how rapidly demand has grown in these areas, which challenges us to deliver more quickly.

Traditional Infrastructure and Operations (I&O) has to give way to stronger business partnerships and understanding of needs, especially as technology has become so closely aligned with patient care. With a technology focus on the business, it is important to grow the depth of technology beyond the traditional data center, endpoint, and mobile. So the new challenge is how traditional I&O teams engineer and administer these newer technologies in the same ecosphere as traditional technology. Today’s transformational journeys of healthcare organizations are full steam ahead. This is the maturity of transformation. But what is the lift needed not just to grow from legacy systems, but to accelerate that journey quickly? A careful balance. What brings this together is a fully realized roadmap that matures to adopt the new architectures and platforms focused on the deliverables and innovation needed.


Now, an expanded focus on delivering foundational architecture leads to outcomes, today and tomorrow, for better patient care. But what does an organization do to focus on sunsetting current technology platforms? Years ago, it was the ability to virtualize servers; then it was the ability to build in the cloud; then it was building containers to deliver quickly. But what if an organization is still running a large number of physical servers? This is an essential challenge for I&O leaders. Here is where the partnerships between IT and healthcare stakeholders must provide focus on what is needed to move forward. Steering committees and councils are habitually the avenues for this discussion. My organization focuses on this balance and need. We have developed roadmaps with achievable timelines so delivery of new architectures can be completed in step with twilighting the right legacy systems.

As this journey of digital transformation endures, infrastructure and technology teams can accentuate the opportunities to deliver quickly in support of patient care. The technology roadmap that moves maturity forward must prevail for this to succeed. Skillsets and capabilities must also mature.

With the expansion of virtual patient care through the pandemic, one area that continues to accelerate care is home health. The different technologies for home health, such as wearables and other monitoring solutions, bring a new focus. My organization is working collectively to connect all these different devices seamlessly and securely, beyond traditional means. Self-service and automation accelerate these capabilities, delivering critical success factors for patient care. This does not come without challenges. How do we balance the safety and security of technology with enablement? Internal collaboration and partnerships allow the most appropriate architecture to be created, especially with Cyber Security. New partnerships need to be formed to deliver self-service models and join up the interoperability opportunities up and down the stack. In the end, it is important to remain patient-focused.

This expanding technology portfolio increases the need for interoperability. As my organization continues to grow, the demand for additional smart devices, such as patient beds and RTLS (real-time location systems), brings more avenues for data needs. The effort lies in finding the right strategic partners for foundational technology to support these additional data, and in considering how to transport that data effectively. It is imperative to solidify wired and wireless network architectures for the increased density of data, but also to extend other technologies like Bluetooth and IR. What types of devices patients interact with during in-person care, and how to deliver these interoperability mechanisms, are the new focus.

Foundational technology is a building block for the delivery of services, whether through the traditional data center or cloud partners, connectivity, and technology. It is important to continuously evaluate the technology maturity roadmap. The rapid pace of delivery and maturity can certainly cause a loss of focus on the important milestones of the journey. How does traditional I&O balance the need for five-nines (99.999%) availability or better, while also supporting a growing digital transformation practice where trial and error is key to finding the right solution? This is addressed with the right collaboration and expectations set.
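For readers less familiar with the shorthand, the quick calculation below shows what an availability target of "so many nines" allows in downtime per year; five nines leaves only a handful of minutes, which is why it is hard to square with trial-and-error innovation.

```python
# What an availability target of "N nines" allows in downtime per year.
MINUTES_PER_YEAR = 365.25 * 24 * 60

for nines in (3, 4, 5):
    unavailability = 10 ** -nines  # e.g. five nines -> 0.00001
    downtime = unavailability * MINUTES_PER_YEAR
    print(f"{nines} nines: {(1 - unavailability) * 100:.3f}% up, ~{downtime:.1f} min/yr down")
```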

There is an excitement about the possibilities of digital prospects. The importance of organizational vision and support drives how technology decisions bring new value. The foundational technology must support the needs of clinicians, researchers and especially patients in this digital age.

Thoughts on Cyber Intelligence and Blockchain (26 May 2022)


By Rishi Tripathi, Chief Information Security Officer, Mount Sinai Health System

Cyberattacks occur worldwide almost every day, yet it is challenging to learn the type and target of an attack in real time. A vast amount of cyber intelligence goes untapped; even though sharing it could benefit everyone, companies do not want to share sensitive information broadly. In doing so, they may expose themselves to legal or regulatory scrutiny.

Companies certainly see the need for and benefit of sharing real-time cyber intelligence, if there were a way to share more information without revealing too much. Perhaps we should look at creating a Blockchain-based cyber intelligence platform in conjunction with:

  1. Zero-knowledge proof that separates data verification from the data itself. One party (the prover) can prove to another party (the verifier) the possession or existence of some information without revealing all the detailed information.

  2. Multi-party computation on data sets can reveal how many companies have been impacted by a similar attack without revealing the companies’ details. This method allows multiple parties to make calculations using their combined data without revealing their individual inputs (see the sketch after this list).

  3. Homomorphic encryption allows users to perform computations on encrypted data without first decrypting it, protecting the data while letting users run queries on it to gain insights.
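To make the second item more concrete, the toy sketch below uses additive secret sharing, the arithmetic at the heart of many multi-party computation protocols: each company splits a private 0/1 "were we hit?" flag into random shares, and only the combined total is ever reconstructed. Real MPC protocols add authenticated channels, malicious-security checks, and much more; this only illustrates the counting step, and the company names are hypothetical.

```python
import secrets

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(secret_bit: int, n_parties: int) -> list[int]:
    """Split a 0/1 value into n additive shares that individually reveal nothing."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    last = (secret_bit - sum(shares)) % MODULUS
    return shares + [last]

# Three companies privately record whether they saw the same attack pattern.
was_hit = {"company_a": 1, "company_b": 0, "company_c": 1}

# Each company sends one share to every party; the parties only ever add shares.
all_shares = [share(bit, n_parties=3) for bit in was_hit.values()]
column_sums = [sum(col) % MODULUS for col in zip(*all_shares)]
total_hit = sum(column_sums) % MODULUS

print(f"Companies reporting a matching attack: {total_hit}")  # 2, with no single input exposed
```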

This method may enable shared, real-time attack metrics to be exchanged privately and safely, which analytics can use to uncover trends in attack location, type, and sophistication. I would encourage further financial and technical study of this method to ensure its effectiveness and efficiency. More people must collaborate to address this challenge.

Companies worldwide may be able to share real-time data about cyberattacks, while using a key to protect details. Sharing the keys can also become a path to commercialization, where attack details are transmitted via smart contracts with agreed-upon customers, government agencies, and regulators. 

Currently, no major player is using Blockchain to share intelligence and trends around cyberattacks. They still rely on legacy information-sharing methods, often outdated or inaccurate, via data exchange or Application Programming Interfaces (APIs).

Establishing this type of Blockchain could directly connect to a company’s cybersecurity defense infrastructure, which can ingest relevant pieces of information flowing through the Blockchain, protecting the company from new types of cyberattacks.

In cybersecurity, once you’re able to gather verifiable, accurate information about cyberattacks, it becomes extremely valuable to ingest that information into existing technologies deployed to protect the company. 

This method may crowdsource cyberattack defenses. Attacks seen in one part of the world on an individual computer could be transmitted almost in real time using Blockchain. This global communication could allow defensive measures to be set up in near real time. Global sharing can thwart the creation of new hacking groups, as their initial attacks will not succeed, and they will require more time to grow.

Exciting trends and new technologies are emerging to help address cybersecurity challenges, Blockchain being one of them. Several use cases come to my mind involving Blockchain and cybersecurity. For example, the above approach can also be utilized to safely share data with the third parties a company does business with; other combinations of Blockchain and Cryptography may provide unique use cases in cybersecurity.

The best solution may vary by person and organization. My aim, instead, is to provoke conversation that perhaps inspires others to develop leading-edge solutions to cybersecurity problems that seemed intractable years ago, well before the technological and innovative advances we are able to leverage today.

Managing the Evolving Data Landscape in Cancer Care and Research (8 Apr 2022)


By Theodora Bakker, Director, Data Stewardship and Integration and Atti Riazi, SVP & CIO, Memorial Sloan Kettering Cancer Center


With a heavy concentration on translational and clinical research at Memorial Sloan Kettering Cancer Center (MSK), there is an ever-growing need to leverage data across the clinical, research, and education missions. While, as a cancer center, our organization has a singular disease focus, our data and technology needs are consistent with those of the larger healthcare industry, and our overall outlook reflects the demands of larger society. Atti Riazi says, “We must go through a radical revolution in terms of how we view IT—no longer seeing it as things and products, but instead focusing on the experience, intelligence, and insights from all the technology we deploy.”

Advances in differentiating types of data storage and federation, as well as the ability to create an access and delivery layer across disparate data sources, have fostered the emergence of a different way to think about data—the data fabric. There are a few transformative core components of our data fabric, all housed in a strong metadata layer: a concept-driven catalog, data lineage, master and reference data management, and data de-identification. These contribute to the advancement of healthcare by providing clarity and transparency while also protecting sensitive data classes like PHI (Protected Health Information) and PII (Personally Identifiable Information).

Our catalog and use of standard ontologies in biomedical research and patient care allow our fabric to make the meaning of our data transparent—clarity previously obscured by the myriad independent transactional systems used in healthcare. Since clinical and administrative data is often spread across multiple systems, it is challenging for users to understand what the data means and how it connects across systems. A billing system might describe a patient’s diagnosis and comorbidities using standard billing codes, while the impact of drug interactions is housed in the EHR and outcomes are buried in provider notes. The context of this integrated information is critical in both the clinical and research realms. By extracting the meaning of each of these domains of data and representing them in an integrated catalog, users can find new pathways of care and create new insights for research.

While the focus of healthcare must always remain on the provider-patient relationship, the administrative functions of healthcare enable better care. Operations must look at data in the aggregate, which lays bare the inconsistencies and quality issues across a medical center. A data fabric allows data to be selected and managed through metadata, providing the ability to track data through its lifecycle and pinpoint opportunities for quality improvement. With a robust data stewardship program, an organization can use master and reference data management to create a unified picture of its data, allowing operations to manage interactions organization-wide. The regulatory and ethical considerations around the privacy of an individual’s data are continuously advancing, and technology is emerging to automatically de-identify data as it moves through systems. Our data fabric transforms our ability to use near real-time data while protecting data privacy. There is no longer a requirement to send unique datasets through manual de-identification code, delaying the use of data in the moment.
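A minimal sketch of that automated de-identification step might look like the following: direct identifiers are dropped and the medical record number is replaced with a keyed pseudonym so records can still be linked downstream. The field list and key handling are simplified assumptions; a real pipeline must also handle dates, free text, and the applicable regulatory standard (e.g., HIPAA Safe Harbor or expert determination).

```python
import hashlib
import hmac

DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn"}

def deidentify(record: dict, secret_key: bytes) -> dict:
    """Drop direct identifiers and replace the MRN with a keyed, consistent pseudonym."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "mrn" in cleaned:
        token = hmac.new(secret_key, str(cleaned["mrn"]).encode(), hashlib.sha256)
        cleaned["patient_token"] = token.hexdigest()[:16]
        del cleaned["mrn"]
    return cleaned

record = {"mrn": "000123", "name": "Ann Lee", "phone": "555-0100",
          "diagnosis_code": "C50.911", "encounter_date": "2022-03-14"}
print(deidentify(record, secret_key=b"rotate-and-store-securely"))
```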


Present day, the value of data in understanding and controlling infectious disease is on the forefront of many people’s minds. Atti Riazi says, “What is the benefit of knowing about COVID-19 or Ebola a month earlier, or understanding that a few less inches of rain will create drought, food shortages, unrest, and instability in a region the following year? Data has great value in providing insight into so many social, health, and environmental issues; by sharing information freely, we can better predict such disasters and take much more effective action. Tech companies can help governments, NGOs, and civil society with big data projects through funding and providing expertise, tools, and data itself.”

The technology behind our integrated data fabric layer contributes to the transformation of our industry by enhancing the meaning of data and enabling more flexible use, with the data constantly in motion through its lifecycle—although, as with any transformative program, it is not without its challenges. The technology is still being conceived, and a stable, integrated technology has yet to emerge in the industry. Today, organizations that have a fully realized fabric have invested millions of dollars and years to achieve those ends, and a fabric platform approach to data management is outside the reach of many medical centers.

With any new technology, the early adopters will suffer the wounds of the ‘bleeding-edge’ to enable true transformation of the industry; it requires the foresight and will of those institutions to lead healthcare, and clinical and translational research, into the next era. In addition to the burden of commitment, any transformation bears the delicate challenges of change management. Our approach uses two main components—education through data literacy initiatives, and tool training as users bring specific use cases—to transform how we think about an organizational data platform.

Adoption across the organization is varied, with some areas showing reluctance to adopt the change, while others are racing to embrace the new technology. The former group poses the challenge of requiring tactical and significant resource commitment to help our users adopt the new approach to thinking about data. The latter group poses the challenge of demanding changes faster than the technology can be built. To remain on a successful trajectory, our program has adopted a concentrated approach to change management, a change we feel is a departure from a traditional ‘build it and they will come’ mentality. Through these efforts, and mindful of the overall benefit of data globally, we are trying a wide variety of outreach, training, and communication strategies, and measuring the success of each so we can continuously optimize not only the technology we are building, but also the enterprise-wide adoption. 

Being data-driven leads to a new organizational structure (22 Jun 2021)


By Patrick McGill, MD., EVP & Chief Analytics Officer, Community Health Network

Many organizations aspire to be “data-driven” or “analytics-focused,” but what do those terms really mean? Community Health Network was no different, embarking on a data and analytics improvement journey that resulted in a unique organizational structure: forcing form to follow function. “Form follows function” is a principle typically associated with industrial design, meaning the shape of an object should relate to its intended function or purpose. To extrapolate: how an organization is shaped should relate directly to its intended outcome.

After years of frustration with the lack of data or insights regarding how the business was performing, Community Health Network (CHNw) in the summer of 2018 engaged Gartner for an enterprise-wide evaluation of the analytics function and the future-state vision for analytics as a strategic asset. Prior to this assessment, information was scarce despite data being abundant. The Analytics department, led by a Business Intelligence Director, was a subset of the Information Technology department, with limited connection to the overall strategy of the business. Prioritization of projects and resources was based either on “first-come, first-served” or on whichever executive could strong-arm reports into being generated. Ultimately, this prioritization method led to operational units hiring their own data analysts, which created more silos because those analysts did not have access to the central data sets.

As a result of the Gartner assessment, recommendations were made to create a freestanding Department of Network Analytics, headed by a new position: the Chief Analytics Officer (CAO). Additionally, as part of the new Analytics department, a Center of Excellence was formed. With the CAO reporting to the CEO, data and analytics had representation at the organization’s highest levels. The Analytics Center of Excellence was charged with reimagining how data was used within the organization and with moving from mostly descriptive reporting to more advanced analytics, including predictive models powered by machine learning. Ultimately, the goal was to tie network strategy to analytic output so that business decisions are informed by data.

After approximately nine months under this structure, additional organizational changes were put in place. Previously, all Information Technology functions reported to the Chief Operating Officer (COO), and Clinical Informatics activities fell under the Network Chief Physician Executive (CPE). To continue advancing the goals of a data-driven organization, Information Technology, led by the Chief Information Officer, moved under the CAO. Clinical Informatics, led by the Chief Medical Information Officer (CMIO), also moved under the CAO, along with Regulatory Reporting and all Enterprise Services, including Business Process Management and Continuous Process Improvement.

Ultimately, after these changes, the Chief Analytics Officer has operational oversight of all network data and analytics (excluding finance), all technology and digital transformation, clinical informatics, process improvement activities, and regulatory reporting. This allows for direct connections between network strategy and analytics, technology, and business processes. In the truest sense of form following function, the organizational structure at Community Health Network is shaped to let data drive network strategy by driving business outcomes.


In a year such as 2020, this structure was incredibly beneficial. Early in the pandemic, data was scarce and technology needs were changing rapidly. For example, the conversion from a purely in-person care delivery model to a nearly 100% virtual model required expedient implementation of technology to support it. The need to measure this new delivery model was no less important. Being able to predict volumes of COVID-19 infections and align them with inpatient operational changes was critical. The strategy focused on preparing for and taking care of infected patients while protecting our healthcare workforce.

As the pandemic continues to alter healthcare delivery, thus requiring continued use of virtual care and digital tools, the alignment of the support services with enterprise strategy will continue to be of utmost importance. After the pandemic passes, technologies such as electronic check-in, early hospital discharge programs, and remote chronic disease monitoring will quickly become the expectation for many patients and health systems.

Silos have long existed in many industries; healthcare is no different. Often, organizational structure contributes to the continued difficulty in communication and collaboration. The following interventions will assist in the journey.

First, look at the organizational structure and understand whether it is truly working. Historical and legacy structures that have been in place for long periods should be examined. This is not to advocate reorganization for the sake of change; however, taking a deeper look at current structures is well worth the exercise.

Second, explore if non-traditional reporting structures might benefit the organization. Certainly, having the CIO report to the CAO is not typical. However, it is effective in our organization.

Third, ensure team members have a proper understanding of the strategy and vision of the organization. This is often overlooked for services typically considered support functions, such as IT and Analytics. Additionally, helping these team members recognize the value of their work will also improve employee engagement.

Fourth, these functions need a consistent voice at the highest level of the organization. In order to be data-driven and truly digitally transformed, the employees responsible for this change must be earnestly represented throughout the process, not merely in appearance.

Finally, organizational structures are typically complex with legacy personalities. Finding the right structure to be data-driven can be challenging and demanding. It requires crucial but honest conversations. When performed with the organizational strategy and best interest at the center, the outcomes will be incredibly rewarding for all involved.

5 Common Pitfalls to Avoid on Your Organization’s Data-Driven Journey (28 May 2021)


By Jason Buchanan M.D., M.S., Clinical Informatics Officer, Baylor College of Medicine

Data is the new currency of the times, and it will be the reason organizations either sink or swim. The ubiquitous use of electronic health records, combined with the arrival of data-rich fields such as genomics, precision medicine, and the Internet of Things, has caused a deluge of data. While maintaining high standards in carrying out their missions, healthcare organizations are treading water as they attempt to collect, analyze, and translate data in impactful and actionable ways. In healthcare, as in most other industries, data is the key driver of decision-making. It is the compass that guides operational and financial decisions on the journeys organizations embark upon to satisfy their stakeholders and those whom they serve. With that in mind, there are five general pitfalls to be mindful of before setting sail on your data analytics journey.

1. Not Determining if Your Journey is One of Exploration or One of Destination

One of the beauties, as well as one of the pitfalls of data and data analytics, is that it can lead you down many different paths. There are two general approaches to looking at data. The first is an exploratory approach in which you have a broad question, with the liberty to see where the data leads you. The second approach is more targeted, where there is a specific question and the analytics can be tailored precisely to answering that question. Each approach has its benefits, drawbacks, and optimal use scenarios.

Additionally, each approach can lead you to the same conclusion, but with greatly differing expenditures of time and resources. Therefore, the "captain" (the leader or leaders in charge of the project) needs to clearly convey the question to be answered and the type of approach expected. This is particularly important for leaders who are new to an organization, assuming a new role, or working with an unseasoned analytics team.

2. Not Having a Complete Crew

Once the question to be answered has been conveyed and the approach to analysis has been determined, the temptation is to start drafting and implementing the data analytics plan immediately. Before the analysis begins, consider taking the opportunity to pause and examine your team. Is there a subject matter expert within, or affiliated with, your organization who can provide insight and a different perspective to help focus your mission? Does your analytics team have the resources and expertise for this particular project? Are you engaging other departments (such as finance, billing and coding, purchasing, and social services) whose different data sources can make the outcome of your analysis more organizationally holistic? Interdepartmental collaboration helps avoid myopia, frequently produces a more robust outcome, and helps ensure that the data analytics project is well aligned with the organization's mission.

3. Not Checking the Integrity of the Vessel

Fidelity and integrity of data are paramount. Much occurs behind the scenes in the collection, aggregation, curation, and display of data. Errors are always a possibility, and you do not want to set sail on your project with an unrecognized hole in the hull of your ship. Therefore, it is critical to ensure that the data being analyzed is valid (correct, accurate, and reliable) before it is used in organizational decision-making. The type of data validation testing will vary widely depending on many factors, including organizational resources, the size of the data sets, the project type, and the kinds of data to be analyzed. Most importantly, the organization should have a documented and vetted data validation plan in place that is consistently followed.
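By way of illustration only, and not something from the original article, a data validation plan often reduces to a handful of repeatable checks run before a data set is cleared for analysis. The minimal Python sketch below assumes hypothetical column names (mrn, age, encounter_id) and deliberately simple rules; a real plan would be broader and tied to the organization's own data dictionary.

import pandas as pd

def validate_encounters(df: pd.DataFrame) -> list[str]:
    """Run basic validity checks on a hypothetical encounter extract.
    Returns a list of human-readable issues; an empty list means the
    extract passed these (deliberately simple) checks."""
    issues = []

    # Completeness: every record needs a medical record number.
    missing_mrn = df["mrn"].isna().sum()
    if missing_mrn:
        issues.append(f"{missing_mrn} rows are missing an MRN")

    # Plausibility: ages should fall within a sane range.
    bad_ages = (~df["age"].between(0, 120)).sum()
    if bad_ages:
        issues.append(f"{bad_ages} rows have implausible ages")

    # Uniqueness: encounter IDs should not repeat.
    dup_ids = df["encounter_id"].duplicated().sum()
    if dup_ids:
        issues.append(f"{dup_ids} duplicate encounter IDs")

    return issues

if __name__ == "__main__":
    sample = pd.DataFrame({
        "encounter_id": [1, 2, 2],
        "mrn": ["A1", None, "A3"],
        "age": [34, 151, 67],
    })
    for problem in validate_encounters(sample):
        print("VALIDATION:", problem)

Checks like these are cheap to automate and, once documented, give the "consistently followed" part of the plan something concrete to point to.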

4. Lack of Communication Among the Crew

It is well known that overcommunication is typically the mantra in successful, high-functioning, high-reliability organizations. Communication breakdowns are one of the top causes of project failure and will derail your mission. Frequent team communication, using multiple mediums (email, text, video, etc.), helps ensure clarity and alignment with the goal. Communication is particularly important in highly data-driven teams, as some of your most technically talented team members may need extra attention when it comes to communicating their progress and needs. The importance of communication has only been magnified during the COVID-19 pandemic, which has separated the employees of many organizations by distance, time, and connectedness.

5. Losing Track of Your Position

Data and all of its possibilities are intoxicating. It is quite easy for the waves of data, with their associated eddies and currents, to sweep your project into unintended areas, causing it to drift steadily away from the initial goal. Therefore, it is important to stay focused on the question presented and let the data be your guide. Just as one does not sail without tracking one's position, your data team should periodically revisit the initial question to ensure the analysis is still heading toward the intended goal. A data project may not seem fruitful at its conclusion; keep in mind, however, that the analyzed data can still be used to reshape a prior question, or formulate a new one, whose analysis brings your organization one step closer to its overall objective.


These are fascinating times as we see the data and health IT landscape rapidly change before our eyes. The above pitfalls provide a set of common mistakes to avoid as your organization sets forth on its respective data-driven journey into uncharted waters teeming with rich and untapped possibilities.

The post 5 Common Pitfalls to Avoid on Your Organization’s Data-Driven Journey appeared first on HealthTech Magazines.

Efficient, Reliable, and Faster Data – A Hierarchical & Modular Approach https://www.healthtechmagazines.com/efficient-reliable-and-faster-data-a-hierarchical-modular-approach/ Wed, 26 May 2021 14:58:01 +0000 https://www.healthtechmagazines.com/?p=4955 By Victor Bagwell, Associate Chief of Information Science, Hackensack Meridian Health Advancements and increased availability of real-time data, medical imaging,

By Victor Bagwell, Associate Chief of Information Science, Hackensack Meridian Health

Advancements in, and the increased availability of, real-time data, medical imaging, and big data analytics have driven a rapid increase in healthcare data volume. In parallel, there is growing demand for access to that data and for applications that leverage more sophisticated business intelligence, data science, and artificial intelligence (AI). At each hospital, this demand spans clinical data, hospital claims, patient behavior and sentiment, social determinants of health, R&D, regional and national public health data, and more. Analysts, data miners, data scientists, researchers, and others are demanding more data at an accelerating pace.

The Problem

Collecting data is not enough; it must be put into a form that can be used. To do this, data must be processed, normalized, cleaned, aggregated, and prepared so it can be used effectively. Regrettably, different applications store their data in different architectures and database systems, and there are myriad standards and interoperability issues with which to contend. Ultimately, the heavy lifting of data and the management of the data pipeline (Figure 1) fall to data integration teams that build data warehouses and operational data stores. Data integration is at the heart of the issue. It is a complex process that requires standardized data architectures and the development of extract, transform, load (ETL) code to move and modify the data so it conforms to those architectures.

Figure 1. Data Integration Pipeline

Further, data integration requires managing changing data sources, end-user requirements, access, and business needs. This involves thousands of ETL jobs and hundreds of terabytes of data. Consideration must be given to the order in which data is processed, the load on the system infrastructure (e.g., CPU, memory, disc I/O, storage), and the time when data needs to be refreshed and available to end-users. Achieving successful data integration is risky, challenging, and costly in time and resources. How can the data be delivered in a safe, secure, reliable, scalable, timely, and cost-effective way?
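To make the shape of a single ETL job concrete, here is a minimal sketch in Python. It is only an illustration of the extract-transform-load pattern described above, not the author's pipeline: the source column names (PatientID, EventCode, EventTime), the adt_events table, and the use of SQLite as a stand-in warehouse are all assumptions.

import sqlite3
import pandas as pd

def run_adt_etl(source_csv: str, warehouse_path: str) -> int:
    """Extract a hypothetical ADT feed from CSV, normalize it, and load it
    into a warehouse table. Returns the number of rows loaded."""
    # Extract: pull the raw feed as delivered by the source system.
    raw = pd.read_csv(source_csv)

    # Transform: normalize names, types, and codes to the warehouse model.
    staged = pd.DataFrame({
        "mrn": raw["PatientID"].astype(str).str.strip(),
        "event_type": raw["EventCode"].str.upper(),       # e.g. ADMIT / DISCHARGE / TRANSFER
        "event_ts": pd.to_datetime(raw["EventTime"], errors="coerce"),
    }).dropna(subset=["event_ts"])                         # drop rows with unparsable timestamps

    # Load: append to the warehouse table.
    with sqlite3.connect(warehouse_path) as con:
        staged.to_sql("adt_events", con, if_exists="append", index=False)

    return len(staged)

In the traditional model described next, each new work request tends to repeat some variation of this code end to end.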

Traditional Approach

The traditional method is to create ETL code to handle each work request. Sometimes code is copied from a prior, similar request to accelerate development of the new one. The ETL is then tested and scheduled. Any time a change is required, the entire code base needs to be re-tested front-to-back. Any time a data source is changed, all of the ETL jobs that use that data source must be modified and tested, one by one, until every one is verified. This method is inefficient, requires enormous resources, and does not scale. Because of its complexity, such code becomes difficult to understand over time. Even with documentation, there is a risk that changing the code will produce unforeseen effects.

The Solution: Hierarchical & Modular Approach

In search of a solution, we considered a modular approach in which the data pipeline is designed as distinct ETL modules that are isolated and work independently. In some ways, this is analogous to an object-oriented or microservices-based system architecture. By isolating ETL functionality into hierarchical, modular components, master ETL jobs simply re-use existing ETLs from the master library without redeveloping or retesting them. New ETL can be added, reviewed, and considered for inclusion in the hierarchical modular library. When changes are needed, only the affected ETLs are touched. By leveraging this hierarchical and modular approach, centralized core ETL modules help ensure a single source of data truth for the enterprise.

Examples of modular ETL library functions include de-duplication, patient demographics, diagnoses, procedures, labs, and admission/discharge/transfer (ADT) processing.
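As a hedged sketch of how such a library might look (not the authors' actual code), the example below models each module as a small function with a common DataFrame-in, DataFrame-out signature, so a master ETL chains existing modules instead of re-implementing them. The module names echo the examples above, while the columns (sex, zip, icd10) are hypothetical.

import pandas as pd

# --- Library modules: each does one job and can be re-used by any master ETL ---

def dedupe(df: pd.DataFrame) -> pd.DataFrame:
    """Core module: drop exact duplicate rows."""
    return df.drop_duplicates()

def normalize_demographics(df: pd.DataFrame) -> pd.DataFrame:
    """Core module: standardize hypothetical demographic fields."""
    out = df.copy()
    out["sex"] = out["sex"].str.upper().str[:1]        # 'female' -> 'F'
    out["zip"] = out["zip"].astype(str).str.zfill(5)   # restore leading zeros
    return out

def map_diagnoses(df: pd.DataFrame) -> pd.DataFrame:
    """Core module: keep only rows with a usable diagnosis code."""
    return df.dropna(subset=["icd10"])

# --- Master ETL: composes existing library modules instead of rewriting them ---

MASTER_PIPELINE = [dedupe, normalize_demographics, map_diagnoses]

def run_master_etl(df: pd.DataFrame) -> pd.DataFrame:
    for step in MASTER_PIPELINE:
        df = step(df)
    return df

The point of the design is that a change to, say, the de-duplication logic is made once in the library and inherited by every master ETL that uses it.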

Figure 2. Non-modular ETLs to modular ETLs

Figure 2 compares the traditional method of creating non-modular ETLs with the modular approach. The red boxes indicate new ETL development work; the green boxes represent existing library ETL modules. In the modular example, the MASTER ETL leverages the existing library and adds an ETL code segment (CS) only where one does not already exist. In this hypothetical example, the traditional method requires 15 code segments while the modular method requires only 7, reducing the work by more than 53%.

Through team collaboration and investigation, we mapped the function of the new and legacy ETL code (core, primary, secondary, etc.) that already existed in the environment. Many functions were easily grouped into hierarchies based on what they did and on their logical position in the data pipeline. Next, we looked within each ETL and found cases where complex coding was heavily duplicated across many ETLs. Working with end-users and coders, we deconstructed these into their root component functions, and each ETL code segment was then added to the appropriate hierarchy. This was a large project, so we worked with end-users to prioritize the most important ETLs and targeted them first, providing operational relief to those areas as quickly as possible.

Benefits

In practice, the hierarchical and modular ETL model produced a realized 45% decrease in time to delivery and FTE labor. Further, by reducing duplicate storage, the overall required disc storage decreased by 30%. Future modifications to the ETL code will require only the centralized library module to be modified and tested, avoiding extensive across-the-board code changes and retesting. During the pandemic, the ability to accommodate changes quickly has been an important win. Over time, as the hierarchical ETL module library grows, the need for additional modules will decrease, providing further efficiencies and return on investment.

What we learned

Collaboration, communication, taking a step back in order to leap forward, and building on a solid, scalable, and extensible foundation were all important components of reaching this solution. Optimization is key. When possible, avoid redundant development, testing, and storage. Look at the systems holistically. Spend a little more time the first time around to do it right, rather than having to refactor, or invest far more time to modify or scale up, later.

The post Efficient, Reliable, and Faster Data – A Hierarchical & Modular Approach appeared first on HealthTech Magazines.
