Next Generation BI Platforms

Challenges with Existing BI Platforms and Solutions

For the last few years, enterprise BI portal platforms like SAP BusinessObjects, MicroStrategy and MS SQL Server Reporting Services have dominated the BI and analytics landscape. These BI platforms had the following key features and characteristics:

  • Development of customized reports on top of the enterprise data warehouse
  • The data warehouse was a point solution built around the custom reporting needs of the enterprise
  • Enterprise IT teams would write a lot of custom code to extract data from disparate application databases and load it into this data warehouse
  • IT would provision the resulting reports for end users on a central collaboration portal provided by the BI platform
  • Data security and access control were tightly governed by IT teams

Enterprises faced several challenges with these BI platforms:

  • The line of business (LOB) users had to completely rely on IT teams for their reporting and analytics requirements
  • The IT team had to translate complex business requirements provided by LOBs to custom reporting and analytics solutions. This led to many implementation gaps, especially for enterprises in niche domains such as life sciences, healthcare, banking, etc.
  • With the advent of cloud, enterprises had some data residing in the cloud and some data residing on premises. The data load solutions provided by these platforms could not manage this hybrid landscape gracefully.
Traditional BI vendors tried hard to address these challenges by integrating user-driven data discovery solutions with the rest of their stack. However, since the overall approach remained IT-centric rather than user-centric, these solutions failed to capitalize effectively on disruptive technologies such as in-memory databases, semantic models and next generation visualization.

Next Generation BI platforms

Tableau, the flag bearer of next generation BI platforms, took these challenges head on. It revolutionized enterprise BI with a paradigm shift from IT-centric to user-centric delivery, providing interactive and advanced analytics to end users without requiring them to have BI skills. By moving away from a centralized, IT-driven approach to a decentralized, user-centric one, it empowered business users with greater control over visualization and data.

This had to be delivered without sacrificing data security or appropriate access control. The platforms provided IT teams with data security controls through governance workflows built at various levels: governed data discovery, granular data provisioning, user-based security and application-based security.

The key parameters on which enterprises judge these solutions include:

Intelligent Data Discovery

  • End users' ability to search, capture, store and promote diverse semantic and traditional meta models
  • Ability to create user-defined sets, groups and hierarchies
  • Data lineage and data blending from varied data sources, including multi-structured data

Interactive Exploration and Visualization

  • New interactive visualization controls such as heat and tree maps, scatter plots and geo-spatial views
  • Analytics friendly visual controls like gauges and sliders

Platform and Data Administration

  • Multiple levels of security at the discovery and provisioning stages with tight user and application level access control
  • Managing scale, performance and high availability

Ability to support existing data sources

  • Bespoke and ad-hoc query support
  • Re-usable semantic layers to enable users to navigate available data sources
  • OLAP support for grouping, slicing/dicing data and ‘what if’ analysis

Mobile and Social Enabling features

  • Interactive and published content should be available on mobile devices
  • Interactive analysis should support native controls and features including camera, location awareness and natural-language query

SDK with APIs

  • Ability to plug analytics into the application workflow using APIs

 

Analysis of Next Generation BI Platforms Used at Nitor

This section provides a brief analysis of the strengths and weaknesses of the next generation platforms used at Nitor.

Other Next Generation BI Platforms

Nitor has also been closely tracking other next generation BI products such as Qlik, MicroStrategy and SAP Lumira (integrated with SAP BusinessObjects and SAP HANA) with its customers. We will enhance our analysis as we continue to explore newer next generation BI platforms with our customers.

Note:

  • Primary Source: Nitor’s experience in analyzing and implementing these tools
  • Other Sources: Websites of various next generation BI platforms, Gartner reports and other publicly available information

“Predicting the future”

The amount of healthcare data is growing at breakneck speed, due in part to the pace of technological innovation, the proliferation of patient engagement solutions and technologies, and strategies for population health management. It is therefore getting difficult to gather and generate valuable information. The same data is often captured from multiple sources, leading to redundant data and information, and the ability to extract relevant and important data is becoming a challenge.

The data is captured from various inputs such as EHRs (Electronic Health Records), patient portals, and ancillary systems: eRx, labs, PMS, billing, claims management, insurance plans, and wellness & preventive care. How can an organization derive the most benefit from this? How can the government identify trends in the population? The key here is to SPOT THE TREND.

How can that be achieved? One option is to build a robust system that has all the modules and workflows integrated. Is this possible? Yes, it is. The main question is feasibility: can a hospital scrap its existing software and buy new software? No, it will not. Another approach is to build a platform that integrates multiple sources and generates meaningful information. This is a feasible and viable option: the sources can be integrated onto a platform to collect the data, and the data can be massaged and scrubbed to generate, analyze, and predict trends.

Is predicting trends as simple as it seems? Not really. It requires in-depth healthcare knowledge as well as awareness of which trends would add value and benefit the consumer.

The healthcare world is moving at a pace at which every IT company needs to be on a fast track. It would not be wrong to predict that the healthcare industry will yield the maximum benefit by collecting and analyzing data and predicting trends!

Top 6 concerns of Technology Development Managers and CTOs at EHR ISVs

Electronic Health Records (EHRs) have been gaining traction. EHR systems and technologies, including processes for EHR implementation and integration, have captured the attention of ISVs.

We have been interacting with CTOs, technology mentors, and heads from various EHR ISVs from North America. We realized that there are many common concerns on the development front.

While we will comment on methodology, communication, code management and engagement-level topics in separate blogs, here we enumerate the business concerns shared by top executives.

  • In most cases, a techie mindset prevails, lacking business and regulatory perspective. Development teams, whether in-house or third-party, do not really dig into the business aspects; they feel happy about the technology depth and build something that never gets user acceptance.
  • Knowledge transfer overheads for healthcare domain basics that customers expect vendors to know anyway arise while developing specifications, compounding time, effort and cost and leaving users unhappy.
  • Difficulty in integrating data with existing and new healthcare systems leads to the creation of yet another island app. Everyone has something already implemented, and that investment needs to be capitalized, not written off. The approach must consider the integration and migration needs of data in a secure manner, as mandated by regulators.
  • Adapting to changing regulatory needs such as MU1/2/3 is important. Regulators' guidelines on the Meaningful Use stages have a bearing on the way EHR products are built, so business aspects must be well understood by product and data engineering teams.
  • No data engineering for insights and decision-making leaves users less empowered. EHRs collect a lot of data; such data is wealth and must be crunched to identify insights and patterns, aid decision-making, and make predictions.
  • Lack of knowledge of clinical vocabularies leads to the odd situation where the right hand does not know what the left hand is doing. Product teams must know clinical terms and references to adequate depth to translate that knowledge into a robustly engineered product.

In the next blogs, we will articulate some areas of data engineering as well as engagement trends.

EHRs in the future…

In the last few years, the healthcare industry experienced a great makeover due to radical changes in healthcare IT. In 2010, the United States enacted "Obamacare" (the Affordable Care Act), the most significant regulatory overhaul of the industry in decades. After this, the entire healthcare industry experienced rapid changes in technology and compliance. In 2010, the goal for most physicians and large hospitals was to implement EHR systems and technologies. As we enter 2015, most physicians and big hospitals have moved from paper to EHRs. So what is next on the menu for EHRs?

The focus of healthcare IT over the next two years will extend beyond EHR development and implementation. Changes in healthcare compliance through 2020, along with new technologies, will open new avenues for the healthcare IT business. Currently, the pain area healthcare professionals face is interoperability between multiple healthcare entities such as patients, hospitals, labs, payers, and pharmaceuticals. Healthcare firms will focus on building solutions that connect these entities. Healthcare IT will also start gearing up to implement the third stage of Meaningful Use, and Clinical Decision Support integration with EHR systems will be a key area of Meaningful Use 3 implementation.

Data analytics will be the most watched arena across the healthcare IT industry. An effective layer of BI and analytics around EHR systems will bolster the delivery of cost-effective, quality care to patients. By 2020, EHR systems will be a point of convergence from which the healthcare industry expands in multiple directions, and EHR systems will mature by implementing the patient-centric model. This will help achieve the federal government's target of offering quality, cost-effective healthcare services to all citizens.

Money (benefit?) is here… RCM!

Healthcare as a business is evolving rapidly, partly driven by changing reimbursement models. Health information professionals are expected to be on the front lines of initiatives aimed at improving population health, increasing patient engagement and reducing overall healthcare costs.

That, in turn, is having a profound effect on payments and governance. As providers adjust their processes in reaction, it is important for the revenue cycle to keep pace with value-based models that increasingly tie reimbursements to payer definitions of quality and clinical appropriateness.

Healthcare revenue cycle management solutions can also help practitioners ensure they receive incentive payments for participating in government programs such as the Physician Quality Reporting System and E-Prescribe, further increasing a practice's revenue. Other coming advances in revenue cycle management include charity screening and propensity-to-pay analysis. These measures can help physicians decide how to handle patients who may be unable or unlikely to pay their medical bills, without compromising the financial success of the practice. Outsourcing this process makes it more cost-effective to provide these services, as administrative staff do not have to spend their billed hours dealing with these tasks. Thus, RCM systems, methodologies, and technologies provide viable solutions to all players involved.

Data Modeling Standards and Guidelines

Logical data models and database models are essential components of today's technology landscape. The following sections list data modeling standards and guidelines, and outline some data modeling principles.

Conceptual Data Modeling (CDM)

  • CDM consists of data entities and their relationships.
  • CDM describes key business information by subject area from a data perspective.
  • CDM should be divided into subject areas of manageable size. In practical terms, this means a model usually has between 4 and 15 entities per subject area.
  • Every subject area must have a unique title.

Logical Data Modeling (LDM) Standards and Guidelines

  • A logical data model corresponding to the CDM also has associative entities to resolve many-to-many relationships, is fully attributed, and is normalized to Third Normal Form (3NF).
  • If a CDM was used as a foundation for adding details to develop a logical data model, then the non-specific relationship line between entities will be replaced with identifying or non-identifying relationships.
  • A LDM also shows all native (that is, non-foreign key) primary key attributes and non-key attributes in the attribute area.
  • A fully attributed logical data model will be in Third Normal Form (3NF). This means that each entity instance has exactly one unique record, all non-key attributes fully depend on the primary key attributes, and no non-key attribute depends on any other non-key attribute. For example, storing a customer's name directly in an Order entity violates 3NF, because the name depends on the customer identifier rather than on the order's key; it belongs in the Customer entity.
  • Depending on the particular logical data modeling methodology and tool used, there are a number of acceptable ways to indicate cardinality or multiplicity on the ends of relationships between two equally important entities.

Physical Data Model (PDM) Standards and Guidelines

  • Designate a unique primary key column for every table.
  • Each column name should contain all of the elements of the logical attribute from which it was derived, but should be abbreviated to fit within the maximum length (for example, a logical attribute such as Customer Account Number might become CUST_ACCT_NBR).
  • Do not use hyphens in table or column names because some programming languages interpret hyphens as subtraction operators.
  • Implement table and column names in a way that is supported by all target DBMS tools.
  • The physical model will assign lengths and data types to all columns. Data types should be specific to the target DBMS tool.
  • The physical data model will, at a minimum, provide examples of possible values for identifier, indicator, and code columns.
  • A certain amount of denormalization is usually necessary when implementing the physical data model.
  • Estimate the expected storage requirements for each table based on the size of each row, expected growth, number of rows, and archiving requirements.
  • Understand the capabilities of the specific database product. Performance improvements may be realized by taking advantage of features such as clustered indices, caching, and index optimization.

Top 8 Considerations of Data Modeling

Recently, we published a whitepaper on the data modeling approach and the processes involved. There, we discussed the top eight considerations for standard and logical data models. The following is a summary of the important data modeling guidelines.

Data Modeled Well:

  1. Aligns with business very well
  2. Connects with data and scales for the future
  3. Enables good governance and integrity of data across the organization

The following diagram shows the top eight considerations:

[Diagram: Top 8 Considerations of Data Modeling]

  • Model Correctness:
    • Ensure that the model accurately captures the subject matter.
    • Make sure that the design represents the data requirements.
    • Check the correctness of data elements whose formats differ from industry standards.
    • Fix incorrect cardinality and incorrectly defined keys.
  • Model Completeness:
    • Does the scope of the model exactly match the requirement?
    • Can a model be complete yet incorrect? Incomplete yet correct?
    • Ensure all relationships are shown, and clarify any ambiguously defined terms.
  • Model Structure:
    • Standard modeling practices, independent of content
    • Entity structure review
    • Data element review
    • Relationship review
  • Model Flexibility:
    • Ensures that the correct level of abstraction is applied to capture new requirements.
    • Achieves the right level of flexibility.
    • Proves there is value in every abstraction situation.
  • Modeling Standards & Guidelines:
    • Ensures correct and consistent enterprise, conceptual, logical, and physical levels as per standards & guidelines.
    • Uses the correct names and abbreviations.
  • Model Representation:
    • Optimal placement of parent and child entities
    • Intelligent use of color in grouping or highlighting entities
    • No relationship lines crossing each other or passing through unrelated entities
    • Optimal use of subject areas
    • Maximized readability and understanding
  • Physical Design Accuracy:
    • Ensures that the design reflects the real world and is specific to the application
    • Considers null values
    • Uses partitioning
    • Utilizes proper indexing and space
    • Considers denormalization
  • Data Quality:
    • Ensures that the design and the actual data are in sync with each other.
    • Determines how well the data elements and their rules match reality.
    • Avoids costly surprises later in development.

Is Interoperability the new bubble?

The EHRs developed over the past decades are expected to be interoperable – is this a "huge expectation", "just another wish", or is there "nothing so great about it"?

We have become so accustomed to carrying medical records from one hospital or physician to another, and to going through grueling rounds of medical tests and reports, that we never questioned the status quo.

Why didn't my current physician share my medical records electronically with my new physician? Are the EHRs not built to share data? Is there some hidden reason for not sharing data between eligible entities? Are the standards to blame for not accommodating this feature?

I think these barriers will surely be overcome in the near future. Multiple initiatives and approaches are being taken to resolve the medical records interoperability issue. A few of them are:

  • HIPAA compliance makes it mandatory to make healthcare data portable.
  • At the same time, hospitals are starting to see the benefits of being interoperable.
  • Added to these benefits, hospitals that are not interoperable will lose out on monetary benefits in terms of reimbursement.
  • Insurance companies may deny claims if the same tests are prescribed by multiple physicians.
  • Acceptance of tests conducted by other hospitals within ACOs.

Healthcare Revolution

I am confident that the Medicare and Medicaid EHR Incentive Programs for the meaningful use of certified EHR technology will take the exchange of medical records to the next level.

 

Creating a Responsive UI in Cross-Platform Apps

This blog elaborates on the means of creating a responsive UI during cross-platform app development. As we know, cross-platform or multi-platform applications are developed to reuse code on different platforms.

Adaptive web design involves using CSS frameworks for a responsive UI design. As the platform varies, it introduces different screen sizes on which your HTML runs. Mobile devices come in many sizes, and one cannot predict the full range of device screen sizes and code the HTML accordingly. In such scenarios, where you do not know the screen size of the device the app will run on, responsive design comes into the picture. The term responsive design itself indicates that the HTML will have the same look and feel on all devices, bringing a consistent user experience to all mobile OS platforms.

When developing cross-platform applications, take care of the points below while creating the HTML UI for the application, and follow these guidelines while creating CSS classes.

Do Not Give Hardcoded Sizes to HTML Elements

Never give a hardcoded height and width to any HTML element. Consider a scenario in which you develop an application for a device size of 360×480 pixels. You define CSS with a width of 360px and deploy it on that device. However, when you deploy the same application on a device with a different screen size, your HTML element won't fit the screen and your UX goes for a toss.

Instead, create a CSS class that gives the size of an HTML element as a percentage, which is resolved to the required size of the element on screen. For example, if I create an HTML element that needs to fit the complete width of the device, I will set the width to 100%, which is calculated at runtime as 100% of the available screen space and brings the same user experience to all devices.
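A minimal sketch of this idea (the class names are illustrative, not from any framework):

/* Fills whatever width the device provides, instead of a fixed 360px */
.full-width {
    width: 100%;
}

/* Two panels sharing the available width proportionally */
.panel-main {
    float: left;
    width: 70%;
}
.panel-aside {
    float: left;
    width: 30%;
}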

Have Control Over Your Margins

Always try to set margins in percentages instead of pixels. Pixel margins often break the responsive width and height of HTML components and push the element out of screen bounds. For example, consider an HTML element whose width equals the screen size and which has a margin of five pixels. The margin will push the element out of bounds and add an unnecessary scroll to the UI.

Instead, divide the element's space in terms of 100%: set the size and margins so that they sum to 100%, ensuring the element fits on screen.
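A sketch of the same idea (the values and class name are illustrative):

/* Width plus horizontal margins add up to 100% of the screen width,
   so the element never spills out of bounds */
.boxed {
    width: 90%;
    margin-left: 5%;
    margin-right: 5%;
}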

Make Images Flexible

When images are added to an HTML UI, try to make them adapt to the screen size. A hard-coded image size can change the feel on different screens. To keep the look and feel consistent, give the image a CSS class that specifies its height and width as percentages. This makes the image fit comfortably on screen and keeps it flexible across screen size and orientation changes.
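One widely used form of this technique is the following sketch (the class name is illustrative):

/* The image never exceeds its container and keeps its aspect ratio,
   so it adapts to screen size and orientation changes */
.flexible-img {
    max-width: 100%;
    height: auto;
}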

Set Font Size According to Screen Size

The apparent size of a pixel-based font varies when the pixel density of the mobile device changes. A font that looks perfect on one screen can look very small on screens with a high pixel density. In such scenarios, set the font size in ems rather than in fixed pixels, so it scales with the device's base font size. For example, the CSS class for a label could look like:

.label {
    font-size: 0.9em;   /* relative to the base font size, so it scales across devices */
}

Use of Media Queries

Use media queries when you want different behavior for different screen sizes. Consider an application that runs on both tablets and smartphones and needs a different UI for each. In such cases, media queries offer a great deal of control over the UI. Media queries are like if/else conditions that tell the browser how to render the UI at different screen sizes.

For example:

@media screen and (max-width: 700px) {
    #header {
        height: 90px;
        font-size: 15px;
    }
}

@media screen and (max-width: 400px) {
    #header {
        height: 50px;
        font-size: 10px;
    }
}

Performance of Cross-Platform Applications

In this blog, we will introduce you to measures for improving the performance of cross-platform mobile applications. Before moving ahead, let me first define a cross-platform application: mobile cross-platform applications are applications developed using HTML5 and JavaScript that have the capability to run on multiple mobile operating systems.

In multi-platform application development, it is very important to ensure a responsive UI design: your application's design must adapt to each device, as with optimal adaptive web design and intelligent mobile-responsive design.

The architecture below explains how cross-platform mobile applications work and can help in understanding what drives their performance:

[Diagram: Architecture and performance of cross platform applications]

The architecture showcased above demonstrates what determines the performance of cross-platform applications. The application's response time increases with the amount of hardware interaction needed, and a greater script load means more time to get a response from the application. The user experience is therefore degraded if cross-platform apps are not architected and developed according to performance best practices.

The guidelines below can help improve the performance of cross-platform mobile applications and achieve a better user experience:

  1. Write minimal script code: Keep the script code for UI and data processing small; as the script load increases, it takes more time to execute and load the scripts. Always keep the JavaScript optimized for cross-platform applications.
  2. Load scripts at runtime: Do not give static references to script files using the <script> tag. As the number and size of the script files increase, the load time of the application page increases. Instead, load each script at runtime, once the basic HTML page and body have loaded in the browser: using JavaScript, create a script DOM element and set its source to the script file (see the sketch below this list). This reduces the launch time of the application and results in a better user experience.
  3. Optimize your HTML UI
  • Do not add any script loader tags in the HTML. When the HTML is loaded, it will also load the scripts specified in the HTML UI components, increasing the UI response time.
  • Never build a ladder of multiple nested <DIV> tags; use the minimum hierarchy of HTML DIVs.
  • Do not specify any inline CSS. Keep all your CSS in a separate CSS file.
  • Never load generated UI components from the server. Instead, write a script that creates the UI from server responses and appends it to your HTML page.
  • Always have a single HTML page (divided into subpages) in your application, instead of creating a separate HTML page for each application page.
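As referenced in point 2, here is a minimal sketch of runtime script loading (the file path and function names are illustrative):

// Create a <script> DOM element at runtime instead of referencing
// the file statically in the HTML.
function loadScript(src, onLoaded) {
    var script = document.createElement('script');
    script.src = src;          // set the resource of the script file
    script.onload = onLoaded;  // continue once the script is available
    document.body.appendChild(script);
}

// Load application scripts only after the basic HTML page has loaded.
window.onload = function () {
    loadScript('js/app.js', function () {
        // initialize the application module here
    });
};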

4. Have your CSS optimized

  • Create reusable CSS classes for UI components.
  • Use CSS gradients instead of images for backgrounds, as images require a large amount of application memory and take time to load (see the sketch after this list).
  • Never use fixed-size elements at the bottom of the page, as they take more time to move when the device orientation changes. When required, use absolutely positioned CSS for elements at the bottom.
  • Use as few fixed-position elements as possible, as these elements take more time to move on orientation changes and cause UI flicker.
  • Always load CSS files at runtime using a loader script, as this reduces the application's response time.
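For instance, a background gradient can replace a bitmap image (the colors and class name are illustrative):

/* Gradient drawn by the browser, so no image download or image memory is needed */
.header-background {
    background: -webkit-linear-gradient(#4a90d9, #2c5d8f); /* older WebKit browsers */
    background: linear-gradient(#4a90d9, #2c5d8f);
}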

5. Use customized UI components instead of jQuery Mobile: jQuery Mobile has a wide range of UI components, but it has its own drawbacks of flickering and low performance, and it cannot be customized easily. Instead of jQuery Mobile, create your own customized UI components with the help of CSS and HTML DIVs. This lets you create UI components that match the application's requirements and perform optimally.

6. Hardware interaction: For hardware interaction, write a plugin that talks to the device's API layer, using minimal script and optimized native code. Use PhoneGap 3.0 and include only the required plugins; this yields reusable plugins with better performance.