Are you ready for HIMSS19? We Are!

We’ve been SUPER focused prepping for what will be our fourth year in a row exhibiting at the HIMSS Global Conference & Exhibition. Nitor is excited to connect and collaborate with colleagues around the world.

I can’t wait to share how our partnerships and business have evolved and are taking on new, innovative and dynamic forms! In the last two years, our Healthcare business has grown by a staggering 200%. We have achieved this by bringing disruptive thinking to transformation at the forefront of healthcare. This time around, we promise to continue our innovation with Peer Product Management. Our idea is to help you build talent capabilities and put in place the right healthcare product operating model and infrastructure, tailored for your product context.

We are also proud to be the first company to introduce the concept of Research as a Service (RaaS) for Healthcare at HIMSS19. The primary value propositions of our RaaS for healthcare organizations are innovation, disruption, scalability, flexibility, cost-effectiveness and much more.

I am excited that Nitor’s experienced team will be at HIMSS, the industry-leading conference for professionals in the field of healthcare technology. The conference will bring together more than 40,000 health IT professionals, clinicians, administrators, and vendors to talk about the latest innovations in health technology.

This year, HIMSS will aim to provide solutions to your biggest challenges – cybersecurity, connected health, precision medicine, disruptive technologies, population health and more – with exceptional education sessions. Additionally, the HIMSS Interoperability Showcase will uncover innovative solutions that enable seamless, secure, interoperable health information exchange and improve individual and population health. Interestingly, our two core offerings, Peer Product Management and Research as a Service, aim to provide answers to many of these challenges.

This year’s conference will be held at the Orange County Convention Center in Orlando, Florida from February 11 – 15, 2019. You can visit us at booth #7447, where we will be highlighting how data can drive healthcare transformation, along with our Peer Product Management and Research as a Service capabilities.

Nitor’s Guide at HIMSS19

If you plan to be at HIMSS19, Nitor would like to connect with you. For our ISV customers and others who want to help usher in a new era of healthcare, we will be showcasing various activities at our booth #7447.

Here is a quick summary of some of Nitor’s activities at HIMSS, including a dedicated Peer Product Manager.

At the booth, we will display our offerings spread across the different stages of a data journey – Modernization & Digitization, Integration & Transformation – guiding you on the road to digital transformation.

We will highlight how we leverage platform-oriented strategic partnerships to deliver data-driven transformations, and how our solution strategy strives to deliver secure healthcare interoperability and digitalization. Nitor experts will be at the booth to answer questions about solutions, app development, interoperability and much more.

A Peer Product Manager only for you!

Talk to your Peer Product Managers – Priyank Chopra and Pushyamitra Daram – to find out how they can help you improve cost and efficiency around product development. The Peer Product Managers will help you create and scale your product management function to set and achieve ambitious product goals. Let them walk you through our Peer Product model, which can help you bridge the gap between the enterprise and the IT vendor.

To know more about our Peer Product model & managers, click here.

On-Demand Demos:

We will have on-demand demos showing how our accelerator frameworks can transform your data and unlock your organization’s full potential. We will be running demos every day on various topics, including:

  • Chatbot
  • MIPS Rule Engine
  • Patient Portal
  • Progress Health Cloud Demos

You can schedule and find more information about our Demos here.

Raffle Prize

For all the attendees who are ready to celebrate HIMSS in style and love to take selfies, there is a treat in store. Join us at booth #7447, ask for our Peer Product Manager, take a selfie with them, tag @NitorInfotech on Twitter/LinkedIn and stand a chance to win ‘The Pip’, an exciting stress-tracking gadget.

This year is going to be Nitor’s biggest year ever at HIMSS. Above all, I am excited about the future of healthcare and am committed to making positive contributions – today and tomorrow – that will benefit the world in which we live and future generations alike. Let’s connect at HIMSS and transform healthcare together.

You can find Nitor’s full HIMSS schedule here.

See you in Orlando!

Demystifying the usage of AI and ML with Azure Server-less

As I made my way into the city of Mumbai, I started realizing that it was actually happening – the big day I had been waiting for so anxiously was finally here.

It was an honour to deliver a session at the 2019 Azure AI tour in Mumbai. I was invited as a speaker and held a 45-minute session on “AI/ML with Azure Server-less”. The conference was held at the Microsoft office, Santacruz East, Mumbai. This was my first time attending a Microsoft Azure conference. It was great fun, and the conference was very well organized.

Here are a few stats from the conference: it had around 70 attendees from different states, and 7 speakers delivered 7 sessions.

The event kicked off on a bright note. Noelle LaCharite, the Developer Experience Lead for Applied AI at Microsoft, covered various aspects of AI with an emphasis on ease of learning, and provided a code base for developers. She also presented a handful of demos of Cognitive Services.

Gandhali Samant (Sr. S/W Engineering Manager – Financial Services Cloud Architect at Microsoft) presented business case studies where Microsoft Azure AI is being used widely and successfully. Along with an informative slide presentation, she also showed a few videos documenting Artificial Intelligence implementations.

I had the pleasure of delivering a 45-minute session, and it was wonderful interacting with a lot of Azure architects, .NET devs and data science experts.

Here is a quick overview of my session:

My session started with the Azure AI Computer Vision service, the Custom Vision service and Azure Functions. Furthermore, I demonstrated how these services can be consumed from Azure Functions via bindings and Azure SignalR. I also walked through code in .NET and Python.

In more detail, the session covered:

  • Serverless architecture: The What & Why

The ‘What’ part included:

  • Managed compute service (FaaS)
  • Leverage SaaS products

The ‘Why’ part included:

  • Reduced Ops, Focus on Business Logic, Reduced Time to Market

I further explained the solution implementation using the Azure Functions .NET and Python SDKs and Azure SignalR.

  • Azure Functions

Triggers: Timer, HTTP, Blob, Cosmos DB, Queue, Event Hub

  • Bindings

Blob and SignalR

One of the most pivotal things I demonstrated was the set of use cases for the Computer Vision and Custom Vision services. Custom Vision lets you customise state-of-the-art computer vision models for your unique use case: just upload a few labelled images and let the Custom Vision Service do the hard work. With just one click, you can export trained models to be run on a device or as Docker containers. The use cases included the following:

Use case 1:

Requirement:

When an image is uploaded (see the sketch after this list):

  • identify applicable tags
  • identify face, gender, age
  • suggest possible description
  • send real-time notification
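
Here is a minimal sketch of how such a function might look in Python, assuming a Blob-triggered Azure Function whose function.json declares the blob trigger and a SignalR output binding named signalRMessages; the endpoint region, key and message shape are placeholders, not the exact code from my session:

    # Blob-triggered Azure Function: analyze an uploaded image with the
    # Computer Vision API and push the result to clients via SignalR.
    import json
    import requests
    import azure.functions as func

    VISION_ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
    VISION_KEY = "<your-computer-vision-key>"  # placeholder

    def main(myblob: func.InputStream, signalRMessages: func.Out[str]) -> None:
        # Ask for tags, a description and face (age/gender) detection in one call.
        response = requests.post(
            VISION_ENDPOINT,
            params={"visualFeatures": "Description,Tags,Faces"},
            headers={
                "Ocp-Apim-Subscription-Key": VISION_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=myblob.read(),
        )
        analysis = response.json()
        captions = analysis.get("description", {}).get("captions", [])

        # Broadcast the analysis to connected clients as a real-time notification.
        signalRMessages.set(json.dumps({
            "target": "imageAnalyzed",
            "arguments": [{
                "name": myblob.name,
                "tags": [t["name"] for t in analysis.get("tags", [])],
                "caption": captions[0]["text"] if captions else None,
                "faces": analysis.get("faces", []),
            }],
        }))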

Use case 2:

Requirement:

Predict an anomaly (see the sketch after this list):

  • trained using scikit-learn
  • saved using pickle on Azure blob
  • solution to be available as service
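
A minimal sketch of exposing such a model as a service, assuming an HTTP-triggered Azure Function whose function.json declares a blob input binding (modelblob) pointing at the pickled scikit-learn model; the names and JSON shape are illustrative:

    # HTTP-triggered Azure Function: load a pickled scikit-learn model from
    # Azure Blob storage (via an input binding) and return anomaly predictions.
    import json
    import pickle
    import azure.functions as func

    def main(req: func.HttpRequest, modelblob: func.InputStream) -> func.HttpResponse:
        # Deserialize the model trained with scikit-learn and saved with pickle.
        model = pickle.loads(modelblob.read())

        # Expect a JSON body like {"features": [[...], [...]]}.
        features = req.get_json().get("features", [])

        # For an estimator such as IsolationForest, predict() returns
        # -1 for anomalies and 1 for normal samples.
        predictions = model.predict(features).tolist()

        return func.HttpResponse(
            json.dumps({"anomaly": predictions}),
            mimetype="application/json",
        )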

My overall objective was to let people know that serverless computing is a relatively new paradigm in server technology which helps organizations convert large functionalities into smaller, discrete, on-demand functions that can be invoked and executed through automated triggers and scheduled jobs. Additionally, I enjoyed the overall event, as it offered valuable and informative sessions on AI, Cognitive Services, IoT, serverless and cloud concepts.

Overall, the enthusiasm among the attendees was commendable, with genuine excitement to learn about Artificial Intelligence. I thank Microsoft and Azure India for providing me with this opportunity. Let us learn and grow together!

Additionally, I would love to connect with you on topics related to serverless and Artificial Intelligence. Please feel free to reach me at akshay.deshmukh@nitorinfotech.com

You can find detailed information about my session by clicking on https://bit.ly/2shrlCz

About Akshay Deshmukh:

Senior Lead Engineer – Nitor Infotech

Blogger, MVP @C Sharp Corner

https://www.c-sharpcorner.com/members/coder30

Author @Dot Net Tricks

https://www.dotnettricks.com/mentors/akshay-deshmukh

LinkedIn – https://www.linkedin.com/in/akshaydeshmukhis

Love to use Azure ML, IoT, Microsoft Bot Framework, .Net Core, Angular

Love to code in C#, Python, Scala, TS, JS

Microsoft PowerApps – Build your Business Apps Faster & Smarter

One Platform- Unlimited Benefits

Traditional approaches to business seem to be collapsing, and companies are trying to develop innovative solutions. Furthermore, in today’s fast-paced environment you need tools that work faster, perform better, and can scale with your business.

Whether we are in a meeting or even on an airplane, work happens on our tablets, laptops and phones. Mobile technology, the cloud, skilled expertise and near-limitless computing have transformed the way we do business. Yet the apps we use to do business are slow to keep pace with business demand.

While organisations are turning more and more towards SaaS solutions for specific scenarios like HR, hospitality and travel, utilizing services like Microsoft Dynamics, Concur or Workday, most business app scenarios still remain bolted on-premises, dependent on corporate-connected PCs.

Too often, these apps are not easily integrated with other services like virtual meeting tools and HR applications, and are not accessible when and where people need them most – on the device they want to use in that moment. Business applications have always been a step behind consumer applications, primarily because of the richness and ubiquity that the latter provide.

Microsoft PowerApps has a compelling answer to these issues. PowerApps is an enterprise service for technology frontrunners, enabling them to connect everywhere and create and share business apps with their team on any device in minutes. Additionally, PowerApps enables anyone in the enterprise to unlock new business agility.

So what exactly is PowerApps?

Fundamentally speaking, Microsoft PowerApps is a Platform as a Service (PaaS). It enables you to create mobile apps that run on Windows, iOS, Android etc. – with almost any Internet browser. PowerApps is a platform for developing and using custom business apps that connect to your data and work across mobile and the web, without the time and expense of custom software development.

Not just a platform, PowerApps is a standalone mobile app as well! Traditionally, mobile app development meant creating a separate app for each operating system. This was a headache, as it could triple an organization’s development time and, eventually, the cost. Furthermore, organizations would require more resources to create business apps.

Everything created in PowerApps functions through and within the PowerApps mobile app. This closes the gap between operating systems and allows you to run your apps anywhere. In simple terms, it is a bridge that gives mobile apps an easier pathway to function across mobile platforms.

PowerApps also has a web version. It is the same concept, but runs through any modern web browser instead of a mobile app.

This highly productive platform has made its mark in the market, helping organizations deliver business smarter and faster. Let us look at a few benefits that create a great user experience and help businesses.

One Platform, Unlimited Benefits:

Mobile-First

PowerApps marks a departure from Microsoft’s earlier device-centric strategy: apps built with PowerApps are designed to be used on mobile devices first. It is irrelevant whether you use an Apple device, a Windows phone, an Android phone, or a tablet – you can still utilize an app designed with PowerApps.

Cost-effective

For organizations that outsource their app development, this is extremely important. PowerApps enables you to build in-house – a move that will save your organization from taking a beating financially. Additionally, this allows your present employees to focus on ensuring that line-of-business users have a unified app experience.

Makes Data easy to Manage

Many organizations have various solutions supporting their business, with data stored in different locations. This comes with its own risks in terms of management, and getting all that data working in agreement all the time can prove tough.

With PowerApps, you have the magic of its connectors. There are over 230 of them, and the list is growing every day. Salesforce, Dropbox, Smartsheet and Oracle are just a few, and you can use all of these seamlessly without having to write any code.

Incorporating Multiple Platforms

Integrating different platforms and applications has always been a challenging task. Several ventures have slowed down due to the inability to build interfaces between platforms, or the high expense of doing so. With PowerApps and its connectors, organizations can integrate with multiple platforms: Office 365, Salesforce, Mailchimp and many more can be used effectively and integrated with ease.

Having read all the pros of Microsoft PowerApps, it may seem infallible. However, it also has a few cons.

The ‘NO’s’ of PowerApps

PowerApps are essentially business mobile apps – which means internal use. You are not going to make a PowerApp that you can share with everyone. These apps are not intended for consumer consumption, mostly due to the technical limitations around sharing with external users and the licensing model.

Additionally, the majority of the functionality in PowerApps is “no-code”. So your in-house developers are restricted: they cannot include custom HTML or JavaScript, or add a hackable element to it.

Conclusion

It is crystal clear that PowerApps helps us create apps with ease, which leads to less development time and effort, helping organizations automate their processes. Organizations can connect it to different cloud services like Office 365, Dynamics CRM, Salesforce, Dropbox etc. PowerApps accelerates how business apps are built, which results in time efficiency.

Nitor is an early adopter of Microsoft PowerApps. Our development teams are working to utilize PowerApps to develop a range of solutions for businesses. We at Nitor can help your business hop onto the new platform quickly. Our experts can assess and identify the need gaps and recommend the best pathway.

For more information, please contact marketing@nitorinfotech.com

Progress Kinvey – Build Better & Faster Applications for Tomorrow

The current scenario

A mobile presence is indispensable to staying in the game in the long run – a fact organizations have now learnt, and most have built a mobile presence in some form. Whether that presence is a mobile-enabled website or a mobile application appears to depend on various factors, like spend strategy, skill sets, prioritization, and understanding of client needs.

Some of the key activities driving the mobile economy have been extending or replacing client service with self-service, increasing field worker efficiency, going paperless, resolving issues faster at a lower cost, and building better client engagement and trust. Many organizations’ first attempts at a mobile presence have fallen short of both business and client expectations, and have been unsuccessful in providing strategic business value or helping attain digital business goals.

A number of organizations lack developer bandwidth, which is clearly required for fixes, enhancements and keeping up with the latest upgrades. Additionally, organizations find it difficult to build a feature-rich app experience with the tools, teams and infrastructure on hand.

Each of these organizations actively sought a better way to achieve their digital business strategy via their mobile apps. They evaluated several approaches and chose Kinvey’s Backend as a Service.

Kinvey – The Future is Bright

Kinvey is a pioneer in mobile Backend as a Service (mBaaS), inventing the category more than six years ago. It uses unified Application Programming Interfaces (APIs) and Software Development Kits (SDKs) to connect mobile, web, and IoT apps to backend resources in the Cloud. Kinvey mBaaS can also be used to federate enterprise and Cloud services and provides common app features such as data mashups, push notifications, business logic, identity management, social networking, integration and location services.
Its sole aim is to reduce the time to market of new mobile application development by around 50%. Kinvey enables developers by completely decoupling and abstracting the server-side infrastructure. Frontend developers get a single protocol, data format, and query language to access any enterprise or cloud system.

Benefits for you

Following are some of the benefits Kinvey can offer:

1. Server-less Architecture
Enables deployment on a serverless platform – a developer favourite. It also has cloud portability – an architect’s first choice.

2. Run in the Cloud
Allows you to build and run applications without having to manage the underlying cloud infrastructure

3. Secure, Data-Rich Apps
Enables secure, data-rich apps through no-code and low-code enterprise system integration

4. NoSQL Storage
Kinvey uses NoSQL (MongoDB) and allows users to store all types of data, as collections (tables) or blobs (files)
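
For a flavour of how simply apps talk to this backend, here is a hypothetical sketch of storing and querying a Kinvey collection over its REST data API using Python's requests library; the app key, master secret and the 'patients' collection are placeholders for your own app's values:

    import requests

    APP_KEY = "<app-key>"            # placeholder credentials
    MASTER_SECRET = "<master-secret>"
    BASE = "https://baas.kinvey.com/appdata/" + APP_KEY + "/patients"
    AUTH = (APP_KEY, MASTER_SECRET)  # basic auth; user credentials also work

    # Store a record: Kinvey persists the JSON document in its NoSQL backend.
    created = requests.post(BASE, json={"name": "Jane", "risk": "low"}, auth=AUTH)
    print(created.json()["_id"])

    # Query the collection with a MongoDB-style query string.
    results = requests.get(BASE, params={"query": '{"risk":"low"}'}, auth=AUTH)
    print(results.json())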

Benefits for your developers

• Deliver features and capabilities needed to achieve your business goals faster
• Provide whatever you can imagine without technology or resource constraints
• Ensure that you meet your time to market goals
• Reduce time from ideation to delivery and more enhancements per release
• Create flexibility by allowing the use of any development resource
• Guarantee zero delay in getting your project started and access data from any application or data source from within mobile apps

When software development teams leverage the abilities of the Kinvey platform, the fundamental roadblocks to development agility are cleared, and you gain the benefits of agile development processes, including the ability to respond to user feedback rapidly and efficiently. With Kinvey, organizations can significantly cut their development release cycles.

Business Value

The business value of Kinvey can be distilled down to some of these factors:

Kinvey provides a fully managed service with pre-built frontend and backend mobile application development accelerators and built-in operational intelligence for rapid troubleshooting of user issues. There is no need for customers to develop their own mobile app delivery foundation, since Kinvey provides all of the services, enabling customers to focus on what is important, viz. value-added features and rapid response to user issues.

By abstracting future backend system changes behind the Kinvey platform, development teams no longer need to know the nuances of enterprise systems’ data-access paradigms, allowing them to focus 100% on frontend work. Backend engineers provide controlled access to enterprise systems via a reusable service catalog that needs to be set up just once.

And finally, how is Nitor leveraging the Kinvey platform?

With over 30,000 applications and 85,000 developers in their community, Kinvey is the leading mobile application Backend as a Service (mBaaS) for the digital enterprise.

We at Nitor started with Kinvey by primarily migrating the backend for some of our mobile applications, and we were amazed at the ease with which we were able to implement it. By leveraging the Kinvey platform, Nitor’s experienced team helps enterprises create feature-rich applications with almost 40% to 50% less time to market.

Performance Engineering – Ensure Reliable, Scalable and Seamless Application performance

Being a developer involves a lot more than just coding. As highly distributed applications become more complex, developers need to guarantee that the end product is user-friendly, secure, and as scalable as possible. With the right approach, software teams can identify potential performance issues in their applications earlier in the development cycle and make steady, high-quality fixes.

Everything from systems administration and frameworks to running cloud infrastructure and gathering and analysing UX data requires your software teams to incorporate solid testing methods throughout your application’s development stage.

Effective performance engineering is the way forward. Performance engineering does not refer to just one particular role. For the most part, it refers to the set of skills and practices that are systematically understood and adopted across organizations to achieve a higher level of performance in technology, in the business, and for end users.

Why is Performance Engineering important?

Performance engineering entails the practices and skills needed to build quality and high performance throughout an organization, spanning functional requirements, security, usability, technology platforms, devices, third-party services, the cloud, and more. The goal is to deliver better business value for the organization by discovering potential issues early in the development cycle.

Performance engineering is a vital part of software delivery, yet many IT organisations find it expensive and challenging to do. Despite the big performance failures that keep making headlines, performance engineering has been unsuccessful in getting the attention and budget it deserves in many companies.

How to make the most of performance engineering?

Here are things to keep in mind when incorporating the performance engineering process into your model.

1. Build a Strategy

Building a Performance Engineering approach is a vital part of the process and you need to be sure about how to align it into your organisation and delivery model.
– Identify the SMEs and the touchpoints that you will require in your development lifecycle.
– Understand what the quality gates are and how they will be governed.

Always remember that it all starts with the requirements. If your product owner knows what level of performance they want from the system, then it becomes easier for engineers to meet that requirement.

2. Plan the Costing

One thing is for sure: it takes a good sum of money to build a high-end performance engineering practice. As you build your execution roadmap, you may need to go through various budget cycles in order to get all the infrastructure and tools ready.
– Remain firm and positive.
– Use the failures organizations have faced in the past to persuade stakeholders of the significance of performance engineering.

3. Classify Crucial Business Workflows

If you do not have information about the right tools, get in touch with a vendor early, as a wrong choice can turn out to be costly and time-consuming.

Always remember, it is better to spend time on the workflows that are critical to the business and that have the maximum throughput.

4. Find the Baseline and Test Regularly

The next stage is to benchmark the performance pattern with a set of performance tests. These tests can be reused on numerous occasions.

– Build up a history of your production runs, tracked as trends, to check for patterns in system performance. Ideally, this should be done in every release and every integration. If the trend analysis can be automated as part of a CI/CD process, nothing like it.
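
As a minimal sketch of such automation – assuming each performance run writes its metrics to results.json, a baseline.json is kept from earlier releases, and an illustrative 10% regression threshold – a CI step could be as simple as:

    import json
    import sys

    THRESHOLD = 1.10  # fail the build on a >10% regression

    with open("baseline.json") as f:
        baseline = json.load(f)  # e.g. {"checkout_p95_ms": 420, ...}
    with open("results.json") as f:
        current = json.load(f)

    # Compare every tracked metric against its baseline value.
    failures = []
    for metric, base_value in baseline.items():
        value = current.get(metric)
        if value is not None and value > base_value * THRESHOLD:
            failures.append(f"{metric}: {value} vs baseline {base_value}")

    if failures:
        print("Performance regression detected:\n" + "\n".join(failures))
        sys.exit(1)  # a non-zero exit fails the CI/CD stage
    print("All metrics within threshold of baseline.")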

5. Use the best Tools and Hardware

You will require the best possible APM, diagnostic and testing tools for performance engineering. It’s imperative that you distinguish the things you’ll require from those you won’t, to properly run tests and analyse bottlenecks.

Production-like environments are usually costly. Preferably, you’ll have one for your performance testing in any case. If you are testing frequently with each deployment, the trends will in any case point to bottlenecks that the engineers need to be vigilant about.

6. Have Data Strategy in place

As you will test frequently, you should be able to create test data rapidly and effectively. It is imperative that the data closely resembles the production environment. Remember, if you are not using a representative data set, query plans will be different.


What are the Business Benefits?

As you can clearly see, the above steps are vital when it comes to incorporating a performance engineering process into your business model. These steps ensure that your organization benefits out of it.

Listed below are some of the benefits of performance engineering from an organization’s perspective:
1. Decreased burden: Reduced vulnerability of applications when the anticipated load is high

2. Optimal utilisation of resources: The infrastructure may be over-provisioned or under-provisioned; performance engineering reveals the utilisation trends and helps in making strategic decisions.

3. Guaranteed support: Ensured level of commitment for an application to perform in the given supported criteria

4. Future ready: Helps in taking future decisions for scaling the applications

5. Increased adaptability: Helps in validating the application design, especially when you want to make incremental changes to the application

What can we conclude?

It is quite clear that performance engineering helps in benchmarking the application performance and allows organizations to identify all business-critical scenarios for performance testing. Additionally, it helps to determine the extent of availability and reliability of the application, while instilling mechanisms to constantly advance application performance.
In short, performance engineering should be a priority before releasing any software or application. It should be executed early in the development phase to catch more bugs in advance and increase user satisfaction, while saving you time and money down the line.

Nitor is proficient at providing an excellent user experience through reliable application performance, employing various frameworks and tools to test, monitor and streamline performance and optimise infrastructure cost.

To know more, please drop us an email at marketing@nitorinfotech.com

BDD – Be Agile, Create Value & Build Highly Visible Test Automation

Everybody likes to do things in their own specific way. However, when it comes to software programming, it is always beneficial to have a set of principles for each phase of software development.

Opening up the discussion and keeping the various technical teams on the same page allows software to work seamlessly. As organizations move towards the coding phase, they need to adjust their procedures to fit their existing workflows. So what is it that can define user behaviour prior to writing test automation scripts?

That is called BDD (Behaviour Driven Development).

What is BDD?

BDD is a development process which describes the behaviour of an application from the end user’s perspective. It is an extension of TDD (Test Driven Development). In BDD, the behaviour of the user is defined and converted into automated scripts that run against functional code. These test scripts are written in a business-readable, domain-specific language known as Gherkin (see the sketch after the points below), which ultimately reduces the risk in developing the code. The following points clearly outline the value of BDD.

1. BDD is not testing; it is a process of developing software. It considers questions like where to start in the testing process; what to test and what not to; how much to test in one instance; what to name the tests; and how to understand when and why a test fails. It is what can be called a rethinking of unit testing and acceptance testing.

2. Before BDD, TDD meant tests were developed first and failed until functional code was arrived at, which was when a test was considered to have passed. BDD enhanced this by having the tests written in a specific, business-readable format.

3. Since the language used in BDD is domain-specific, the requirements are more realistic and meaningful, and all stakeholders are on the same page, as opposed to the earlier ‘only developer and tester friendly’ ones.

4. BDD does not change or replace traditional UI automation tools like Selenium or Appium.

5. In terms of test automation, it represents a presentation layer; in other words, it can present data in a clear-cut manner and in a standardized format.
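
As a minimal sketch of what this looks like in practice with behave, Python's BDD tool (listed in the tools table later in this post) – the feature text, step wording and the PatientDirectory helper are all illustrative:

    # The business-readable Gherkin scenario lives in a .feature file:
    #
    #   Feature: Patient search
    #     Scenario: Find a patient by last name
    #       Given the patient directory contains "Smith, John"
    #       When I search for last name "Smith"
    #       Then the results include "Smith, John"
    #
    # The step definitions below map each business-readable step to code.
    from behave import given, when, then

    class PatientDirectory:
        """Stand-in for the system under test."""
        def __init__(self):
            self.patients = []

        def add(self, name):
            self.patients.append(name)

        def search(self, last_name):
            return [p for p in self.patients if p.startswith(last_name)]

    @given('the patient directory contains "{name}"')
    def step_add_patient(context, name):
        context.directory = PatientDirectory()
        context.directory.add(name)

    @when('I search for last name "{last_name}"')
    def step_search(context, last_name):
        context.results = context.directory.search(last_name)

    @then('the results include "{name}"')
    def step_verify(context, name):
        assert name in context.results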

As you can clearly see, BDD has nothing to do with the technical side of testing. Let us try to understand why and how BDD is important.

BDD helps bridge the communication gap between clients, developers and other stakeholders.

Collaboration – In traditional testing, nobody would recognize which part of a test/scenario was failing. With the BDD approach, everyone – stakeholders, the product team and developers – understands the testing, making it a win-win situation for organizations.

Requirement Change Management – Traditionally, requirement clarifications were logged in collaboration tools like Jira or other project management tools. With BDD, any changes in requirements are automatically documented as tests.

Test Management Tools – In the traditional method, test management was separate, and automated tests were manually marked within the test repository. With the advent of BDD tools, static metrics such as the number of specs and the number of scenarios are collected automatically. Furthermore, other test metrics can easily be added.

Single Source of Truth – Traditionally, requirements would be transferred from project management to test management, and finally to automation. With BDD, in a mature agile process, specs written correctly in Jira can serve as the single source of truth, in contrast to testers reading requirements separately.

Phases of BDD

The overall BDD process involves two important phases – process insights and tools/technologies. Let us look in detail at how vital each is to the BDD process.

a.  Process Insights

To benefit from BDD-based test automation, it is imperative to have a process spanning planning, BDD design and the test automation framework.

Planning – Priority-based stories/features should be picked up for automation. Undertaking an iterative discussion helps to know what activities would be beneficial to ongoing automation efforts. For best results, effort estimation could be followed by a stabilization phase for the test automation activity, instead of the usual factory approach, if the product is still evolving.

BDD Design – It is recommended that scenarios be designed by QA/BA rather than quality engineers. This is intuitively due to the fact that they are the owners of product quality. In addition, the principle of collaboration mandates that they own this part of the automation effort.

Also, the scenarios should be reviewed for functional flow, behaviour semantics and step reusability by all concerned stakeholders – QA, BA and engineers. The review process should be a de facto part of the design process.

Test Automation Framework – BDD design ensures that reusability is complemented by the implementation component. Standard automation and development practices must be followed to ensure efficient output.

b. Technologies/ Tools

Some automation tools that support BDD are listed below:

  • Java – JBehave, Cucumber, Gauge
  • C# – SpecFlow
  • Python – Behave
  • Ruby – Cucumber
  • Javascript – GaugeJS
  • PHP – Behat

Apart from automation tools, test management based on BDD test designs plays an important role. Tools like TestRail and HipTest now support BDD-based test editor functionality and guarantee better integration of process and implementation.

Business Benefits

Once the process insights and tools/technologies are in sync, BDD automatically offers benefits:

  • Know What You Test (KWYT) – Since testing is not performed in isolation, continuous tracking and reviewing of what is being tested becomes possible. Coverage gaps cannot be missed, and product owners can now chip in proactively if something is being missed.
  • High Visibility – Due to collaboration, the tests, their quality and their results are visible to all management stakeholders, which gives confidence in taking decisions for product releases.

Conclusion

Behaviour Driven Development helps in building quality and creating value. Instead of tests that are only useful to engineers, BDD aims at tests useful for all. Additionally, it improves the partnership between the parties: developers get a clearer scope of essential features, and the customer gets better knowledge of what will be delivered, with accurate estimates.

Nitor excels at streamlining and operationalizing BDD Based Test Automation through its ready-to-use frameworks, successfully employed strategies and efficient use of tools/technologies.

If you are interested in finding out more about BDD, write to us at marketing@nitorinfotech.com

Boost your business foundation with Microsoft Dynamics xRM

Regardless of what industry your company works in, clients are your most vital resource, and handling those client relationships is the foundation for growing your business. Plenty of organizations look to a CRM to manage sales, customer service and marketing. CRM (Customer Relationship Management) software can help gather, sort and deal with the majority of your client information, integrated across the business from finance to operations.

One such CRM, Microsoft Dynamics, is one of the most popular tools in the market. Not only does it meet the needs and budgets of small, mid-sized and large organizations, but it also makes marketing more effective and helps you get more out of your customer relationships. Furthermore, Microsoft Dynamics CRM offers the flexibility of both on-demand and on-premise deployments. The powerful CRM program also offers unparalleled integration with the Microsoft Office suite, Microsoft SQL Server, Microsoft Exchange Server, and Microsoft SharePoint – some of the most widespread applications in the business world.

Do you need Software that is a Step ahead?

A term often associated with CRM – with a twist – is ‘xRM’, or ‘eXtended Relationship Management’. xRM is an extension of CRM for organizations that deal with policies, property taxes, building assets – and the list goes on. With xRM you can manage the relationship of anything within your company: it represents the extension of CRM platforms that allows organizations to thrive by helping them manage employees, processes, suppliers, assets and much more.

xRM has several key components, which together give a strategic approach to building a unified system that connects all aspects of a business. The xRM components are:

1. Entities & Records

2. Fields

3. Forms

4. Web Resources

5. Workflow Processes

6. Plugins

7. Web Services

As you can clearly see, the above components are essential to manage xRM. However, the question remains: is it useful to deploy a solution like xRM? Will organizations reap any benefit from it? Or is it just a fad? To answer honestly, xRM is a natural next step if you already have CRM within your organization. It has several crucial advantages, which can be vital for developers as well as for the organization.

What is in it for Developers/Organizations?

These days there is little time to write a lot of custom code to deliver solutions. With xRM, developers can develop applications rapidly. To meet requirements for business applications, xRM provides a framework with the agility and flexibility to adapt to changes and win user acceptance and adoption.

From an organization’s point of view, when you take Dynamics 365 and utilize it as a platform for building an xRM system, you get a rock-solid foundation on which to build ‘line-of-business’ (LOB) solutions. Everything can be tailored to your company’s needs and incorporated smoothly with other critical systems.
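
As a hypothetical sketch of the kind of LOB extension this enables, the Dynamics 365 Web API (OData) can be queried from any language; here it is in Python with requests, where the org URL, custom entity set name and Azure AD bearer token are placeholders:

    import requests

    ORG_URL = "https://yourorg.api.crm.dynamics.com/api/data/v9.0"
    TOKEN = "<azure-ad-access-token>"  # placeholder, obtained via Azure AD
    HEADERS = {
        "Authorization": "Bearer " + TOKEN,
        "Accept": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    }

    # Read the top 5 records of an illustrative custom 'building asset'
    # entity, selecting only the fields the LOB app needs.
    response = requests.get(
        ORG_URL + "/new_buildingassets",
        params={"$select": "new_name,new_assetvalue", "$top": "5"},
        headers=HEADERS,
    )
    for record in response.json().get("value", []):
        print(record)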

xRM solutions offer flexibility and customization to meet almost any business or organizational need. Integrating an xRM solution with the Microsoft Dynamics CRM will provide you with several important advantages.

Automation at its best – Microsoft Dynamics integration with xRM automates important tasks that employees would otherwise have to complete manually.

Rapid deployment – Developers do not have to worry about building LOB software from scratch, as software plug-ins extend the functionality of the core Microsoft Dynamics CRM system.

Robust Security – Another key advantage is that xRM provides robust security features. It has security roles for users and objects that restrict access to sensitive data, SSL connections for data transfer, and more.

Native Integration – xRM solutions can connect existing systems to CRM, freeing data trapped in outdated systems. Microsoft Dynamics CRM also provides native integration with Microsoft SharePoint® and Microsoft Office® applications including Outlook®, Excel®, and Word.

We at Nitor take pride in our xRM solution capabilities. We specialize in xRM plug-in development, OOTB customizations and creating custom workflows to benefit your organizational requirements.

To find out how xRM can eliminate silos and build a unified marketing & sales funnel, write to us at marketing@nitorinfotech.com.

Dynamic Data Masking: It’s time to secure and transform your data

What is Dynamic Data Masking?

According to Microsoft, dynamic data masking helps prevent unauthorized access to sensitive data by enabling customers to designate how much of the sensitive data to reveal, with minimal impact on the application layer. DDM can be configured on the database to hide sensitive data in the result sets of queries over designated database fields, while the data in the database itself is not changed. It does not encrypt the data, and a knowledgeable SQL user can defeat it.

It does, however, provide a simple way to control, from the database, what information the different users of a database application can and cannot see, making it a valuable tool for the developer. That said, dynamic data masking needs a proper implementation. Let us look at how it is done:

  • To implement DDM, you define masking rules on the columns that contain the data you want to protect.
  • For each column, you add the MASKED WITH clause to the column definition, using the following syntax:

    MASKED WITH (FUNCTION = '<function>(<arguments>)')

  • DDM limits sensitive data exposure by masking it for non-privileged users. It can be used to greatly simplify the design and coding of security in your application.
  • DDM keeps the masking policy in the server, so sensitive data is protected with minimal impact on the application layer and no changes to application code.
  • DDM can be configured on the database to hide sensitive data in the result sets of queries over designated database fields, while the data in the database is not changed.
  • Dynamic data masking is easy to use with existing applications, since masking rules are applied in the query results.

To summarize, a centralized data masking policy acts directly on sensitive fields in the database, and you designate the roles or users that should not have access to the sensitive data. DDM features full and partial masking functions, as well as a random mask for numeric data.
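
As a minimal sketch of applying these rules – assuming a SQL Server database reachable through pyodbc, and with hypothetical table, column and role names – the built-in masking functions can be added to existing columns like this:

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
        "DATABASE=CustomerDb;UID=admin;PWD=secret"  # placeholder connection details
    )
    cursor = conn.cursor()

    # email() keeps the first letter and the domain suffix.
    cursor.execute("""
        ALTER TABLE dbo.Customers
        ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
    """)

    # partial() exposes a chosen prefix/suffix and pads the middle.
    cursor.execute("""
        ALTER TABLE dbo.Customers
        ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');
    """)

    # random() masks numeric data with a value from the given range.
    cursor.execute("""
        ALTER TABLE dbo.Customers
        ALTER COLUMN CreditScore ADD MASKED WITH (FUNCTION = 'random(300, 850)');
    """)

    # Non-privileged users now see masked values; grant UNMASK only to trusted roles.
    cursor.execute("GRANT UNMASK TO ReportingRole;")
    conn.commit()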

What makes Dynamic Data Masking Special?

As you can clearly see, the data masking practice is vital and can help organizations address data breaches. Here are some additional dynamic data masking benefits that organizations should look at:

  • Regulatory Compliance – A strong demand for applications to meet privacy standards recommended by regulating authorities.
  • Sensitive Data Protection – Protects against unauthorized access to sensitive data in the application, and against exposure to developers or DBAs who need access to the production database.
  • Agility and Transparency – Data is masked on the fly, with underlying data in the database remaining intact. Transparent to the application and applied according to user privilege.

As you can see above, Dynamic Data Masking has a number of benefits for organizations. Similarly, DDM can be an asset when it comes to developers. Let’s have a look at how developers actually benefit from DDM:

  • In DDM, simple and understandable rules are defined to operate on the data. The collection of these rules performs a series of known, tested and repeatable actions at the push of a button.
  • Data Masker handles even the most intricate data structures. It can preserve data relationships between rows in tables, between rows in the same table, or even internally between columns in the same row.
  • Data synchronization issues of this type can be automatically handled by the addition of simple, easily configured masking rules.
  • DDM works easily with tables containing hundreds of millions of rows.

Conclusion:

Information security is a never-ending issue; it will always be something we have to stay on top of. Dynamic data masking at least gives us a comfort zone where we can avoid giving the data away outright. Additionally, it minimizes the risk of accidental data leakage through dynamic obfuscation of sensitive data in database responses.

Nitor’s dynamic data masking services enable customers to focus on sensitive data elements in the desired databases. Our key objective is to provide customers with a working data masking solution while helping them build knowledge and confidence. We also believe that Dynamic Data Masking is complementary to other security features in SQL Database (e.g., auditing, encryption, row-level security) and should be used as part of a comprehensive access control and data protection strategy.

To learn which implementation option best meets your organization’s data masking needs, please contact marketing@nitorinfotech.com

GitHub Acquisition: Reconciling GitHub with Microsoft

Microsoft’s most recent business move has shrouded the developer community in a state of wariness. GitHub, a popular open source code collaboration platform for developers and scientists (basically anyone working with data), was acquired by the tech behemoth for $7.5 billion. This figure represents an amount thirty times (!) GitHub’s annual recurring revenue.

Before the acquisition, GitHub suffered from multiple issues, including serious monetary and leadership problems. GitHub narrowed its options down to two: the first was to hire a new CEO to streamline the company’s business direction and thus gain invaluable funding opportunities; the second was to be acquired. GitHub chose the easier, faster path.

The optimal candidates to be acquired by were companies with access to large enterprise customers/subscriptions. This insight is derived from GitHub’s revenue model: GitHub is free for individuals but requires enterprise users to pay. Google, Amazon and Microsoft were among the companies enticing GitHub with offers of acquisition. In the end, GitHub decided to go with Microsoft because of the tech titan’s more generous value offering. GitHub was also fully aware of Microsoft’s increased appreciation of open source (especially with Satya Nadella as CEO) and of its desire to show this to the world.

In this situation, Microsoft had an excellent opportunity to advance their own interests. First was the opportunity to show the world their transition from a proprietary/monopoly-based business to an open source model. Next was the all-important target of grabbing networking opportunities. Microsoft acquired LinkedIn in 2016, which enabled access to a network of professionals. With the acquisition of GitHub, Microsoft now has access to a network of developers. With this access to the largest pool of developer mindshare, they can compete with the likes of Facebook. With GitHub being one of the largest code repositories, Microsoft can easily monitor new projects, interests, technologies, and market trends to stay ahead of the competition. Microsoft can also capitalize on an opportunity to woo developers more effectively by creating more offers and generating value, for example by creating attractive Microsoft-based tool chains in open source to gain traction towards Microsoft technology. Lastly, the acquisition may have been an exercise in building strategic value. The strategic value of GitHub – which pertains to how one company’s offerings help a different, usually larger, company to be successful – lies essentially in the 85 million repositories and 28 million developers it hosts worldwide. It is not difficult to imagine the value of access to these developers, who regularly use GitHub’s code repository products, especially when they can be brought into Microsoft’s immensely profitable developer environment.

Microsoft’s long history of generally running counter to open source software, however, has led to lukewarm reactions from the developer community. Many developers feel that despite Microsoft’s attempts to foster acceptance of an open source culture, Microsoft is not good for GitHub. This stems from GitHub’s initial premise of hosting distributed version control for remote coding – a flexible coding experience that could boost a developer’s community presence. Behind the developers’ tepid reactions are fears that Microsoft might leverage or co-opt their code for future products, or that developers will be muscled into using only Microsoft products. Additionally, there are some direct conflicts between Microsoft and GitHub: certain GitHub projects, for example, are Xbox simulators, and it is quite likely that Microsoft will kill these projects. There are even rumors that Microsoft may add tracking or advertisements to GitHub’s sites. There has thus been an upsurge in developers shifting their code to GitLab, one of GitHub’s prime competitors. In all fairness, however, this might be a reaction to a temporary fear.

So what is the future for GitHub? Looks like only time will tell.

How to Skyrocket Your Venture’s Funding with ICOs

ICOs (Initial Coin Offerings) have gained tremendous traction in today’s world of digital currency. Built upon the security, trust and transparency of the Blockchain paradigm, ICOs have helped companies raise 7 billion USD as of May 1, 2018 – a rise from 5 billion USD in 2017. These facts, coupled with the recent favorable economic climate, indicate that this is the optimal time to capitalize upon the rising tide of cryptocurrency.

Read on to discover how and why you should raise maximum funds with this innovative business model.

Why ICO?

ICOs merge the power of crowdfunding with the allure of cryptocurrency.

In an ICO, internet users view your value proposition and invest in your vision by buying tokens. Note that this happens before the actual token-based marketplace is released to the world. The next step is a full exchange in which the issued tokens can be traded for other currencies. This structure motivates the public to participate in the ICO and own as many tokens as possible to gain on future enrollment into cryptocurrency exchanges. Because Blockchain technology underlies ICOs, users can be assured of security, transparency, and trust.

Building an ICO Platform

Nitor’s ICO platform follows certain best practices to ensure that your ICO is a success. First, all necessary ICO information is presented on an intuitive website. This includes token information, ICO duration, the beneficiary wallet address, and interfaces to popular cryptocurrency wallets. After this critical step, it is advisable to ask white-listed users to register and share their information so you are sure that every payment is legitimate.

During the ICO, it is useful to display the token status. This is usually shown as Total Tokens Sold vs. Total Tokens Allocated (known as a hard cap). It is also a good idea to have a token calculator, which shows the relationship between one token and a cryptocurrency such as Ether, Bitcoin etc. You will also need to display a transaction history. This is a list or record of transactions showing wallet addresses, amount invested, transaction costs and transaction signatures. This also helps in maintaining accurate records.

If issues arise, architectural modularity helps quickly identify and fix problems so that token sale can progress. An AML (Anti-Money Laundering) feature ensures that if, based on data analysis, you see an issue with a particular transaction post-payment, you can reclaim the issued token. Finally, remember that it is important to market your ICO. A strong integrated email notification engine automatically feeds ICO highlights to subscribed users. This has the potential to be used as a powerful marketing tool.

Ethereum-based Engineering Guidelines

Some helpful guidelines for Ethereum-based engineering include:

  1. Choose Modular HTML5 frameworks for front-end development as you may be looking to integrate an existing website instead of developing from scratch.
  2. Leverage the Truffle framework. This is useful for the creation of Smart Contracts.
  3. Follow the best practices in writing Solidity files.
  4. Use a sandbox/test network such as Ropsten for integrated tests.
  5. Ensure 100% code coverage for all development.
  6. Modularize Smart Contracts for maintainability.
  7. Ensure that a third-party auditor, instead of the developers involved in writing code, conducts the security audit of Smart Contracts.
  8. Deploy your Smart Contracts to the Ethereum public network.

Nitor implements all these and more so that you can hold a profitable ICO.
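
For instance, guideline 4 can be exercised with a short web3.py script against the Ropsten test network; the RPC endpoint, contract address and ABI fragment below are placeholders for your own deployment:

    from web3 import Web3

    # Placeholder Ropsten RPC endpoint (e.g. an Infura project URL).
    w3 = Web3(Web3.HTTPProvider("https://ropsten.infura.io/v3/<project-id>"))

    TOKEN_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder
    ERC20_ABI = [  # minimal ABI fragment for the read-only calls below
        {"constant": True, "inputs": [], "name": "totalSupply",
         "outputs": [{"name": "", "type": "uint256"}], "type": "function"},
        {"constant": True, "inputs": [{"name": "owner", "type": "address"}],
         "name": "balanceOf",
         "outputs": [{"name": "", "type": "uint256"}], "type": "function"},
    ]

    token = w3.eth.contract(address=TOKEN_ADDRESS, abi=ERC20_ABI)

    # Verify the sale's token accounting against the hard cap before going live.
    print("Total supply:", token.functions.totalSupply().call())
    print("Beneficiary balance:", token.functions.balanceOf(TOKEN_ADDRESS).call())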

Key Considerations

Before running your ICO, decide on a minimum funding goal (known as a soft cap). Complete the requisite research beforehand to understand what this number should be. Next, remember to avoid issuing tokens before the sale ends. This is important as record keeping becomes easier. Issue tokens only after the minimum funding goal is achieved and the token sale officially ends. If the minimum funding goal is not achieved, however, it is best to refund the money and modify your approach. With the guidance of Nitor’s dedicated experts, you can avail the benefits of a secure sandbox ICO platform to pre-test token sales.

Nitor’s services can help you at every stage of the ICO process. Nitor can get to the heart of the complicated code of Smart Contracts, leaving you free to strategize and innovate. We also help you drive innovation with 70% of ICO contract features already in place. With our knowledgeable teams, you can set the stage with your ICO website within two short weeks.

ICOs are one of the most useful, secure, and transparent tools for fundraising today. Nitor can help you leverage the brilliant power of Blockchain technology with the application of the aforementioned tips. With our experts, you can craft a brilliant strategy to generate the funding that your revolutionary product deserves.

If you would like to benefit from our world-leading Blockchain arsenal to raise funds for your venture, reach out to us at marketing@nitorinfotech.com.