Monday, August 28, 2017

Five ‘Must-Haves’ in Self-Service BI Tools

 

The world of business analytics has seen some major shifts. The current demand for instant, usable insight has made the traditional, lengthy process of obtaining reports through business analysts or analysis-specific IT teams redundant. By the time data travels through these intermediaries and is converted into reports for the business users who actually make the decisions, it has often expired upon arrival.

The rise of self-service business intelligence (BI) has been remarkable. Several companies have established a strong foothold in the space, and Gartner has predicted that “self-service BI platforms will make up 80% of all enterprise reporting by 2020”. Self-service BI tools, as the term denotes, not only eliminate the need for a mediator to translate raw information into usable data but also help beat the time delay.

The major perk of a self-service BI tool is that, unlike many BI and data analytics tools on the market that require SQL developers or BI experts, a person with a reasonable understanding of the business can use the tool’s dashboard to shape the data as needed. Without any specialized training, management, marketing, business development, or any other controlling department within the business can access the business’s database and build the reports needed to answer crucial business questions.

Here are the top five ‘must-haves’ to look for when you consider a self-service BI tool:
  1.    One-Stop Shop – The tool must be able to correlate data on its own, not be a restricted interface that requires multiple individuals to manually generate statistics and yet another to drive the final report. In short, one tool should be the one stop for all business needs.
  2.    Easily Integrated – The tool should be easy to integrate into existing systems, so it can start working without delay or a major upgrade of the existing database. Adaptability is a priority.
  3.    Real-Time – The tool should support constant real-time updates so the numbers are live, rather than accumulating errors from poor data feeds.
  4.    Simplified Decision Making – Avoid the “decision fatigue” that can be the downfall of a business. The BI tool’s dashboard should give users easy access even to high-end data processing, so that warranted decisions can be taken without a glitch.
  5.    Time & Money Saver – Lastly, the tool should not be a drain on time or money; otherwise, businesses tend to take a negative view of the BI tool.
Bernard Marr, author of Big Data: Using SMART Big Data, Analytics and Metrics To Make Better Decisions and Improve Performance, stated: “As business leaders, we need to understand that lack of data is not the issue. Most businesses have more than enough data to use constructively; we just don’t know how to use it. The reality is that most businesses are already data rich, but insight poor.” The point of self-service BI tools is to cut through that abundance by letting department-specific users navigate the data and turn it to their advantage, instead of a traditional group of BI experts salvaging whatever data they can.

In a nutshell, self-service BI tools promise independent access to critical data without constraints. Nevertheless, it is important to factor in that every tool needs regular maintenance and review, without which even an otherwise error-free analytics tool will degrade over time.

Friday, August 18, 2017

Build Automation using PowerShell



Now that we know why build automation is needed, in this detailed post I will cover how to approach build automation using PowerShell. Since this is a technical post, I have included a list of acronyms and some useful links towards the end for the reader’s benefit.

Background & Requirement

We use Microsoft Visual Studio to develop our .Net projects. For a modestly sized client project with both web services and Windows services, we initially relied on a manual approach to create daily builds and deploy them on the QA, UAT and Production servers. On average, this took around two hours of a DevOps team member’s time, and more if issues cropped up. QA was delayed every morning, waiting for the link to the QA server with the new build. We needed to automate this workflow as much as possible.
I was asked to undertake the task. Being a newbie to this, I did some research and found that Continuous Integration (CI) typically means adopting dedicated tools and tool sets: TeamCity, Jenkins, Team Foundation Server, Bamboo, CircleCI and so on. Most are excellent tools supporting all stages of CI, but they also cost money in licenses, either per user or per seat. Besides cost, there were also learning curves and each tool’s own limitations.
I have worked with PowerShell since its alpha release in 2005 and have grown fond of its power and versatility over time. I use it for most of my automation needs, be they personal or work related. A relatively new Windows command shell built on the .Net framework, it is much more powerful than the traditional Windows CMD shell. Much loved and rarely rebuked, it has a very vibrant user community in the systems management and automation space; just do a net search for common system management tasks through PowerShell. Through PowerShell Core, it is also making its way into the Linux world. PowerShell ships with many inbuilt cmdlets for most day-to-day work, and many Windows components and other vendors provide their own cmdlets to manage their environments from the PowerShell command line.
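For a flavour of those inbuilt cmdlets, here are two everyday one-liners (the log folder path is purely illustrative):

    # List the services currently running on this machine.
    Get-Service | Where-Object { $_.Status -eq 'Running' }

    # Show the five most recently modified log files in a folder.
    Get-ChildItem 'C:\Logs' -Filter *.log |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 5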

Justification & Business Case

Faced with the licensing costs and other constraints of traditional CI tools, and given my own comfort with PowerShell, I decided to script the whole build automation workflow in PowerShell. It was a wonderful decision. The automation has been working very well for two years, saving a minimum of two hours per day on a small project with five web services and three Windows services, with code stored in an SVN repository. Later on, we extended it to a much bigger project with 11 Visual Studio (VS) solutions and 71 VS projects, with code stored in TFS (Team Foundation Server); there it saved three hours per day. We could deploy this on any Windows machine where Windows Management Framework (WMF) 4.0 could be installed. Everyone on the development team was aware of the code build status, and DevOps could use the output of this process to deploy inside the client VPN.

Creating Build Management Framework

For our build management workflow, I used the tools, cmdlets and techniques below (a minimal end-to-end sketch follows the list):
  1. Retrieve code from source control using the SVN command line and TFS PowerShell cmdlets.
  2. Modify solution and project files, which are plain text/XML and easily manipulated from PowerShell.
  3. Use nuget.exe to download on-demand packages.
  4. Use MSBuild and MSDeploy to build and deploy to the desired location.
  5. Use IIS PowerShell cmdlets to manage and deploy web services.
  6. Use out-of-the-box PowerShell cmdlets to manage Windows services.
  7. Validate deployments and run BVTs (build verification tests) using Invoke-WebRequest, Invoke-RestMethod and other out-of-the-box cmdlets.
  8. Invoke our test suites, written in TestNG or NUnit, directly.
  9. Use Send-MailMessage to send result mails with attachments, saving the sender password securely on the system through PowerShell’s secure string capabilities.
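To make the flow concrete, here is a minimal sketch of a nightly build script along these lines. All paths, URLs, server names and addresses are placeholders rather than our client’s real values, and the project-file tweak assumes a conventional .csproj layout:

    # Minimal nightly-build sketch; all names below are placeholders.
    $ErrorActionPreference = 'Stop'
    $workDir = 'C:\Builds\MyProject'
    $repoUrl = 'https://svn.example.com/myproject/trunk'

    # 1. Fetch the latest sources (svn.exe must be on PATH).
    if (Test-Path $workDir) { & svn update $workDir }
    else                    { & svn checkout $repoUrl $workDir }

    # 2. Project files are plain XML, so they can be edited directly.
    $projPath = Join-Path $workDir 'WebService1\WebService1.csproj'
    [xml]$proj = Get-Content $projPath
    $proj.Project.PropertyGroup[0].Configuration = 'Release'
    $proj.Save($projPath)

    # 3. Build the solution with MSBuild (VS2015 path shown).
    & "${env:ProgramFiles(x86)}\MSBuild\14.0\Bin\MSBuild.exe" `
        (Join-Path $workDir 'MyProject.sln') /p:Configuration=Release /m
    if ($LASTEXITCODE -ne 0) { throw "Build failed with exit code $LASTEXITCODE" }

    # 4. BVT: check that a deployed endpoint answers after deployment.
    $health = Invoke-WebRequest -Uri 'http://qa-server/health' -UseBasicParsing
    if ($health.StatusCode -ne 200) { throw 'BVT failed: health check not OK' }

    # 5. Mail the result. The password was stored earlier by the same user via
    #    Read-Host -AsSecureString | ConvertFrom-SecureString | Set-Content C:\Builds\pwd.txt
    $securePwd = Get-Content 'C:\Builds\pwd.txt' | ConvertTo-SecureString
    $cred = New-Object System.Management.Automation.PSCredential('builds@example.com', $securePwd)
    Send-MailMessage -From 'builds@example.com' -To 'team@example.com' `
        -Subject 'Nightly build: SUCCESS' -SmtpServer 'smtp.example.com' -Credential $cred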
If I had been automating builds created in another environment, such as Java with Maven/Ant and Git, I could have easily installed those tools on my build machine and used the corresponding command lines to fetch the code and build it, as the sketch below shows.
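A sketch of that variant, with a hypothetical repository URL and local path, would only swap the command lines while keeping the same orchestration:

    # Hypothetical Java flow: same orchestration, different command lines.
    & git clone 'https://git.example.com/app.git' 'C:\Builds\app'
    Set-Location 'C:\Builds\app'
    & mvn clean package                      # Maven must be on PATH
    if ($LASTEXITCODE -ne 0) { throw 'Maven build failed' }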

Major Benefits of the PowerShell-based Approach

In my view, the major benefits of using PowerShell for build automation are as follows:
  1. Easy availability and no licensing cost – Everything mentioned here is available free of cost on any Windows computer. There is no ongoing cost beyond the initial development effort.
  2. Easy deployability – This setup can be deployed on any Windows computer and start working without much of a hitch.
  3. Easy maintenance – Once accustomed to tweaking the required configuration files, even a novice DevOps engineer can maintain it. If the DevOps team is well versed in PowerShell, they can even debug and update the scripts as needed.
  4. Modularity – I divided my solution into four parts: SVN fetch, solution modification, build, and deploy. Most of the code is easily reusable across .Net projects.
  5. Longevity – Microsoft is committed to enhancing the PowerShell experience across the board, as are many vendors. Investment here will not be wasted down the line.
  6. Existing knowledge – Most DevOps engineers have some exposure to PowerShell and can apply that knowledge here rather than learning a new UI tool.
  7. Full control – PowerShell cmdlets and other command-line tools usually expose more information and allow finer-tuned control than UI-based tools, which always need some scripting support anyway.
With PowerShell, a DevOps engineer has immense power to tweak things as needed. Why run around and struggle with various evolving tools when most build automation tasks can be done easily with what is available out of the box?
An added advantage of a build automation project undertaken through PowerShell is a major knowledge upgrade for the DevOps team. They will be exposed to many tools and concepts and become more nimble and productive through effective use of PowerShell in their day-to-day work.
Cmdlets available from other vendors and installed with new software

As I noted earlier, many vendors have extended support for PowerShell through their own cmdlet packages. The list below is not exhaustive by any standard, but it gives a glimpse of what can be done through PowerShell from a DevOps architect’s perspective. If your workflow involves more steps than those mentioned earlier, you may need one of the following (a couple of illustrative calls follow the list).
  1. AWS tools for Windows PowerShell – Manage AWS services from the Windows PowerShell scripting environment.
  2. Azure PowerShell
  3. SQL Server PowerShell – SQL Server 2017 supports Windows PowerShell. PowerShell supports more complex logic than Transact-SQL scripts, giving SQL Server administrators the ability to build robust administration scripts.
  4. Oracle Cluster and PowerShell
  5. .Net and Data Access – connecting to Oracle Database from PowerShell.
  6. PowerShell for Docker – under construction as an open-source project, but very promising.
  7. Manage VPN connections through PowerShell – if needed, connect to a VPN before code download or deployment.
  8. Manage Windows Clusters through PowerShell
  9. Microsoft Office PowerShell cmdlets – Automate editing of Office Files.
  10. PowerShell in Jenkins – Use PowerShell scripts in Jenkins.
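As a taste of these packages, here are two illustrative calls. The module names are real, but you must install the modules and configure credentials and server names first, so treat this as a sketch:

    # AWS: list the S3 buckets in the configured account.
    Import-Module AWSPowerShell
    Get-S3Bucket

    # SQL Server: run a query against a local instance.
    Import-Module SqlServer
    Invoke-Sqlcmd -ServerInstance 'localhost' -Query 'SELECT @@VERSION'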

Links, Acronyms & Further readings

  1. PowerShell learning from Microsoft Virtual Academy
  2. Continuous integration – A development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early.
  3. Windows Management Framework (WMF) – the package that upgrades PowerShell
  4. List of Build Automation tools
  5. Some CI tools – TeamCity, Jenkins, Team Foundation Server, Bamboo, CircleCI
  6. SVN – Apache Subversion
  7. TestNG – a testing framework inspired by JUnit and NUnit, introducing new functionality that makes it more powerful and easier to use.
  8. nUnit – NUnit is a unit-testing framework for all .Net languages
  9. QA – Quality Assurance
  10. UAT – User acceptance testing
  11. Cmdlets – a lightweight command that is used in the Windows PowerShell environment. The Windows PowerShell runtime invokes these cmdlets within the context of automation scripts that are provided at the command line.

Monday, August 14, 2017

Why use Build Automation in Application Development?


The growing relevance of automation and DevOps has revolutionized the software engineering industry and made a deep impact on the way traditional application development is approached. With all the hoopla around it, one thing is certain: it is here to stay, with a long list of benefits. In this blog, I take a look at why Build Automation is needed in contemporary software development projects; but before that, let me quickly cover the basics and answer what exactly it is.
What is Build Automation (BA)?
BA, sometimes also referred to as Continuous Integration (CI), is the process of automating on-demand build creation. It encompasses some or all of the steps below:
  1. Download code from a central repository – Git, SVN, TFS etc.
  2. Make updates to the code structure if needed
  3. Download required external packages via Maven, NuGet, Ant etc.
  4. Build the code using gcc, javac, MSBuild etc.
  5. Create a build share with the binaries and default configuration – JAR, WAR, EXE, XML, INI etc.
  6. Propagate the build output to cloud or network shares
  7. Deploy on web servers and other servers
  8. Configure new or upgraded deployments
  9. Run BVTs on the deployments
  10. Inform relevant stakeholders
A CI build is usually triggered when a code commit is done or a particular tag is created. A basic BA job is usually triggered at a fixed time, and dev teams need to finish their commits by that time; a sketch of such a trigger follows.
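For the fixed-time trigger, one simple option on a Windows build machine is the PSScheduledJob module that ships with PowerShell 3.0 and later; the script path and time below are placeholders:

    # Run the build script every day at 2:00 AM.
    $trigger = New-JobTrigger -Daily -At '2:00 AM'
    Register-ScheduledJob -Name 'NightlyBuild' `
        -FilePath 'C:\Builds\Run-NightlyBuild.ps1' -Trigger $trigger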
Continuous Integration vs. Build Automation
CI’s benefit lies in making every team member responsible for their individual commits: faults are uncovered fast. But it is a complex process even with licensed software or services, and it needs a skilled DevOps team. Despite vendor claims of configuration-only setup, some scripting always needs to be done.
In contrast, basic BA takes longer to uncover faults, but its predictable timeline reduces anxiety for team members. It is easy to implement, leaving only a few manual tasks, can be developed by anyone with basic scripting knowledge (as I will demonstrate in a later post), and can be done using the native shell of the OS without any licensed software.
Hesitation about doing Build Automation
Because basic BA may still leave some steps manual, many think it is not worth doing. This view isn’t helped by the lack of enthusiasm on the part of the DevOps and dev teams: DevOps teams may think their jobs are in danger.
Dev teams, for their part, aren’t enthusiastic about the demands on their time. With every new technology, new ways of building and organizing code come around, and DevOps teams will not know all the nitty-gritty of a new build system: the nuget.exe workflow may not be obvious to a DevOps person from a Linux background, and Git brings its own peculiarities in dealing with repositories. The dev lead has to be really serious about helping their DevOps counterparts during automation development.
Why is hesitation not right?
Unlike CI, BA can be done economically and in less time, and gives the benefits below:
  1. Discipline in team members – Initially, dev teams complain about frequent build breaks, but with the right push from the PM they will inculcate better habits.
  2. DevOps time-saving – They need not stay late at night or get up early to finish the daily build.
  3. QA time-saving – QA need not wait for the new deployment before starting testing. In the case of build breaks or BVT bugs, the turnaround is faster.
  4. Management visibility – Management can gauge developer productivity by looking at build emails; many build breaks can prompt improvements in dev team quality.
  5. Predictable clean build – Manual builds are typically incremental builds, which may hide build problems.
  6. Predictable clean deployment – Manual deployments can pick up dependencies on deleted configuration; automation can do a fast, clean installation, uncovering broken settings.
  7. Wide dissemination – More stakeholders can be kept informed through automation.
  8. Knowledge improvement for DevOps
  9. Retention tool for DevOps – When DevOps teams are doing higher-quality work, they are more inclined to stay on and learn more.
Most of the benefits mentioned apply to any automation project, while some are unique to a BA project. Any project with more than three developers and more than three months of development time should do BA. Over time, teams can build reusable components for BA projects.
Stakeholders in a Build Automation Project    
DevOps
The DevOps team must be proficient in the native shell of the OS (PowerShell, Bash etc.) and can additionally learn cross-platform scripting languages like Python. They need to be well versed with the Git/SVN/TFS command lines and aware of basic development methodology.
Development
The DevOps team will never know everything about a dev system, so the dev manager must provide relevant help whenever needed. They also need to keep dev team members informed about the requirements of build automation, which may include:
  1. Commit only unit-tested code
  2. Commit in time
  3. Commit with relevant messages
  4. Maintain code quality
  5. Maintain configuration file quality
  6. Use relevant naming convention in configuration files and code
  7. Help quickly in case of build breaks
QA
QA teams are the biggest beneficiaries of this exercise, so they should push hard upfront to get BA going. They should help DevOps with a list of BVT tests to validate test deployments and provide reliable automation for them.
Project Management
A BA project must be driven by the PM team; being aware of the challenges and modalities involved is an essential part of their repertoire.
What’s next?
In subsequent blog posts, I will cover different aspects of build automation and its tools using different use cases. Be on the lookout for them, and all the best with your automation projects. Do reach out to us if you need any help with your process; we are sure to be a force multiplier for your requirements.

Sunday, August 6, 2017

5 Key Questions to Ask When Evaluating BI Tools



Geoffrey Moore, an American organizational theorist and management consultant, rightly said: “Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway.” While technological advancement has given the market a growing variety of BI tool vendors, every tool is distinct in its own way. The overall picture may look comprehensive, and a flashy demo may convince you that a tool is the right one; but to pick wisely, businesses have to remember that the subtle differences between tools are the main markers to consider carefully.

To evaluate these differences, one has to first understand the business requirement and question the intricate aspects to gain better insight into a tool’s hidden limitations. Any tool you pick should primarily be able to organize the surplus data and aptly generate business-charted analyses or reports. There is no model that works like a “super brain”, comprehending the business requirements and generating insights of its own accord.
Some of the basic questions to address before zeroing in on a tool would be:

1)    Does the tool collate data only superficially?

Allow the business requirements to predetermine the tool’s function, because a tool is an aid to the business and not the other way round. Most BI tools look only at individual organizational silos and tend to miss coherent, cross-cutting information. An ideal BI tool should collate information from the entire business operation, analyze the core functional areas, compare data from multiple types of ERP and external data sources, and then generate a consolidated analytical model that captures all the intricacies.

2)    Does the tool generate a mere report or engage in analysis?

Businesses most often fail to see the difference between a summary report and an analytical report. The generated report should map across all the different sectors, and at the same time the collated information should not be just a mesh of data. Well-sorted data generation should be the key component of a properly functioning BI tool.

3)    Is the data current or time-stale? 

The tools used in any form of business should generate data that is updated to the current numbers. If this is absent, the data generated reflects the past rather than the present. The tool should be well equipped to promptly distil the essential data, ensuring the business stays competitive and does not get backlogged.

4)    How fast is the tool, and how flexible is it?

An effective BI tool that can turn out reports on demand from the collated information is a major advantage for projecting growth and controlling damage in an organization. Any tool that takes days to churn out information will be of no use to the business. Besides, just as technology evolves, the tool should remain technically adaptable to market trends, as tools can become outdated within a few months.

5)    How soon can the BI tool be put into play, and is it an all-in-one package?

While most organizations pick ready-to-use services, building a custom BI tool should be quick too; otherwise, the business would be chasing progress without the requisite projections. Also, no single tool fits the ideal package. Hence, businesses can run trials, pick the one that fits their parameters, and tweak the little details through their IT department to ensure their needle in the haystack is found.

Have you evaluated your BI tool on the above criteria? What are some primary factors you consider while evaluating business intelligence tools and services? Sigma’s BI services ensure flexibility in choosing the BI tools best suited to your business. Do leave your thoughts in the comments section below.