Offshore I.T. Services
Lean Six Sigma
Lean Six Sigma is a process improvement methodology designed to eliminate problems, remove waste and inefficiency, and improve working conditions to provide a better response to customers’ needs. It combines the tools, methods and principles of Lean and Six Sigma into one popular and powerful methodology for improving your organization’s operations.
Lean Six Sigma’s team-oriented approach has proven results in maximizing efficiency and dramatically improving profitability for businesses around the world.
There are three key elements to Lean Six Sigma.
Tools and techniques: A comprehensive set of tools and analytical techniques that are used to identify and solve problems.
Process and methodology: A series of phases that organize the use of the problem-solving tools to ensure that the true root causes are found and that a solution is fully implemented.
Mindset and culture: A way of thinking that relies on data and processes to achieve operational performance goals and continuously improve.
These three elements reinforce each other. Analytical techniques are not used effectively unless there is a process for applying them and a mindset of continuous improvement creating the need for them. An improvement process does not produce the desired results unless it includes the tools and techniques that define the activities of its steps, and unless there is a culture that insists on a systematic, data-based approach to solving problems.
Finally, a culture that seeks to continuously improve will be frustrated if there are no tools and techniques for analysis and no process or methodology that can be applied to organize and focus the improvement efforts. Fortunately, the Lean Six Sigma approach to business improvement includes all three layers.
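The analytical side of Lean Six Sigma often starts with simple process-capability metrics. As a minimal sketch (the defect counts below are made-up illustration data, not real process figures), a practitioner can express performance as defects per million opportunities (DPMO) and convert it to a sigma level using the conventional 1.5-sigma shift:

```python
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value, shift=1.5):
    """Long-term sigma level, using the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

# Example: 30 defects found across 1,000 units with 10 defect
# opportunities each -> 3,000 DPMO, roughly a 4.2 sigma process.
score = dpmo(defects=30, units=1_000, opportunities_per_unit=10)
level = sigma_level(score)
```

A perfect Six Sigma process corresponds to about 3.4 DPMO; tracking this number over time is one way the "data-based approach" above becomes measurable.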
SAP
The basic idea behind introducing SAP (Systems, Applications and Products in Data Processing) was to give customers the ability to interact with common corporate databases across a comprehensive range of applications. SAP is an integrated ERP (Enterprise Resource Planning) system designed to make business processes work efficiently.
The SAP testing process is usually divided into three phases:
Test Planning
Test System setup
Test Execution and evaluation
Test Planning
Test planning includes the steps that are involved in the initial phase of testing.
Requirement gathering: determine what needs to be tested and collect the functional requirements for system and application testing.
Test-case development for manual and automated testing; in automated testing, various tools can be used to create test cases.
Test System Setup
Test system setup involves setting up the test environment to run the test-cases. Here, the tester needs to define key metrics for reporting.
Test Execution and Evaluation
Test execution and evaluation involves executing the test cases and recording the output. It includes the following activities:
Defect handling and reporting.
Assessment of test plans against the results.
Documentation of all defects and comparison of the results with the key metrics.
Artificial Intelligence & Machine Learning
Artificial intelligence (AI) is a wide-ranging branch of computer science concerned with building smart machines capable of performing tasks that typically require human intelligence.
What are the 4 types of Artificial Intelligence?
Reactive Machines
Limited Memory
Theory of Mind
Self-Awareness
One of the biggest questions in the history of technology was posed by Alan Turing: "Can machines think?"
Turing's paper "Computing Machinery and Intelligence" (1950), and its subsequent Turing Test, established the fundamental goal and vision of artificial intelligence.
At its core, AI is the branch of computer science that aims to answer Turing's question in the affirmative. It is the endeavor to replicate or simulate human intelligence in machines.
AI automates repetitive learning and discovery through data. Instead of automating manual tasks, AI performs frequent, high-volume, computerized tasks. And it does so reliably and without fatigue. Of course, humans are still essential to set up the system and ask the right questions.
Machine learning, in turn, is a subfield of AI.
Machine learning is the field of study that gives computers the capability to learn without being explicitly programmed. ML is one of the most exciting technologies one could come across. As is evident from the name, it gives the computer the capability that makes it more similar to humans: the ability to learn. Machine learning is actively being used today, perhaps in many more places than one would expect.
Resurging interest in machine learning is due to the same factors that have made data mining and Bayesian analysis more popular than ever: growing volumes and varieties of available data, computational processing that is cheaper and more powerful, and affordable data storage. All of these things make it possible to quickly and automatically produce models that can analyze bigger, more complex data and deliver faster, more accurate results, even on a very large scale. By building precise models, an organization has a better chance of identifying profitable opportunities or avoiding unknown risks. Most industries working with large amounts of data have recognized the value of machine learning technology. By gleaning insights from this data, often in real time, organizations are able to work more efficiently or gain an advantage over competitors.
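The "ability to learn" from data can be illustrated with the simplest possible model. The sketch below, using made-up sample data, fits a line y = w*x + b by ordinary least squares, so the prediction rule is derived from observations rather than written by hand:

```python
# Tiny illustration of "learning from data": fit y = w*x + b by least squares.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 8.1, 9.9]   # noisy observations of roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least-squares estimates for slope and intercept.
w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
b = mean_y - w * mean_x

def predict(x):
    """Apply the learned model to a new input."""
    return w * x + b
```

Real machine learning systems use far richer models, but the principle is the same: parameters are estimated from data, and the model then generalizes to inputs it has never seen.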
Cloud Computing
Cloud computing is the delivery of on-demand computing services, from applications to storage and processing power, typically over the internet and on a pay-as-you-go basis.
Rather than owning their own computing infrastructure or data centers, companies can rent access to anything from applications to storage from a cloud service provider. One benefit of using cloud-computing services is that firms can avoid the upfront cost and complexity of owning and maintaining their own IT infrastructure, and instead simply pay for what they use, when they use it. In turn, providers of cloud-computing services can benefit from significant economies of scale by delivering the same services to a wide range of customers.
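The pay-as-you-go economics can be made concrete with a back-of-the-envelope comparison. All figures below are invented assumptions, not real provider prices:

```python
# Illustrative only: compare an upfront server purchase with
# pay-as-you-go cloud rental. Prices are made-up assumptions.
upfront_server_cost = 12_000.0   # hardware bought outright
hourly_cloud_rate = 0.50         # rented instance, billed per hour
hours_used_per_month = 200       # you pay only while it runs

monthly_cloud_cost = hourly_cloud_rate * hours_used_per_month
months_to_break_even = upfront_server_cost / monthly_cloud_cost
```

Under these assumptions the rented capacity costs 100 per month, and the upfront purchase only pays for itself after 120 months, which is why intermittent or variable workloads tend to favor the cloud model.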
Cloud-computing services cover a vast range of options now, from the basics of storage, networking and processing power, through to natural language processing and artificial intelligence as well as standard office applications. Pretty much any service that doesn't require you to be physically close to the computer hardware that you are using can now be delivered via the cloud – even quantum computing.
Building the infrastructure to support cloud computing now accounts for a significant chunk of all IT spending, while spending on traditional, in-house IT slides as computing workloads continue to move to the cloud, whether that is public cloud services offered by vendors or private clouds built by enterprises themselves.
Data Science
The simplest way to understand data science is as the extraction of actionable insights from raw data. Data science draws on a plethora of disciplines and expertise areas to produce a holistic, thorough and refined look into raw data. Data scientists must be skilled in everything from data engineering, math and statistics to advanced computing and visualization to effectively sift through muddled masses of information and communicate only the most vital bits that will help drive innovation and efficiency.
Data scientists also rely heavily on artificial intelligence, especially its subfields of machine learning and deep learning, to create models and make predictions using algorithms and other techniques.
Data science generally has a five-stage life cycle that consists of:
Capture: Data acquisition, data entry, signal reception, data extraction
Maintain: Data warehousing, data cleansing, data staging, data processing, data architecture
Process: Data mining, clustering/classification, data modeling, data summarization
Analyze: Exploratory/confirmatory, predictive analysis, regression, text mining, qualitative analysis
Communicate: Data reporting, data visualization, business intelligence, decision making
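The life cycle above can be sketched as a toy pipeline. The data and function names are illustrative, mapping roughly to the capture, maintain, process and communicate stages:

```python
# Toy data science pipeline; stage names mirror the life cycle above.
raw = "3, 1, , 4, 1, 5, x, 9"          # capture: raw comma-separated input

def maintain(text):
    """Cleanse: keep only the entries that parse as numbers."""
    values = []
    for token in text.split(","):
        token = token.strip()
        if token.replace(".", "", 1).isdigit():
            values.append(float(token))
    return values

def process(values):
    """Summarize: basic descriptive statistics."""
    return {"count": len(values), "mean": sum(values) / len(values)}

def communicate(summary):
    """Report: a one-line summary for decision makers."""
    return f"{summary['count']} valid records, mean = {summary['mean']:.2f}"

clean = maintain(raw)
report = communicate(process(clean))
```

In practice each stage is a discipline in its own right (warehousing, modeling, visualization), but the flow from messy input to a communicable insight is the same.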
Technical tools & skills utilized by Data Scientists:
R
Python
Apache Hadoop
Apache Spark
Apache Pig
MapReduce
NoSQL databases
Cloud Computing
D3
Tableau
IPython Notebooks
GitHub
DevOps
DevOps is a set of practices, tools, and a cultural philosophy that automate and integrate the processes between software development and IT teams. It emphasizes team empowerment, cross-team communication and collaboration, and technology automation.
A DevOps team includes developers and IT operations working collaboratively throughout the product lifecycle, in order to increase the speed and quality of software deployment. It’s a new way of working, a cultural shift, that has significant implications for teams and the organizations they work for.
Under a DevOps model, development and operations teams are no longer “siloed.” Sometimes, these two teams merge into a single team where the engineers work across the entire application lifecycle — from development and test to deployment and operations — and have a range of multidisciplinary skills. DevOps teams use tools to automate and accelerate processes, which helps to increase reliability. A DevOps toolchain helps teams tackle important DevOps fundamentals including continuous integration, continuous delivery, automation, and collaboration.
DevOps values are sometimes applied to teams other than development. When security teams adopt a DevOps approach, security becomes an active and integrated part of the development process. This is called DevSecOps.
DevOps lifecycle:
Plan
Build
Continuous integration and delivery
Monitor and alert
Operate
Continuous feedback
DevOps Tools:
Jira software
Confluence
Bitbucket
Opsgenie
Statuspage