SmartQA Community

On Career Growth

(In this SmartBits, Anuj Magazine outlines “On Career Growth”. The video is at the end of this blog)

Career growth can be divided into two buckets: one is the nonlinearity aspect of a career, and the other is its uni-dimensionality. One thing common to most reasonably sized organisations is that each has career paths, and these paths tend to be designed with an entry role that one gets into straight out of college and a role at the top.

The role at the top of these ladders is usually that of a VP or a Department Head. Yet the highest designation in the organization is that of the CEO. If we connect these two logically, we tend to wonder why organizations do not lay out a path all the way to the CEO role. This leads to questioning the linear approach of following the career paths designed in the organisation.

Mark Templeton, CEO of Citrix for around 20 years and quite respected in his field, said that career paths to the top of an organisation rarely tend to be linear; they always zigzag. One needs to figure out where the next dot is in order to move forward. This questions the rationale behind linear careers. There is nothing wrong with having a predictable career path; it does help solve a problem in the organisation. For instance, HR wants predictable processes, and employees want them too, and there is nothing wrong with that, as not everyone wants to be a CEO. But there are other merits to following a nonlinear path.

The second part is uni-dimensionality. Let us take the example of startups. When a startup is new and product-market fit has not been achieved, people play different roles while holding one designation: marketing, coding, testing, or hustling around doing sales. In early-stage organizations, one can afford to be a generalist in the interest of moving the organisation forward; but when it comes to scale, uni-dimensionality, that is, specialisation, with deep knowledge of one subject or a related set of technologies, is what helps the organisation scale and go to the next level.

Should I be a generalist or a specialist? If you want to be a specialist, choose a field that is going to be relevant in the time to come. The people who chose artificial intelligence and machine learning fifteen years back are reaping the rewards of that today. In the IPL we see around 200-odd cricketers from India, which is hardly two to three percent of the cricket-playing population, or even less; these are hyper-specialised individuals who specialise in their areas. When choosing a specialised field, it is better to have the conviction to be in the top ten or twenty percent, so that one can reap the rewards in the time to come.

Generalists are people who are more adaptable, who can learn a new skill in a shorter time and deliver value; this is more akin to the gig economy: pick up a role for some time and then move on to something else. There is nothing wrong with either path; both have their merits and demerits. Hyper-specialisation, though, is going to be the thing in the future.

The QA profession has been under pressure from external forces, as decision-makers in organizations want to see more value. It comes down to an economics decision: we have always talked about the cost of quality, but we never use the term profit of quality. We need people who can represent QA in a boardroom, where value can be showcased, and that is lacking at the moment.

Build in quality

(In this SmartBits, Girish Elchuri outlines “Build in quality”. The video is at the end of this blog)

One of the important things to remember is that we cannot add quality; quality is not like a coat of paint applied to a wall. It has to be built in, and for that we should have all the processes in place, starting from architecture, design, development and so on.

An example in this context is the way a German car company makes cars: they build the car, test it extensively, fix it, and only then release it to customers, so that the customers get an excellent product. Contrast this with a company in Japan that makes high-end luxury cars: when this company finds a problem in one of the cars before it is shipped, they stop the assembly line, find out where the defect was introduced, fix the process, throw away all the cars manufactured with the defective process, and start manufacturing again. The result is that the Japanese car, with the same quality and luxury as the German car, is built at one-third the cost. It is a classic living example.

Building quality in through sound processes is more beneficial, important and efficient than trying to add quality later, or claiming that quality can always be added later. When a pizza gets burnt, can the quality assurance team make it right? Certainly not, because it is a one-way process. It is important to realize that we cannot add quality.

We can only build quality into a product as part of our architecture, design and development. It has to be the attitude of the organization, and it has to come from the top down. Organizations, people and processes should have the attitude of building quality in, not adding it later.

System deployment architecture and testing

(In this SmartBits, Zulfikar Deen outlines “System deployment architecture and testing”. The video is at the end of this blog)

It is extremely important to understand system deployment architecture. Let us say we are delivering the same system, and as an organization we decide to deploy it in the cloud. That requires a completely different way of testing, and we need to verify that everything works correctly in that environment.

If we decide on a hybrid deployment for various reasons, the solution remains the same, but internally we may have to make decisions based on compliance. In India we could take the data into the cloud without much of an issue, especially when the data centres are hosted here, but the same may not be the case in other countries.
The same solution deployed differently in a different country implies that testing has to be different. In a hybrid setup we are working across two different system boundaries, where part of the system sits on-premise and part in the cloud; we need to be sure that data flows through properly and is secured.

We may not worry about security between two systems if the entire system is deployed in one place, whereas if traffic goes through public and private clouds, testing has to be slightly different. If the system is completely on-premise, deployment and testing differ again.

If we decide to put even a part of the system in the cloud, using only infrastructure as a service, the testing would be different. Again, if we use a managed service, say a data factory from Azure, the testing has to be different because we are consuming a service provided by the cloud in a different way; if we decide to use such advanced services, we need to make sure they work. If such a service is part of the whole system, that part cannot be tested on-premise. We definitely need to understand the deployment architecture and how we are planning to deploy, and testing therefore has to be done appropriately for that.

What is Blockchain?

(In this SmartBits, Yuvaraj Thanikachalam outlines “What is Blockchain?”. The video is at the end of this blog)

Moving files from one system to another via copy and paste (Ctrl-C and Ctrl-V) is familiar to every computer-savvy professional. On similar lines, can we Ctrl-C and Ctrl-V money? Why can’t we move value in a digital format?
A gentleman by the name of Satoshi Nakamoto wrote a white paper on Bitcoin. He was trying to decentralize the internet by creating a mechanism in the digital world to move value from one pocket to another without any central authority in place. He solved this problem by combining peer-to-peer technology, encryption technology, and database technology. Bringing all this together, he created a solution called Bitcoin, which everybody referred to as a Blockchain.
Bitcoin is one application of Blockchain; it is not the Blockchain itself. The underlying technology, which powers the innovation of moving money like a copy of a file from one pocket to another while solving the double-spending problem, was revolutionary.
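The core idea of chaining blocks together can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (not Bitcoin’s actual implementation, which adds proof-of-work, digital signatures and a peer-to-peer network): each block records the hash of the previous block, so tampering with any earlier transaction invalidates every block that follows.

```python
import hashlib
import json

def block_hash(block):
    # Serialize the block deterministically, then hash it
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(batches):
    # Link each block to the hash of the previous one
    chain = []
    prev = "0" * 64  # placeholder hash for the genesis block
    for txs in batches:
        block = {"transactions": txs, "prev_hash": prev}
        chain.append(block)
        prev = block_hash(block)
    return chain

def is_valid(chain):
    # Recompute every link; tampering anywhere breaks a later link
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain([["alice->bob:5"], ["bob->carol:2"]])
print(is_valid(chain))                         # the untouched chain verifies
chain[0]["transactions"][0] = "alice->bob:500" # tamper with an old transaction
print(is_valid(chain))                         # verification now fails
```

Because rewriting history requires recomputing every subsequent hash, and in a real network also out-running every honest participant, the same digital coin cannot quietly be spent twice.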

Efficiency -> Productivity -> Creativity

T Ashok @ash_thiru  on Twitter


Efficiency is a given now, and high productivity aided by intelligent systems will become the norm, so what is our role? In this age of automated and continuous testing, efficiency gains are a given and productivity is on the increase. In this era of AI systems, it is time we shift from productivity to creativity.

Over the years there has been an interesting shift in how we engineer software. Starting with an emphasis on process systems in the 90s to ensure consistency and repeatability, we moved on to enhancing efficiency with tools and Agile processes. Now the focus has shifted to productivity and value by fostering re-use (components, libraries, patterns, frameworks etc.), cross-functional teams and, more recently, AI systems.

Efficiency is a given now, and high productivity aided by intelligent systems will become the norm, so what is our role? The future is about creativity, a lot of people say.

In testing, with the extreme focus on automated and continuous testing, efficiency gains are a given and productivity is on the increase. With systems built using multiple frameworks, deployed in various environments, with high business criticality and high user expectations, the future demands tech savviness and serious creativity: ‘SmartQA’, which implies getting work done efficiently, with a value focus, driven from a creative angle.

Here is a short summary from two interesting articles on the Efficiency -> Productivity -> Creativity shift.

Focus on productivity, not efficiency

In the article Focus on productivity, not efficiency, Aytekin Tank says Ford reduced the manufacturing time of a car from 12 hours to 2.5 hours by improving efficiency: breaking the company’s Model T automobile assembly into 84 distinct steps, with each worker specialising in one task and using power-driven machinery to do the work.

The tide changed in 2015, with the focus shifting from efficiency to productivity. Efficiency is about doing more with less, whereas productivity is about doing more with the same.

He then shares four tips on how to lead an organisation with productivity:

1. Team productivity > individual efficiency – Have cross-functional teams work on one project at a time.
2. Get out of the way – Stop interrupting the workflow of team members with meetings that don’t necessarily require their presence.
3. Maximize your MVPs – Do not box talented individuals into organizational roles that limit their effectiveness.
4. Lose the “more is better” mentality – Focus on impact, not on staying busy.

Creativity is the new productivity

In the age of AI and machine learning, just being more productive won’t cut it. The future belongs to the creatives, says Scott Belsky in the article Creativity Is the New Productivity.

In his picture of the Human Productivity Parabola, he says we have now passed the point (call it the “Productivity-Creativity Inversion”) where machines (algorithms, robots, etc.) have become a better investment for future productivity gains than humans. At this point, we as humans are better off spending our energy on creativity than on productivity.

From Creativity Is the New Productivity

Productivity, previously scarce and valuable, is now abundant and commoditized; hence we need creativity, a truly scarce resource whose value is on the rise, he says. He depicts this as a picture consisting of three phases: The Era of Productivity Scarcity, The Era of Productivity Abundance, and The Era of Creativity.

From Creativity Is the New Productivity

He continues on to state that AI will liberate creativity, by allowing machines to take over the mundane tasks, enabling us to be more creative.

A quick primer on AI

Curated by T Ashok @ash_thiru


This article is curated from SIX articles as a quick primer on AI. Starting with a glossary of AI terminology, it delves into tacit knowledge as codified via ML and the difference between ML and AI, takes a quick peek into deep learning and the challenge of explaining the patterns it finds, and ends with an interesting piece on AI written by an AI program.

Glossary of terminology in AI

Here is an easy article, “A Simple, yet Technical Glossary of Terminology in AI”, that provides a simple glossary of terms, abbreviations, and concepts related to the field and sub-fields of artificial intelligence, based on technical definitions by pioneers and leaders in AI.

ML & Polanyi’s paradox

“Explicit knowledge is formal, codified, and can be readily explained to people and captured in a computer program. But tacit knowledge, a concept first introduced in the 1950s by scientist and philosopher Michael Polanyi, is the kind of knowledge we’re often not aware we have, and is therefore difficult to transfer to another person, let alone capture in a computer program. Machine learning has enabled AI to get around one of its biggest obstacles, the so-called Polanyi’s paradox.” Read more in the article “What Machine Learning Can and Cannot Do”.

ML & AI – The difference (1)

“There’s much confusion surrounding AI and ML. Some people refer to AI and ML as synonyms and use them interchangeably, while others use them as separate, parallel technologies. In many cases, the people speaking and writing about the technology don’t know the difference between AI and ML. In others, they intentionally ignore those differences to create hype and excitement for marketing and sales purposes.” The article “Why the difference between AI and machine learning matters” attempts to disambiguate the jargon and myths surrounding AI.

ML & AI – The difference (2)

“Unfortunately, some tech organizations are deceiving customers by proclaiming using AI on their technologies while not being clear about their products’ limits. There’s still a lot of confusion within the public and the media regarding what truly is artificial intelligence, and what truly is machine learning. Often the terms are being used as synonyms, in other cases, these are being used as discrete, parallel advancements, while others are taking advantage of the trend to create hype and excitement, as to increase sales and revenue.” says Roberto Iriondo in his article “Machine Learning vs. AI, Important Differences Between Them”.

Explaining how patterns are connected

Deep learning is good at finding patterns in reams of data but can’t explain how they’re connected. Turing Award winner Yoshua Bengio wants to change that; read about it in the article “An AI Pioneer Wants His Algorithms to Understand the ‘Why’”.

Chapter on AI written by an AI program

Here is an interesting excerpt from an ‘autobiographical’ chapter written by an AI program, “This chapter on the future of Artificial Intelligence was written by Artificial Intelligence”, excerpted from the book “The Tech Whisperer”.