In the quest for the next big break or opportunity out there, I have come to expect these interview questions, and I have decided to blog about them.
After months of relentlessly applying for jobs with a mix and match of the skill sets I am eligible for, and going through multiple ghost calls (I believe that’s what it’s called when recruiters spend 20 to 30 minutes inquiring about everything, explaining the job needs, setting pay expectations, raising false hopes, and then you never hear anything back even after sending multiple emails or messages), I was finally fortunate to have a call set up with an actual person at an actual company.
Unfortunately, the interview didn’t pan out well. The interview itself began with getting to know the current role and the team I would be working with, and then went straight into the questions. The first question I was asked was about my ‘toughest challenge’ to date. It caught me a bit off-guard, as I was expecting some technical questions to start with before settling into behavioural ones and finally ending with expectations for the next meeting. I did my best to explain the things I have done, but I guess it just was not good enough.
- This will act not only as a reference for myself but also as a reminder of the other ‘tough challenges’ that lie ahead.
- Second, writing this down has been very cathartic.
- Not EVERYTHING can be put on a resume, so here it is then.
Without further ado, let me get to the toughest-challenge question. Questions like this are best addressed via the STAR method (Situation, Task, Action, Result).
Toughest Challenges Faced –
My stint at Accenture was the most challenging and most gratifying time of my professional career. I wore several hats during my 5 years there – starting off as a Software Engineer, then BI Developer, BI Team Lead, and Operations Team Lead (BI). There were numerous times when I had to stay back at the office for a whole day or more to ensure smooth operations. As I write this, it feels as if I left Accenture only yesterday – everything feels so real even though it has been 5 years since I left.
Here are a few of the most challenging times and how I handled them –
- Data Migration –
- Challenge – The project was being upgraded from version 1 to version 2, which necessitated a data migration from the Operational Data Store (ODS) to the Data Warehouse (DWH). The daily nightly job at that point was designed to migrate roughly 1M records between the ODS and DWH in about 4-6 hours. The task was to migrate about 90M records.
- Action Taken – I re-architected the ETL jobs that load the main fact tables end to end to come up with a scalable, efficient, and performant solution. This involved introducing cached lookups, replacing row-by-row updates with bulk updates, removing blocking transformations, creating indexed views, etc. I also included an audit table to track each batch’s progress.
- Result – The data migration was completed in a 12-hour timeframe, as opposed to the original design that would have taken 240+ hours. Additionally, some of the design changes were kept in place for the normal daily runs thereafter.
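The batching-with-an-audit-table idea can be sketched roughly like this. This is purely illustrative: the real solution was SQL Server ETL, not Python; here SQLite stands in for the database, and all table and column names are made up. The key points it shows are set-based `INSERT ... SELECT` batches (instead of row-by-row processing) and one audit row per batch.

```python
import sqlite3

BATCH_SIZE = 3  # the real job would use batches of many thousands of rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ods_orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.execute("CREATE TABLE dwh_orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.execute(
    "CREATE TABLE batch_audit (batch_no INTEGER, rows_moved INTEGER, status TEXT)"
)
conn.executemany(
    "INSERT INTO ods_orders VALUES (?, ?)", [(i, i * 10.0) for i in range(1, 8)]
)

def migrate_in_batches(conn, batch_size):
    """Move rows ODS -> DWH in set-based batches, auditing each batch."""
    batch_no, last_id = 0, 0
    while True:
        batch_no += 1
        # Set-based INSERT ... SELECT, keyed on the last migrated id.
        cur = conn.execute(
            "INSERT INTO dwh_orders SELECT id, amount FROM ods_orders "
            "WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        )
        if cur.rowcount == 0:
            break  # nothing left to move
        last_id = conn.execute("SELECT MAX(id) FROM dwh_orders").fetchone()[0]
        # One audit row per batch so progress can be tracked mid-run.
        conn.execute(
            "INSERT INTO batch_audit VALUES (?, ?, 'OK')", (batch_no, cur.rowcount)
        )
        conn.commit()
    return batch_no - 1  # number of non-empty batches

batches = migrate_in_batches(conn, BATCH_SIZE)
```

Committing per batch is what makes a 90M-row run restartable: if it fails, the audit table tells you exactly which batch to resume from.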
- BI Data Feed –
- Challenge – A BI Data Feed solution had to be implemented, which required someone familiar with C#.NET and BI technologies to develop a command-line application to ingest data from 4 different SQL Server instances comprising 60+ tables. The business logic also dictated numerous complex rules that had to be incorporated.
- Action Taken – I upskilled myself in writing an efficient multi-threaded C#.NET application, using SMO .NET API calls and BCP commands, and combined that with dynamic SQL and SSIS to build the needed solution.
- Result – The solution was delivered well ahead of time, adhering to the business specifications.
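The fan-out pattern behind the multi-threaded feed – one extract job per (instance, table) pair, run concurrently – looks roughly like this. The real application was C#.NET using SMO/BCP; this is a Python sketch of the pattern only, and the instance and table names are invented.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical source layout: each instance contributes several tables.
SOURCES = {
    "instance_a": ["dbo.Customers", "dbo.Orders"],
    "instance_b": ["dbo.Products"],
}

def extract_table(instance, table):
    """Stand-in for a per-table bulk extract (BCP in the real solution)."""
    return f"{instance}:{table}:done"

def run_feed(sources, max_workers=4):
    # Flatten to one job per (instance, table) pair, then extract concurrently.
    jobs = [(inst, tbl) for inst, tables in sources.items() for tbl in tables]
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(lambda j: extract_table(*j), jobs))
    return results

results = run_feed(SOURCES)
```

Because each table extract is independent, the total feed time approaches that of the slowest table rather than the sum of all of them.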
It was through this company that I first landed in Sydney, Australia, where I worked as a Technology Lead for LINK Group for over 2.5 years. The best thing about the role, I would say, was the complete autonomy offered to perform the tasks at hand with whatever tools were available.
Here are a few of the most challenging times and how I handled them while working here –
- Data import monitoring –
- Challenge – Data imports were regularly run over varying periods of time using the out-of-the-box TrimPort and HP Trim. There was no proper way to track the imports or estimate job completion times, resulting in unnecessary waiting for jobs to finish and an inability to report on progress.
- Action Taken – Developed a Power BI dashboard to report batch import progress by monitoring the data in SQL Server, and created an SSIS package that refreshed the data at regular intervals.
- Result – A holistic, pluggable data-monitoring solution that is fully flexible for future monitoring needs and gives end-users the flexibility to properly plan their imports.
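The core calculation the dashboard surfaced – estimated completion time from observed progress – is simple enough to sketch. This is a toy version; in the real solution the row counts came from SQL Server and were refreshed by the SSIS package, and the linear-rate assumption is mine.

```python
def estimate_completion(rows_done, rows_total, elapsed_seconds):
    """Linear ETA: remaining rows divided by the ingest rate observed so far."""
    if rows_done == 0:
        return None  # no rate observed yet, ETA unknown
    rate = rows_done / elapsed_seconds   # rows per second so far
    remaining = rows_total - rows_done
    return remaining / rate              # rough seconds remaining

# e.g. 25k of 100k rows imported in the first 10 minutes
eta_seconds = estimate_completion(rows_done=25_000, rows_total=100_000,
                                  elapsed_seconds=600)
```

Even a rough linear estimate like this turns "wait and see" into "the import will finish around 3 pm", which is what lets end-users plan around long-running imports.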
Macquarie Group –
My work at Macquarie Group has been the most joyous. It is a company that, I would say, puts technology at the forefront of whatever it does.
Here are a few of the most challenging times and how I handled them while working here –
- Historical data migration failure handling –
- Challenge – A production release was halted when an issue was identified with one of the data migration scripts for historical data spanning 18 reporting months. Reproducing the issue in a non-Prod environment was not possible, as it did not have the requisite data. Upstream flows would be heavily impacted in the absence of this change, and the release window for implementing a fix was short.
- Action Taken – I broke the analysis down into phases: first testing the column counts, then validating the schema, and finally identifying the root cause of the problem. I developed a unified script handling all the discrepancies, validated it on sampled data, and then ran it across the whole dataset to ensure it worked.
- Result – I was able to deliver the solution within two hours of the issue being identified, after going through the proper peer-review process, attaching test evidence, and obtaining business sign-off.
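The phased checks – column counts first, then per-column schema comparison – can be sketched like this. It is a hypothetical illustration: the real checks ran against the migration target, and the schemas below are made up.

```python
# Hypothetical expected vs actual schemas for the migrated table.
expected_schema = {"account_id": "INT", "period": "DATE", "balance": "DECIMAL"}
actual_schema = {"account_id": "INT", "period": "VARCHAR", "balance": "DECIMAL"}

def diff_schemas(expected, actual):
    """Phase 1: compare column counts; phase 2: flag per-column mismatches."""
    issues = []
    if len(expected) != len(actual):
        issues.append(f"column count {len(actual)} != {len(expected)}")
    for col, dtype in expected.items():
        if col not in actual:
            issues.append(f"missing column {col}")
        elif actual[col] != dtype:
            issues.append(f"{col}: {actual[col]} != {dtype}")
    return issues

issues = diff_schemas(expected_schema, actual_schema)
```

Running cheap structural checks before any data-level analysis is what makes a two-hour turnaround plausible: most discrepancies surface immediately, narrowing where the root cause can hide.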
- Setting up of new Environment –
- Challenge – The application team was living on dangerous ground with only one environment to work in, i.e. Production. All the workflows (40+) and dependent objects were being run from one drive, and there was no safety net. Periodic backups were taken, but it was not an efficient system. A UAT process had to be set up.
- Action Taken – I broke the needs down into categories of varying complexity. I coordinated with multiple teams in setting up a new environment, creating a new repository for version control (Bitbucket), and performing unit and system testing on the UAT branch. I integrated Alteryx with Python to develop a localisation workflow that could refresh the underlying data for a given environment.
- Result – A robust new system was in place for the team to work with, without having to take multiple backups every time they had to work on a change. An end-to-end integrated process was set up – from raising a Jira ticket to merging the code – and properly documented in a workflow detailing each step.
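The localisation idea – the same workflow pointed at UAT or Production by swapping environment-specific settings – can be sketched like this. Everything here is hypothetical (server names, setting keys, template format); the real implementation was an Alteryx workflow calling Python.

```python
# Hypothetical per-environment settings; real values would live in config.
ENVIRONMENTS = {
    "uat":  {"server": "sql-uat.internal",  "database": "AppDB_UAT"},
    "prod": {"server": "sql-prod.internal", "database": "AppDB"},
}

def localise(connection_template, env):
    """Fill a workflow's connection template for the chosen environment."""
    settings = ENVIRONMENTS[env]
    return connection_template.format(**settings)

template = "Server={server};Database={database};Trusted_Connection=yes"
uat_conn = localise(template, "uat")
```

Keeping the environment differences in one lookup means the 40+ workflows themselves never need editing when they move between UAT and Production.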
- Performance enhancement –
- Challenge – A regulatory report containing complex business logic needed to be replicated using data on the data hub. The data pertained to movements across two successive reporting periods, and the challenge was to perform cross-currency conversion across 15+ reporting currencies using Power BI. Incorporating the business logic within Power BI was leading to slow-performing reports that could not cope with scaled data volumes. The task was to create a scalable solution that also improved response time.
- Action Taken – I tackled this by moving all the logic onto the data hub. Utilising the power of Impala queries on the hub – cross-currency conversion, lookup data inclusion, and a pivot combined with a cross join – the whole business logic was replicated. A further decision was taken to write the output to a flat table instead of computing it directly in Power BI.
- Result – A highly performant report was generated using DAX queries that pull the transformed data from the flat table. Report response time was brought down from 300+ seconds to under 20 seconds.
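The arithmetic at the heart of cross-currency conversion is worth spelling out: rather than maintaining 15 × 15 pairwise rates, you route every conversion through a single base currency. A minimal sketch, assuming invented rates and USD as the base (the real logic ran as Impala SQL on the data hub):

```python
# Invented example rates: 1 unit of each currency in USD.
RATES_TO_USD = {"USD": 1.0, "AUD": 0.65, "EUR": 1.10}

def convert(amount, from_ccy, to_ccy, rates=RATES_TO_USD):
    """Cross rate through the base currency: from -> USD -> to."""
    usd = amount * rates[from_ccy]
    return usd / rates[to_ccy]

aud_in_eur = convert(100.0, "AUD", "EUR")  # via USD
```

With one rate row per currency, a cross join of the fact table against the rates table is enough to restate every movement in every reporting currency, which is exactly the kind of set-based work a hub engine like Impala does far faster than per-row logic in a report.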
Amaysim was the first product company I worked for. One of the things that blew my mind was the orientation classes planned in the first week of joining the company. I mean, who even does that these days? There was a series of workshops where different people (even the CEO) chipped in and explained the company culture.
As I noted here, it was a role I was accepted for because of their belief in my technical capabilities rather than in the tools themselves, and this led me to work on technologies such as Alteryx, AWS Redshift, PostgreSQL, Tableau, etc.
- Report Generation –
- Challenge – The Finance team was tasked with generating revenue reports for NBN customers. This entailed following a documented series of steps involving manual data extraction, clean-up of data discrepancies, data reconciliation, and repeating this process across multiple use cases.
- Action Taken – I developed an Alteryx workflow that hooked into the data right at the source, embedded the business logic within the workflow, and produced a Tableau Data Extract (TDE) as output, with no manual intervention and the data refreshing at scheduled times.
- Result – A Tableau dashboard that the business could use directly for reporting and analytics, saving hours of effort every month.