Improving Processes

  • Situation

    Field foremen manually entered production and drilling data into an Excel report for every producing well, which often resulted in duplicates and inconsistencies.

    The foremen then sent the daily report to the field office so it could be saved to the server and imported into the enterprise production software. Because of the data-entry errors, the same duplicates and inconsistencies appeared in the system as well and later had to be fixed.

    Action

    First, I introduced Excel data validation to eliminate duplicates and inconsistencies at the point of entry. It took extra effort for the foremen to switch from the old report format saved on their machines to the new report with data validation, but we eventually made the full switch.
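    A minimal sketch of how this kind of list-based validation can be applied to a report template programmatically with openpyxl; the file name, sheet name, cell range, and well names are hypothetical:

    ```python
    # Sketch: list-based data validation on a report template with openpyxl.
    # The workbook, sheet, entry range, and well names are hypothetical.
    from openpyxl import load_workbook
    from openpyxl.worksheet.datavalidation import DataValidation

    wb = load_workbook("daily_report_template.xlsx")
    ws = wb["Production"]

    # Restrict the well-name column to a fixed list so free-text typos,
    # the source of the duplicates and inconsistencies, are rejected at entry.
    well_names = DataValidation(
        type="list",
        formula1='"WELL-001,WELL-002,WELL-003"',
        allow_blank=False,
        showErrorMessage=True,
    )
    ws.add_data_validation(well_names)
    well_names.add("A2:A200")  # hypothetical data-entry range

    wb.save("daily_report_template.xlsx")
    ```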

    Second, I introduced a Python script. The initial script loaded all historical production data for the most active region into a JSON database.
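    A rough sketch of that one-time historical load, assuming the old reports sat in a single folder as Excel files; the server path, sheet name, and layout are illustrative:

    ```python
    # Sketch: one-time load of historical production reports into a JSON store.
    # The server path, sheet name, and file layout are assumed for illustration.
    import json
    from pathlib import Path

    import pandas as pd

    frames = [
        pd.read_excel(path, sheet_name="Production")
        for path in Path("//server/reports/history").glob("*.xlsx")
    ]
    history = pd.concat(frames, ignore_index=True)

    # Drop exact duplicate rows carried over from the old manual reports.
    history = history.drop_duplicates()

    with open("production_history.json", "w") as f:
        json.dump(history.to_dict(orient="records"), f, default=str)
    ```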

    Then I introduced a Python script that automated the data transformation and loading (a sketch follows the list below). Every morning the script would run and:

    • check for new reports on the server based on date

    • transform the data

    • import the new reports into the JSON database
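    A minimal sketch of that morning job, assuming the reports land on the server as date-stamped Excel files; the file names, paths, and column handling are illustrative:

    ```python
    # Sketch: the daily ETL pass. Assumes reports arrive as date-stamped files,
    # e.g. report_2020-03-15.xlsx; names and paths are illustrative.
    import json
    from datetime import date
    from pathlib import Path

    import pandas as pd

    REPORTS = Path("//server/reports/daily")
    STORE = Path("production_history.json")

    def run_daily_import() -> None:
        # 1. Check for a new report based on today's date.
        report = REPORTS / f"report_{date.today():%Y-%m-%d}.xlsx"
        if not report.exists():
            return

        # 2. Transform: normalize headers and drop duplicate rows.
        df = pd.read_excel(report, sheet_name="Production")
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df = df.drop_duplicates()

        # 3. Load: append the new rows to the JSON store.
        records = json.loads(STORE.read_text()) if STORE.exists() else []
        records.extend(df.to_dict(orient="records"))
        STORE.write_text(json.dumps(records, default=str))

    run_daily_import()
    ```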

    Results

    1. Increased data accuracy by 90%.

    2. Increased operational efficiency by 28 hours a month.

  • Situation

    Engineers and geologists could only view production data from a desktop, and they always had to refer back to Excel reports for drilling data.

    The production visualization was not up to the CEO's standards: it did not allow for extrapolation, and logarithmic scales were unavailable.

    Action

    I followed an iterative approach and developed a custom HTML, CSS, and JavaScript dashboard/report for the company’s most active area. The dashboard was hosted on GitHub Pages and used the data loaded by the Python script (see the ETL section above). This dashboard/report:

    • included production AND drilling data

    • had options for linear or logarithmic scales

    • let users extrapolate production data (see the decline-curve sketch after this list)

    • was mobile-friendly with a favicon, so users could pin it to their home screens and find it easily

    • was password protected

    • included API integration for crude oil WTI pricing
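    The extrapolation and log-scale options rest on standard decline-curve math: an exponential decline plots as a straight line on a semilog scale. A minimal sketch of that fit, using synthetic numbers and shown in Python for consistency with the ETL examples (the dashboard itself implemented this in JavaScript, and the exponential model is an assumption):

    ```python
    # Sketch: extrapolating a production series with an exponential decline fit,
    # the kind of projection the dashboard exposed. The decline model and the
    # sample data are assumptions.
    import numpy as np

    months = np.arange(1, 13)               # months on production
    rate = 500 * np.exp(-0.08 * months)     # synthetic oil rate, bbl/d

    # q(t) = qi * exp(-D t) is linear in log space, so a straight-line fit
    # to ln(q) recovers the initial rate qi and the decline constant D.
    slope, intercept = np.polyfit(months, np.log(rate), 1)
    qi, D = np.exp(intercept), -slope

    future = np.arange(13, 25)              # extrapolate one more year
    forecast = qi * np.exp(-D * future)
    print(f"qi={qi:.1f} bbl/d, D={D:.3f}/month, month-24 rate={forecast[-1]:.1f} bbl/d")
    ```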

    Results

    1. Increased data accessibility: all stakeholders could access the data from their mobile devices.

    2. Decreased operational cost: stakeholders no longer had to print linear production graphs to extrapolate production data by hand.

    3. Increased operational efficiency: stakeholders could discuss the data from their own devices instead of walking from office to office.

    4. The code for this dashboard/report was reused for every other production area, since the design was user-friendly and the data comprehensive.

  • Situation

    Every month I downloaded production data for various areas of interest from a government agency website.

    The data required extensive transformation: cleaning it manually took me about 16 hours each month.

    Action

    I introduced VBA macros. Since the downloaded data always arrived in the same format, a macro could perform all of the necessary transformation.

    Additionally, I converted the script to an Excel add-in, so I could complete the transformation with the click of a button.
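    A minimal sketch of the kind of fixed-format cleanup the macro performed, shown here in Python for consistency with the other examples; the real implementation was a VBA add-in, and the header offset, column names, and cleanup rules are illustrative assumptions:

    ```python
    # Sketch of the fixed-format cleanup the macro automated. The actual tool
    # was a VBA add-in; the layout and column names here are assumptions.
    import pandas as pd

    def clean_agency_download(path: str) -> pd.DataFrame:
        df = pd.read_excel(path, skiprows=3)    # assumed fixed header block
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df = df.dropna(how="all")               # drop blank spacer rows
        df["report_month"] = pd.to_datetime(df["report_month"])
        return df.drop_duplicates()
    ```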

    Result

    1. Increased operational efficiency by eliminating 16 hours of manual work. This freed up time to develop the JavaScript dashboards mentioned above.

  • Situation

    The VP of Operations reached out for an exploratory analysis of our automatic validation process; she just wanted to understand the process better.

    She also wanted a quick turnaround: something by the next day or so.

    Action

    I queried our Snowflake database, starting with two cohorts for the two user types that go through the automatic validation process: Onboarders and Providers.

    For each cohort, I first profiled the users: total unique users, number of times they had been through the process, and processing times.

    I then identified inefficiencies: users in the process who were not generating revenue or had minimal engagement.
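    A hedged sketch of the kind of cohort-profiling query involved, run through the Snowflake Python connector; the connection parameters, table, and column names are illustrative assumptions, not the actual schema:

    ```python
    # Sketch: profiling the validation cohorts via the Snowflake Python connector.
    # Credentials, table, and column names are illustrative assumptions.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="ANALYST", password="...", account="my_account",  # placeholders
        warehouse="ANALYTICS_WH", database="OPS", schema="VALIDATION",
    )

    COHORT_PROFILE = """
        SELECT
            user_type,                          -- 'ONBOARDER' or 'PROVIDER'
            COUNT(DISTINCT user_id)  AS unique_users,
            COUNT(*)                 AS validation_runs,
            AVG(DATEDIFF('hour', started_at, completed_at)) AS avg_hours
        FROM validation_events
        GROUP BY user_type
    """

    cur = conn.cursor()
    for user_type, users, runs, avg_hours in cur.execute(COHORT_PROFILE):
        print(user_type, users, runs, round(avg_hours, 1))
    ```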

    Result

    From there, we were able to set new guidelines, such as only automatically putting providers into the process if they were actively working, while requiring inactive providers to request the validations.

    The analysis and policy changes saved about 30% of the resources allocated to automatic credentialing, and we were able to revalidate active providers 18 hours faster.
