About Me

My name is Gary Hutson and I have been working in the public sector for over 17 years, splitting my time between Nottinghamshire Police, Nottinghamshire County Council and Nottingham University Hospitals.

I then worked for a private sector healthcare company called Draper and Dash (now RealWorld Health), which specialises in integrating AI and predictive analytics into dashboards; there I served as a Senior Data Scientist and later as Head of Data Science and AI. I helped develop the company's successful Data Science Platform (DSP), building the algorithms and SQL functionality behind the web application. Other projects included:

  • Leading our AI / ML training courses, e.g. an introduction to data science and two-day ML training days focused on supervised and unsupervised learning, and working closely with the NHS-R Community as an Associate
  • Pathway clustering and association rule mining of mental health pathways
  • Supervised learning models augmenting the company's dashboard and BI offerings; the biggest sellers were our readmission avoidance, stranded / long-stayer, length-of-stay (LOS), radiology turnaround time and outpatient cancellation predictors
  • Unsupervised machine learning – clustering secondary care patients by acuity and disease prevalence using R and Python libraries, integrated via reticulate
  • Custom forecasting tools deployed into our Command Centre offering (https://draperanddash.com/machinelearning/command-centre-amplification-with-predictive-analytics-and-machine-learning/).
  • Researching how our computer vision skill set could be applied to image / video classification, particularly the detection of irregular scans

From there I moved back into the NHS, working for a Commissioning Support Unit called Arden & GEM (Greater East Midlands) as Head of Advanced Analytics, where we developed population health methodologies and models. During this time I built multiple NLP tools and undertook process mining, improvement analysis and modelling.

After that I worked for CoreLogic, a predictive property intelligence company, as Lead for Data & Analysis, where we built Flask APIs and predictive regression models to estimate property prices from millions of housing records.

This was followed by work for an R-focused company called Ascent, where I undertook multiple projects: developing an RStudio managed service, building an R fraud prediction package called FraudsteR, and developing a travel-time analysis optimisation for a customer.

Presently, I work for a company called Crisp Thinking, where I head up the machine learning function. This involves extensive Python development, fine-tuning transformer architectures, natural language processing and understanding, and enriching the platform with ML signals that are appended to highlight hate, abuse, spam, risk, offensive content, endangerment and other CRISP risk types.

I have expertise and interest in the following areas:

  1. R
  2. Python
  3. Statistics
  4. Object-Oriented Programming, e.g. C# and VB.Net
  5. Web Development (JavaScript, PHP, HTML and CSS)
  6. VBA for Excel and Access
  7. Machine Learning
  8. Deep Learning
  9. Cloud technologies such as GCP and Microsoft Azure
  10. Packages such as H2O.ai, Keras, MXNet, caret, scikit-learn, OpenCV and TensorFlow (accredited TensorFlow Developer)

I enjoy anything technical and have worked on a number of interesting projects in my time, deploying the right analytical technique for each job.

I love blogging, teaching and learning new data science skill sets. I have provided paid and free R and Python consulting for top companies, and have led ML-in-production sessions and training.