We offer a range of AI and machine learning services, including natural language processing, computer vision, data mining and data forecasting. Our team have the skills to create a bespoke solution tailored to your needs.
Our AI services allow your business to increase efficiency, save time, extract insights from your data and streamline your operations.
In our work with Microbira, we scaled their bespoke machine learning system. We took Microbira’s offline scientific toolkit, which only one user at a time could use to process data, and turned it into a multi-user, scalable web application. Microbira’s system uses a mass infrared spectrometer to capture a profile of an unknown microorganism, which a machine learning algorithm then processes to identify similarities with other, known organisms. Our team used the serverless processing capability of Azure Functions to ensure that the system copes with variable demand.
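To illustrate the serverless pattern, here is a minimal sketch of an HTTP-triggered Azure Function wrapping a classification step, which the platform scales out automatically under load. The route name and the classify_spectrum helper are hypothetical, for illustration only; this is not Microbira’s actual code.

```python
# Minimal sketch of an HTTP-triggered Azure Function (Python v2 programming
# model). The route and classify_spectrum are hypothetical, not Microbira's
# actual code; Azure adds or removes instances as demand varies.
import json

import azure.functions as func

app = func.FunctionApp()

def classify_spectrum(profile):
    # Placeholder: a real system would load a trained model and rank known
    # organisms by similarity to the supplied spectrum.
    return [{"organism": "example", "score": 0.0}]

@app.route(route="identify", methods=["POST"])
def identify(req: func.HttpRequest) -> func.HttpResponse:
    """Accept a spectral profile as JSON and return candidate matches."""
    profile = req.get_json()
    matches = classify_spectrum(profile)
    return func.HttpResponse(json.dumps(matches), mimetype="application/json")
```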
We also sponsor the University of Oxford’s Reuben College Fellowship in Artificial Intelligence and Machine Learning. The fellowship promotes collaboration between System C and Reuben College and helps our staff understand emerging research, encouraging them to consider which elements to apply to both our commercial services and our products. David Clifton spoke at our AI conference earlier this year; you can view the videos below.
We have a wide range of experience in building commercial-quality, AI-based products.
We created a social media data analysis tool to improve the Emergency Services’ situational awareness during floods and riots. We focused particularly on using the Twitter platform’s search API to access datasets containing significant quantities of emergency-related tweets. We built a machine-learning algorithm to analyse the text of these datasets; in the case of flooding, it identified tweets whose word-frequency patterns were characteristic of that emergency. We trained the algorithm against a labelled dataset of tweets and assessed its accuracy against a separate labelled dataset. Following training, it proved capable of mining a voluminous stream of flood-related Twitter posts, transforming them into a valuable source of information for First Responders.
The flood case study above used natural language processing to evaluate genuine social media posts; when we applied our model to Twitter posts during the London Riots, we were also able to train the algorithm to identify reports that were not real. This was important because, during that event, certain social media users incorrectly claimed that the London Eye was on fire. Our AI model assessed the tweets around this ‘incident’ and determined that it was not genuine, primarily by analysing the consistency of the tweets and the networks of the people who posted them. For this project we selected a Naïve Bayes classifier, an algorithm used extensively in the field of text analysis.
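For readers curious about the train-then-evaluate workflow described above, here is a small sketch of a Naïve Bayes text classifier built with scikit-learn. The library choice, the toy tweets and the labels are ours, for illustration only; the original project’s implementation may have differed.

```python
# Sketch of a Naive Bayes text classifier: train on one labelled set,
# assess accuracy on a separate held-out labelled set.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical labelled tweets: 1 = flood-related, 0 = not.
tweets = [
    "river burst its banks near the bridge",
    "lovely sunny day in the park",
    "our street is completely under water",
    "match postponed, pitch is waterlogged",
    "flood warning issued for the high street",
    "great concert at the arena tonight",
]
labels = [1, 0, 1, 0, 1, 0]

# Hold out a separate labelled set to assess accuracy, as described above.
X_train, X_test, y_train, y_test = train_test_split(
    tweets, labels, test_size=0.5, random_state=0, stratify=labels)

# Word-frequency features feeding a Naive Bayes classifier.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```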
Mauro is an open-source platform that enables clinicians to store and manage descriptions of their healthcare data, along with the relationships between datasets. We have worked with several clients to build different elements of functionality into Mauro, including a tool that counts the metadata fields populated for a particular dataset and assigns a score ranking that coverage. This ‘medallion quality system’ is used by HDR UK, the UK’s national institute for health data science, and underpinned further development in the form of a dashboard that highlights areas where quality scored poorly.
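The idea behind a medallion-style score can be sketched in a few lines: count the populated metadata fields and map the coverage to a rank. The field list, thresholds and medal names below are hypothetical, not HDR UK’s actual scheme.

```python
# Illustrative sketch of a medallion-style metadata quality score. The
# field list and thresholds are hypothetical, not HDR UK's actual scheme.
EXPECTED_FIELDS = ["title", "description", "publisher", "licence",
                   "coverage", "provenance", "access_conditions"]

def medallion_score(metadata: dict) -> tuple[int, str]:
    """Count populated metadata fields and map coverage to a medal."""
    populated = sum(1 for field in EXPECTED_FIELDS if metadata.get(field))
    coverage = populated / len(EXPECTED_FIELDS)
    if coverage >= 0.9:
        medal = "gold"
    elif coverage >= 0.6:
        medal = "silver"
    else:
        medal = "bronze"
    return populated, medal

print(medallion_score({"title": "GP records", "description": "..."}))
# -> (2, 'bronze')
```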
We built an application, paired with an ultrasound probe, that captures short ultrasound video clips of the foetal brain. The videos are then analysed by AI to calculate the gestational age of the foetus. Although the AI for this project was developed by University of Oxford researchers, it was particularly important for our team to understand the machine-vision technique the AI was built on, a convolutional neural network, in order to implement the algorithm within the application.
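As a rough illustration of that technique, the sketch below defines a minimal convolutional network in PyTorch that regresses a number (here, gestational age in weeks) from a single greyscale frame. The architecture is ours and purely illustrative; it is not the Oxford researchers’ model.

```python
# Minimal convolutional network sketch (PyTorch); illustrative only, not
# the Oxford researchers' model.
import torch
import torch.nn as nn

class GestationalAgeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(          # stacked conv/pool layers
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)            # regress age in weeks

    def forward(self, frame):                   # frame: (N, 1, H, W) greyscale
        x = self.features(frame).flatten(1)
        return self.head(x)

model = GestationalAgeCNN()
frame = torch.randn(1, 1, 128, 128)             # one dummy ultrasound frame
print(model(frame).shape)                       # -> torch.Size([1, 1])
```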
We have also used text analysis in our MarketPlace product, sold to local government. This is a domain-trained search engine that enables users to locate relevant social care service providers within their Local Authority. Its search tool is underpinned by a powerful algorithm trained to understand the nuances of specific terminology, enabling it to distinguish between search terms such as ‘home care’ and ‘care home’ and provide the user with highly relevant results.
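The ‘home care’ versus ‘care home’ distinction comes down to word order, which plain bag-of-words features cannot capture. The scikit-learn sketch below shows how bigram features separate the two phrases; it illustrates the general point rather than the MarketPlace implementation.

```python
# Why word order matters: unigram features cannot tell these two queries
# apart, bigram features can. Illustrative only.
from sklearn.feature_extraction.text import CountVectorizer

queries = ["home care", "care home"]

unigrams = CountVectorizer(ngram_range=(1, 1))
print(unigrams.fit_transform(queries).toarray())
# [[1 1]
#  [1 1]]  -> identical vectors: the queries look the same

bigrams = CountVectorizer(ngram_range=(2, 2))
print(bigrams.fit_transform(queries).toarray())
# [[0 1]
#  [1 0]]  -> distinct vectors: word order is preserved
```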
We offer both data mining and data forecasting services.
Firstly, we have skills in data mining: we can analyse your data and extract important insights from it, allowing your organisation to work more efficiently. An example of our data mining work is shown below in ContrOCC Insights.
Secondly, our team can use data forecasting to identify trends and patterns in your data and project them forward, allowing your organisation to make effective, data-driven predictions. An example of our data forecasting work is shown in our social care product ContrOCC Insights.
At the core of ContrOCC Insights is ContrOCC data which, when modelled in Power BI, enables detailed comparison of costs, identification of trends and flagging of unusual items for further investigation.
ContrOCC Insights leverages Microsoft Analytics to provide users with forecasting capabilities (allowing LAs to plan for increases in certain types of care), while a map-based visualisation enables them to view nursing homes within or outside an LA’s boundaries. Users can then use this information to determine where they can move clients, and they can also use embedded Microsoft features to establish how long it would take family members to visit, helping LAs find suitable options. With Power BI, users can also save reporting views, enabling them to easily configure the user interface for future reports and visualisations.
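ContrOCC Insights performs its forecasting inside Power BI, but the general idea can be sketched in Python with exponential smoothing. The quarterly spend figures below are invented purely to show the shape of the calculation.

```python
# Illustrative only: trend forecasting of care spend with Holt's
# exponential smoothing (statsmodels). Figures are invented; ContrOCC
# Insights itself does this inside Power BI.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical quarterly spend (GBP thousands) on one type of care.
spend = pd.Series(
    [410, 425, 433, 450, 462, 480],
    index=pd.period_range("2023Q1", periods=6, freq="Q"),
)

# Fit a trend model and project the next four quarters.
model = ExponentialSmoothing(spend, trend="add").fit()
print(model.forecast(4))
```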
Fill in our contact form if you would like more information or would like to speak to our team.
An introduction to the Oxford Computer Consultants Fellow for AI and Machine Learning – An overview of AI and Professor David Clifton’s research.
John Boyle and Professor David Clifton discuss the impact of AI and machine learning in health and social care. They consider the specific challenges around soft outcomes and the use of synthetic data.
Kaz Librowski and David Clifton discuss the importance of large data sets; the challenges ahead for AI and the privacy problems around handling personal data.
OCC’s John Boyle and Reynold Greenlaw discuss turning AI research into commercial applications. They review how OCC has helped emerging and established companies harness AI techniques.