Insights Into OpenAI’s New Faster Model: GPT-4o

by Rahil M

OpenAI has strengthened its position as a leader in artificial intelligence with the introduction of GPT-4o.

OpenAI has announced GPT-4o, a new flagship model that can reason across audio, vision, and text in real time. This multimodal model is the company’s fastest.

Microsoft has also announced the launch of GPT-4o on its Azure AI platform, describing the model as setting a new standard for generative and conversational AI and providing a richer, more engaging user experience.

GPT-4o is a step toward a more natural human-computer experience and interaction, and the new model is likely to attract more users to OpenAI’s platform. It is an updated version of the large language model technology that powers ChatGPT. OpenAI’s Chief Executive, Sam Altman, tweeted that the company had been working hard on some new things that it believes people will love.

What is GPT-4o? 

The “o” in GPT-4o stands for “omni”, meaning “all” or “everything”, a nod to the company’s claim that the new model has something for everyone.

In pursuit of more natural human-computer interaction, the model accepts any combination of text, audio, image, and video as input and generates any combination of text, audio, and image as output. It is OpenAI’s fastest model to date.

GPT-4o is also better at vision and audio understanding than existing models.
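
To make this concrete, here is a minimal sketch of a multimodal request through the OpenAI Python client, sending text together with an image and getting text back. The image URL is a placeholder, and the snippet assumes an API key is configured in the environment.

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

# The image URL below is purely illustrative.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image in one sentence."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)

# The model's text reply
print(response.choices[0].message.content)
```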

How fast is GPT-4o? 

GPT-4o can respond to audio inputs in as little as 232 milliseconds, with an average of 320 milliseconds, which is similar to human response time in a conversation. It also handles text in non-English languages significantly better and is 50% cheaper in the API. While matching GPT-4 Turbo performance, it is much faster than the previous models.

Tokens are the basic units an AI model uses to measure the length of a text input; a token can be a word fragment, a punctuation mark, a character, or a space. GPT-4o’s new tokenizer needs fewer tokens for the same text in many languages, and the savings differ from one language to another. For example, a sample sentence drops from 53 to 26 tokens in Arabic, from 145 to 33 in Gujarati, from 34 to 24 in Chinese, from 29 to 26 in Spanish, and from 27 to 24 in English.
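
These shrinking token counts can be checked with OpenAI’s open-source tiktoken library, which exposes both the older cl100k_base encoding used by GPT-4 and the newer o200k_base encoding used by GPT-4o. The sample sentences below are illustrative and will not reproduce the exact figures quoted above.

```python
import tiktoken

# GPT-4 / GPT-4 Turbo use the cl100k_base encoding; GPT-4o introduced o200k_base.
old_enc = tiktoken.get_encoding("cl100k_base")
new_enc = tiktoken.get_encoding("o200k_base")

samples = {
    "English": "Hello, my name is GPT-4o. I am a new kind of language model.",
    "Spanish": "Hola, me llamo GPT-4o. Soy un nuevo tipo de modelo de lenguaje.",
}

for language, text in samples.items():
    print(f"{language}: {len(old_enc.encode(text))} tokens (cl100k_base) "
          f"-> {len(new_enc.encode(text))} tokens (o200k_base)")
```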

Research into human-computer interaction suggests that a response within 100 milliseconds is perceived as instant, and a response of around a second still feels fast enough for users to sense a genuine interaction. A response time of 10 seconds, by contrast, causes users to lose interest and attention.

How does GPT-4o work? 

In OpenAI’s previous models, users could talk to ChatGPT through Voice Mode, with average latencies of 2.8 seconds (GPT-3.5) and 5.4 seconds (GPT-4). To achieve this, Voice Mode ran a pipeline of three separate models: a simple model transcribes the audio into text, GPT-3.5 or GPT-4 takes text in and produces text out, and a third simple model converts that text back into audio, as sketched below.
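
The sketch below approximates that three-model pipeline using OpenAI’s public API. The specific models chosen here (whisper-1, gpt-4-turbo, and tts-1) are illustrative stand-ins rather than the exact components of the original Voice Mode, and the code assumes an API key is set in the environment.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def voice_mode_pipeline(audio_path: str) -> bytes:
    # Model 1: transcribe the spoken audio to text (tone, emotion and speakers are lost here).
    with open(audio_path, "rb") as audio_file:
        transcript = client.audio.transcriptions.create(model="whisper-1", file=audio_file)

    # Model 2: a text-only chat model reads the transcript and writes a text reply.
    reply = client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[{"role": "user", "content": transcript.text}],
    )

    # Model 3: a text-to-speech model converts the reply back into audio.
    speech = client.audio.speech.create(
        model="tts-1",
        voice="alloy",
        input=reply.choices[0].message.content,
    )
    return speech.content  # raw audio bytes to play back to the user
```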

In that pipeline, GPT-4 is the main source of intelligence, but it cannot directly observe tone, multiple speakers, or background noise, so a lot of information is lost. Nor can it produce laughter, singing, or any expression of emotion as output.

With GPT-4o, OpenAI has merged all of these functions into a single model with end-to-end capabilities across text, vision, and audio, reducing both the time taken and the information lost in processing.

GPT-4o matches GPT-4 Turbo-level performance on text, reasoning, and coding intelligence, while setting new high-water marks on multilingual, audio, and vision capabilities as measured on traditional benchmarks.

Safety and limitations of GPT-4o  

GPT-4o has safety designed in across all modalities, through techniques such as filtering training data and refining the model’s behaviour after training, and new safety systems have been created to guardrail the outputs. Evaluations show the model does not exceed medium risk in any category. These assessments combined automated and human evaluation throughout training, testing both pre-mitigation and post-mitigation versions of the model.

GPT-4o has also undergone extensive external review with experts in areas such as social psychology, bias, and misinformation to identify new or heightened risks. These learnings have been used to make interacting with GPT-4o safer, and newly discovered risks will be mitigated as they emerge.

How much does GPT-4o cost? 

In August, OpenAI launched its ChatGPT Enterprise monthly plan, with pricing that varies by customer requirements. In January, it launched its online GPT Store, which gives users access to 3 million custom versions of GPTs.

On the brighter side, the new GPT-4o model is free for all users, and paid users get up to five times the capacity limits of free users, said OpenAI’s Chief Technology Officer, Mira Murati.
