Inside LinkedIn’s AI overhaul: Job search powered by LLM distillation




The advent of natural language search has encouraged people to change how they look for information, and LinkedIn, which has worked with many AI models over the past year, hopes that shift extends to job hunting.

LinkedIn’s AI-powered job search, now available to all LinkedIn users, uses distilled, fine-tuned models built on the professional social media platform’s knowledge base to narrow down potential job opportunities based on natural language queries.

“This new search experience allows members to describe their goals in their own words and get results that truly reflect what they’re looking for,” Erran Berger, vice president of product development at LinkedIn, told VentureBeat in an email. “It’s the first step in a larger journey to make job hunting more intuitive, inclusive and empowering for everyone.”

LinkedIn previously noted in a blog post that a major frustration for users searching for jobs on the platform was an over-reliance on exact keyword matches. Users often typed in a generic job title and got back positions that didn’t quite fit. From personal experience, if I type “reporter” into LinkedIn, I get search results for journalist jobs at media publications alongside other “reporter” openings that call for a completely different set of skills.

Wenjing Zhang, LinkedIn’s vice president of engineering, told VentureBeat in a separate interview that the company saw a need to improve how people find jobs that truly fit them, and that this starts with better understanding what they are looking for.

“So in the past, when we used keywords, we essentially looked at a keyword and tried to find an exact match. And sometimes the job description might mention ‘reporter’ even when the role isn’t really for a reporter; we would still retrieve that posting, which is not ideal for the candidate,” said Zhang.

LinkedIn has improved how it understands user queries and now lets people go beyond simple keywords. Instead of searching for “software engineer,” they can ask it to “find software engineering jobs in Silicon Valley that were posted recently.”
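
To make that concrete, here is a minimal sketch of what mapping such a natural-language query onto structured search filters could look like. The filter schema and the rule-based `parse_query` stand-in are illustrative assumptions, not LinkedIn’s interface; in a production system a fine-tuned LLM would do this parsing.

```python
# Minimal sketch: turning a natural-language job query into structured filters.
# In production this parsing would be done by a fine-tuned LLM; the regex logic
# below is only a stand-in so the example runs on its own.
from dataclasses import dataclass, field
from typing import List, Optional
import re


@dataclass
class JobSearchFilters:
    """Structured constraints extracted from a free-text query (hypothetical schema)."""
    keywords: List[str] = field(default_factory=list)
    location: Optional[str] = None
    max_age_days: Optional[int] = None  # e.g. "posted recently" -> last 7 days


def parse_query(query: str) -> JobSearchFilters:
    """Stand-in for an LLM-based query-understanding step."""
    filters = JobSearchFilters()
    q = query.lower()
    # Location: naive "in <place>" extraction.
    m = re.search(r"\bin ([a-z ]+?)(?: that| which|$)", q)
    if m:
        filters.location = m.group(1).strip()
    # Recency: map vague phrasing to a concrete window.
    if "recently" in q or "past week" in q:
        filters.max_age_days = 7
    # Role keywords: whatever precedes "jobs"/"work".
    m = re.search(r"find (.+?) (?:jobs|work)", q)
    if m:
        filters.keywords = m.group(1).split()
    return filters


if __name__ == "__main__":
    print(parse_query("find software engineering jobs in silicon valley that were posted recently"))
```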

How they built it

One of the first things LinkedIn had to do was overhaul its search function’s ability to understand queries.

“The first step is that when you type a query, we have to be able to understand it. The next step is retrieving the right kind of information from our job library. And the last step is, now that you have a few hundred candidate jobs, how do you rank them so the most relevant ones appear at the top,” said Zhang.
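
A bare-bones sketch of that three-stage flow (understand the query, retrieve candidates, then rank them) might look like the following. Every component here is a toy stand-in for illustration, not LinkedIn’s actual system; it only shows how the stages chain together.

```python
# Toy sketch of the three-stage flow Zhang describes: understand, retrieve, rank.
from typing import Dict, List


def understand(query: str) -> Dict[str, str]:
    # Stand-in for an LLM-based query-understanding step.
    return {"role": "software engineer", "location": "silicon valley"}


def retrieve(intent: Dict[str, str], jobs: List[Dict], limit: int = 200) -> List[Dict]:
    # Narrow a large corpus of postings down to a few hundred candidates.
    matches = [j for j in jobs if intent["role"] in j["title"].lower()]
    return matches[:limit]


def rank(intent: Dict[str, str], candidates: List[Dict]) -> List[Dict]:
    # Order the shortlist so the most relevant jobs appear at the top.
    def score(job: Dict) -> float:
        return float(intent["location"] in job["location"].lower())
    return sorted(candidates, key=score, reverse=True)


if __name__ == "__main__":
    jobs = [
        {"title": "Software Engineer", "location": "Silicon Valley, CA"},
        {"title": "Software Engineer", "location": "Austin, TX"},
        {"title": "Data Analyst", "location": "New York, NY"},
    ]
    intent = understand("find software engineering jobs in silicon valley posted recently")
    for job in rank(intent, retrieve(intent, jobs)):
        print(job)
```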

LinkedIn had relied on fixed taxonomy-based methods, classification models and older LLMs, which it said “lacked deep semantic understanding.” The company then turned to more modern, fine-tuned large language models (LLMs) to improve its platform’s natural language processing (NLP) capabilities.

But LLMs also come with steep compute costs, so LinkedIn turned to distillation methods to reduce its reliance on expensive GPUs. It split the workload into two stages: one to handle data and information retrieval, the other to rank the results. By using a teacher model to grade both the query and the job posting, LinkedIn said it was able to keep the retrieval and ranking models aligned.
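
LinkedIn has not published its training code, but a common way to implement this kind of distillation is to have the large teacher model score (query, job) pairs and then train a much smaller student model to reproduce those scores. The PyTorch sketch below is a generic illustration of that pattern under those assumptions; the architectures, sizes and loss are not LinkedIn’s.

```python
# Generic teacher-student distillation sketch for query-job relevance scoring.
# A large "teacher" scores (query, job) pairs; a small "student" (cheap enough to
# run at serving time) is trained to reproduce the teacher's scores.
import torch
import torch.nn as nn

EMB = 128  # toy embedding size; a real system would use pretrained text encoders


class Scorer(nn.Module):
    def __init__(self, hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * EMB, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, query_emb: torch.Tensor, job_emb: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([query_emb, job_emb], dim=-1)).squeeze(-1)


teacher = Scorer(hidden=1024)   # stands in for a large, expensive grading model
student = Scorer(hidden=64)     # distilled model cheap enough for retrieval/ranking
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(100):
    # Toy batch of (query, job) embedding pairs; real data would come from search logs.
    q = torch.randn(32, EMB)
    j = torch.randn(32, EMB)
    with torch.no_grad():
        target = teacher(q, j)          # the teacher's relevance score is the label
    loss = loss_fn(student(q, j), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```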

The method also let LinkedIn’s engineers cut down the number of steps in its job search system. At one point, “there were nine different stages that formed the pipeline to search and match a job,” many of which duplicated work.

“To do this, we use a common multi-objective optimization (MOO) technique. To ensure that retrieval and ranking are aligned, it is important that retrieval scores documents using the same MOO that the ranking phase uses. The goal is to keep retrieval simple, without introducing an unnecessary burden on AI developer productivity,” LinkedIn said.
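
One straightforward reading of that quote is a single multi-objective score, a weighted blend of several objectives, used by both the retrieval and ranking stages, so the candidates that survive retrieval are the ones the ranker would also favor. The sketch below illustrates that idea; the objective names and weights are hypothetical, not LinkedIn’s.

```python
# Sketch of a shared multi-objective optimization (MOO) score: retrieval and ranking
# both use the same weighted blend of objectives, so the documents that survive
# retrieval are the ones the ranker would also favor. Objectives/weights are hypothetical.
from typing import Dict, List

MOO_WEIGHTS = {"relevance": 0.6, "apply_likelihood": 0.3, "freshness": 0.1}


def moo_score(objectives: Dict[str, float]) -> float:
    """Single scoring function shared by the retrieval and ranking stages."""
    return sum(MOO_WEIGHTS[name] * value for name, value in objectives.items())


def retrieve(jobs: List[Dict], top_k: int = 200) -> List[Dict]:
    # Cheap, approximate objective estimates keep retrieval simple.
    return sorted(jobs, key=lambda j: moo_score(j["cheap_estimates"]), reverse=True)[:top_k]


def rank(candidates: List[Dict]) -> List[Dict]:
    # The ranker recomputes the same blend with more precise estimates.
    return sorted(candidates, key=lambda j: moo_score(j["precise_estimates"]), reverse=True)


if __name__ == "__main__":
    jobs = [
        {"id": 1,
         "cheap_estimates": {"relevance": 0.8, "apply_likelihood": 0.5, "freshness": 0.9},
         "precise_estimates": {"relevance": 0.85, "apply_likelihood": 0.4, "freshness": 0.9}},
        {"id": 2,
         "cheap_estimates": {"relevance": 0.6, "apply_likelihood": 0.9, "freshness": 0.2},
         "precise_estimates": {"relevance": 0.55, "apply_likelihood": 0.95, "freshness": 0.2}},
    ]
    print([j["id"] for j in rank(retrieve(jobs))])
```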

LinkedIn has also developed a query engine that generates personalized suggestions for users.

LinkedIn is not alone in seeing the potential of LLM-based enterprise search. Google has claimed that 2025 will be the year enterprise search becomes more powerful, thanks to advanced models.

Models like Cohere’s Rerank 3.5 help break down language silos within companies. The various “deep research” products from OpenAI, Google and Anthropic point to growing organizational demand for agents that can access and analyze internal data sources.

LinkedIn has rolled out several AI-based features over the past year. In October, it launched an AI assistant to help recruiters find the best candidates.

Deepak Agarwal, LinkedIn’s chief AI officer, will discuss the company’s AI initiatives, including how it took its hiring assistant from prototype to production, during VB Transform in San Francisco this month. Register now to attend.



