The ChatGPT transformation of the car has been kicked off, with Mercedes-Benz taking the lead.
Not long ago, Mercedes-Benz integrated ChatGPT into the car, opening a three-month test, which showed that its voice assistant could not only complete simple commands, but also carry out continuous multi-round conversations, with a large improvement in comprehension and response quality.
Ideal Auto, Skyworth, NIO and other automakers then rushed to follow, using cutting-edge GPT capabilities to push in-car intelligence to the next level. The head unit has evolved from the early "radio" into a feature-rich intelligent terminal, and with a GPT "brain" it is starting to change from a mute, clunky machine into a driving partner.
And human-vehicle interaction is not the end point of AI in the car; autonomous driving is the future. The continuing evolution of AI models has shown car companies the opportunity.
Letting AI actively perceive and make decisions, and giving up the reliance on high-precision maps, is becoming a mainstream trend. A few days ago, Ideal Auto opened the internal beta of City NOA (navigation-assisted driving), which uses a BEV (bird's-eye view) large model as its main solution, letting the car drive with "brain circuits" that imitate a human's. Through continuous learning, City NOA can also be trained to "chauffeur" users along their commuting routes.
Taking over from the Internet, AI is transforming the car at a deeper level, and the big guy on four wheels is looking more and more like a Transformer.
Head unit + GPT: Mercedes-Benz goes first
A "metamorphosis" from the inside out is sweeping the automotive world: from traditional fuel power to new energy, from driving tool to intelligent product. Over the years, technology has kept reshaping the car's exterior and interior, and after the Internet transformed the car, it is now artificial intelligence's turn.
Mercedes-Benz is leading the way in this new wave by transplanting ChatGPT into the car.
On June 16, Mercedes-Benz's three-month ChatGPT beta program launched in the U.S. The company has partnered with Microsoft to integrate ChatGPT into the head unit through the Azure OpenAI Service. Owners can opt in through the Mercedes me app, or in a more intuitive way: say the voice command "Hey Mercedes, I want to join the beta program" in the car, and Mercedes' MBUX infotainment system will automatically connect the "Hey Mercedes" voice assistant to ChatGPT.
In the past, Hey Mercedes could provide information such as sports scores and weather, answer questions about the vehicle's surroundings, and control the user's smart home, but all of it was scripted and standardized. That is exactly where ChatGPT excels.

Currently, only about 900,000 MBUX-equipped Mercedes-Benz vehicles in the U.S. can test ChatGPT on a priority basis, and Mercedes-Benz intends to use this initial testing period to gain insight into requests from users to determine future development priorities and adjust launch strategies for different markets and languages.
On integrating ChatGPT, Mercedes-Benz gave an emotional statement: "Everything is aimed at redefining your relationship with Mercedes." Mercedes-Benz wants ChatGPT to reshape the human-vehicle interaction experience; the analogy is that the head unit "comes to life", going from a mute, function-oriented machine to an in-car life companion.
After Mercedes-Benz, Chinese automakers were quick to follow suit.
On June 19, Ideal Auto launched its self-developed cognitive large model "Mind GPT", built by Ideal's spatial algorithm team. Training of the model reportedly began well before ChatGPT was released. Built on 10 TB of raw training data, Mind GPT uses 1.3 trillion tokens for base-model training. It can recognize voiceprints and speech content as well as understand dialects, provide travel planning for owners, and even offers functions such as AI drawing and AI calculation.
Ideal Auto revealed that after Mind GPT's release, its cars will gain a new LUI (language user interface) interaction: "For example, if you want to eat hot pot, you just call out to the voice assistant, the in-car screen generates pictures of hot pot restaurants for you to choose from, and the travel route is then calculated automatically."
Skyworth Auto also recently announced that two of its models, the Skyworth EV6 II and Skyworth HT-i II, have integrated ChatGPT into their smart head units. In addition, four car companies, Great Wall Motor, NIO, Xpeng Motors and Chery, all applied for GPT-related trademarks last month.
GPT on board has become a trend. According to Junyi Zhang, a managing partner at consultancy Oliver Wyman, integrating GPT technology can enhance a car's human-machine interaction and its ability to handle complex, open-ended queries about the environment. In the future, cars in the same price segment will differ less and less in hardware, and when it is hard to stand out on comfort, safety, power and range, competing on intelligence becomes an inevitable choice.
Intelligent cockpit with a "brain"
ChatGPT is another entry worth recording in the evolutionary history of the automobile. When the most cutting-edge natural language processing models are applied to how humans travel, a richer in-car life experience will emerge.
Look back more than 30 years and both in-car entertainment and in-car computing were blank slates. The first generation of head units was born in the 1980s and 1990s, when buyers still cared mainly about the "big three": engine, chassis and transmission. Suddenly some models could not only pick up the radio but also swallow a cassette tape and play music freely, and the car took on a hint of being a second living space.
The second generation added DVD playback and MP3 support, and alongside the entertainment highlights the head unit also took a step toward the driving experience by adding navigation. Solving the problem of "road blindness" became the mainstream demand. Many veteran drivers will remember that in the era before connected cars, Careland navigation was standard on high-end models, using GPS satellite positioning and on-board map packages to achieve reasonably accurate navigation.
However, beyond navigation, music and radio, people at the time did not expect much from the head unit, and it was rarely the deciding factor in whether to buy a car.
In the 21st century, electronics and digital technology kept advancing, and the cell phone changed form first. Following the same evolutionary logic, large screens appeared on head units and intelligence became a new selling point. Head units based on Linux, WinCE, Android and other systems were adopted by manufacturers, and cars gained not only free real-time navigation but also features such as panoramic vision systems, driving-assistance systems and 360-degree surround-view imaging.
Once the car was connected to the network, everything changed again. Online video, road books, voice control, maintenance booking, remote diagnostics and other functions were added to the head unit. The screen in the center console kept getting bigger and its functions kept multiplying; some manufacturers installed displays larger than a tablet in the cab, and recently some have even pushed toward a "full screen" cabin, fitting screens for the front passenger and the rear seats as well.
Eventually, the concept of the "third screen" became increasingly prominent: OEMs hope the car can become the third generation of intelligent terminal shaping human life, after the computer and the cell phone. Using a technology-rich head unit to occupy users' minds and open up new business models has become the direction car companies are now attacking.
Now the original notion of the "head unit" is gradually being replaced by the "intelligent cockpit". Not only are cars getting smarter, but car companies have started competing on interior materials, audio systems and lighting; NIO has even released AR glasses that support watching movies in the car, and the Ideal L9 comes with a rear refrigerator, turning the car into a movable house.
But whether in the head-unit era or the intelligent-cockpit era, voice conversation has been a relatively lagging feature, even though, for driving safety, voice control is especially necessary.
Over the past decade, almost every car company and a large number of AI startups have invested heavily in natural language processing, hoping to optimize the in-car voice interaction experience. Many systems can answer simple pre-set commands, such as turning up the temperature or reporting the weather. Upgrades and innovations have revolved around broadening the range of natural-language commands, for example turning on the air conditioning or lowering the temperature when the user says "it's a little hot".
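To make the limitation concrete, here is a minimal, purely illustrative sketch (not any automaker's actual code) of the keyword-table approach those pre-GPT assistants relied on: every paraphrase must be hand-written into the table, and anything outside it simply fails to match.

```python
# Illustrative sketch of a pre-set command matcher. The intent table and
# parameter names are hypothetical; real systems use larger grammars, but
# the brittleness shown here is the same.

PRESET_INTENTS = {
    "turn up the temperature": ("hvac", {"delta_c": +1}),
    "it's a little hot": ("hvac", {"delta_c": -1}),  # each paraphrase added by hand
    "what's the weather": ("weather", {}),
}

def match_intent(utterance: str):
    """Return (domain, params) for a known phrase, else None."""
    key = utterance.lower().strip(" .!?")
    return PRESET_INTENTS.get(key)

print(match_intent("It's a little hot"))     # matches the preset phrase
print(match_intent("I'm roasting in here"))  # unseen paraphrase -> None
```

An unseen paraphrase like "I'm roasting in here" falls straight through the table, which is exactly the gap a language model is meant to close.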
But making the car understand more "human" speech, such as planning routes spoken in various dialects or finding restaurants, often worked no better than the owner simply using a phone's map and review apps. Richer voice-based human-vehicle interaction hit a bottleneck, until ChatGPT appeared.
Large language model products (ChatGPT, ERNIE Bot, Tongyi Qianwen, etc.) opened directly to consumers have shown intelligent-cockpit developers the light. Their powerful comprehension and logical reasoning are expected to turn the head unit into a driving assistant with hidden business potential.
For example, the owner could tell the voice assistant: "Find me a hot pot restaurant near my destination with a group-buying discount and a rating above 4.5; there will be five of us, book me a table, and then see where it's convenient to park." In the past, the head unit absolutely could not digest that much information at once, but for ChatGPT this is basic stuff; as long as there are enough real-time data sources, the range of needs that can be met is nearly unlimited.
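One way to see why a language model can handle this is that the single utterance decomposes into structured "slots" that downstream map and reservation services can act on. The sketch below shows a hypothetical target schema and the instance a capable model would be expected to extract; the class and field names are illustrative, not any vendor's API.

```python
# Hypothetical slot schema for the hot-pot request above. An LLM's job is
# to fill this structure from free-form speech; the downstream services
# (search, booking, parking) then consume the fields.
from dataclasses import dataclass, field

@dataclass
class RestaurantRequest:
    cuisine: str
    near: str                # anchor location, e.g. the navigation destination
    min_rating: float
    party_size: int
    wants_group_deal: bool
    needs_parking: bool
    actions: list = field(default_factory=list)  # ordered downstream steps

# What a capable model should produce for the example utterance:
parsed = RestaurantRequest(
    cuisine="hot pot",
    near="destination",
    min_rating=4.5,
    party_size=5,
    wants_group_deal=True,
    needs_parking=True,
    actions=["search", "reserve", "find_parking"],
)
```

The keyword-table approach has no way to fill six interdependent slots from one sentence; extracting this structure reliably is precisely the comprehension step that large models contribute.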
The addition of GPT does not just make conversation smoother; it gives the car a "brain" that can not only answer questions but also understand needs and generate responses. How high the IQ and how fast the response depend on the automaker's ability to train a large model for the car, and on its willingness to spend heavily on more powerful hardware (chips).
How does AI make self-driving "think" more like a human?
The richness of in-car life is gradually turning the car into a carrier full of warmth; it is no longer a boring, cold means of transportation but a comfortable living space.
The AI-led evolution of the car is not only about GPT; the push it gives to self-driving technology matters even more.
The traditional approach to autonomous-driving R&D tries to cover every possible driving scenario by collecting large-scale driving data and accumulating test mileage, so that the car has a predetermined response for any unexpected situation. But the complexity of the unexpected is, by nature, unpredictable, and once the system lacks a plan for a particular situation, driving safety is seriously threatened.
This is why current assisted driving systems must require the driver to hold the steering wheel in order to deal with real-time emergencies. The learning capability of AI will potentially change this.
Not long ago, a research team at Tsinghua University proposed a "trustworthy continuous evolution" technology for self-driving, which dynamically evaluates how trustworthy the AI's learning is during training. It ensures that when a self-driving car encounters new, unfamiliar scenarios it can start from basic active avoidance and continuously improve its driving ability, achieving better performance while guaranteeing safety.
Put simply, with AI, a self-driving-capable car can actively learn and familiarize itself with newly encountered scenarios and keep evolving; as mileage and data accumulate, performance keeps improving.
Ideal Auto is applying large AI models to autonomous driving. On June 17, Ideal announced the opening of its urban NOA (navigation-assisted driving) internal beta, and will open a commuter NOA function to users in the second half of the year. Unlike conventional solutions, Ideal uses a BEV (bird's-eye view) large model to perceive and understand road-structure information in the environment in real time, so the car can better imitate the habits of human drivers.
In the past, most assisted-driving systems relied on a high-precision-map solution, which effectively feeds road conditions to the autonomous-driving system in real time for it to act on. But on complex urban roads there are always areas that high-precision maps cannot cover or update in time, which is the solution's major drawback. With a BEV large model, the AI instead actively perceives real-time road conditions and makes driving decisions autonomously.
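The core of the BEV idea is representing everything around the car in a single top-down grid that a planner can reason over. The toy sketch below only illustrates that output form: detections already expressed in the ego vehicle's frame are rasterised into an occupancy grid. Real BEV models learn this projection end-to-end from multi-camera features; the grid size and cell resolution here are arbitrary illustrative choices.

```python
# Toy bird's-eye-view rasterisation: ego-frame points (x metres ahead,
# y metres to the left) are mapped into a top-down occupancy grid.
# Grid spans 40 m ahead and +/-20 m laterally at 2 m resolution.

GRID_SIZE = 20   # 20 x 20 cells
CELL_M = 2.0     # each cell covers 2 m x 2 m

def to_cell(x_m: float, y_m: float):
    """Map an ego-frame point to (row, col) in the BEV grid, or None if outside."""
    row = int(x_m // CELL_M)                              # distance ahead
    col = int((y_m + GRID_SIZE * CELL_M / 2) // CELL_M)   # shift so y=0 is grid centre
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        return row, col
    return None

grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
for x, y in [(6.0, 0.5), (14.0, -3.0)]:   # two detected vehicles
    cell = to_cell(x, y)
    if cell:
        grid[cell[0]][cell[1]] = 1        # mark cell occupied
```

Because the grid is built from live sensing rather than looked up from a stored map, it stays current wherever the sensors can see, which is exactly the trade the BEV approach makes against high-precision maps.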
Of course, BEV has disadvantages too: at wide intersections with heavy traffic, the sensors' field of view is easily blocked, so real-time perception at the vehicle end loses local information. To compensate, Ideal reportedly pairs the BEV model with a Neural Prior Net (NPN) and an end-to-end traffic-signal intent network. The former acts as a stored reference, retrieved whenever a car passes through an intersection the fleet has previously driven; the latter learns from how large numbers of human drivers react to signal changes at intersections, helping the system understand traffic lights.