[ { "input": "FULL-DUPLEX COMMUNICATION METHODS AND APPARATUS. Techniques are described for a full-duplex communication method and apparatus for vehicle-to-vehicle (V2V) communication. A communication apparatus includes one or more transmit antennas, one or more receive antennas, and a processor. For cases where a single transmit antenna and multiple receive antennas are used, the distance between the transmit and receive antennas is greater than a pre-determined value. Further, the transmit antenna is located on or in a central region of a top surface of the vehicle, and the receive antennas are evenly distributed on the vehicle. The processor is configured to generate one or more messages to be transmitted via the transmit antenna, where the one or more messages include vehicle condition information, operational information about a driver of the vehicle, or information associated with one or more sensors of the vehicle.", "output": "The communication apparatus has a transmit antenna (104) and receive antennas (106a-106d) that are located on or in a first side and a second side of the vehicle (102). A processor generates one or more messages to be transmitted via the transmit antenna. The messages include vehicle condition information, operational information about the driver of the vehicle, or information associated with one or more sensors of the vehicle. An INDEPENDENT CLAIM is included for a wireless communication method. Communication apparatus for full-duplex vehicle-to-vehicle (V2V) or device-to-device (D2D) communication. Assists the driver as the vehicle transmits information to, or receives information from, surrounding vehicles, and also assists the vehicle in operating in an autonomous driving mode. The drawing is a schematic view of the communication apparatus for full-duplex vehicle-to-vehicle (V2V) or device-to-device (D2D) communication. 
102 Vehicle; 104 Transmit antenna; 106a-106d Receive antennas", "instruction": "Please summarize the input" }, { "input": "Method, system and vehicle for overtaking control of an automatic driving vehicle on icy and snowy roads. The invention claims an overtaking control method, system and vehicle for an automatic driving vehicle on icy and snowy roads. The method first detects whether surrounding vehicles have an overtaking intention; it then detects whether the vehicles at the front and rear positions on the borrowed lane are non-automatic driving vehicles or automatic driving vehicles, whether a non-automatic or automatic driving vehicle is located on the planned overtaking route, and whether the distances to the front and rear vehicles on the borrowed lane are safe, and then sends a signal to the surrounding vehicles. After the overtaking vehicle executes the first lane change, the method judges whether the detected road is an ice-film road; if not, the overtaking vehicle changes speed, passes the front vehicle in the original lane, and performs the second lane change to finish the overtaking, sending an overtaking-complete signal to the surrounding vehicles, which then resume their original driving states. By cooperatively controlling the automatic driving vehicle and the surrounding automatic driving vehicles, the invention reduces the uncertainty introduced by non-automatic driving vehicles during overtaking and improves overtaking safety on icy and snowy roads.|1. 
An overtaking control method for an automatic driving vehicle on icy and snowy roads, comprising the following steps: first comparing the speed of the overtaking vehicle with that of the front vehicle and, if it is greater than the front vehicle speed, proceeding with overtaking and then detecting whether surrounding vehicles have an overtaking intention; if so, the overtaking vehicle decelerates or keeps its original driving state; if not, detecting whether the vehicles at the front and rear positions on the borrowed lane are non-automatic driving vehicles or automatic driving vehicles, whether a non-automatic or automatic driving vehicle is located on the planned overtaking route, and whether the distances to the front and rear vehicles on the borrowed lane are safe. This includes the following three cases. First case: if the vehicles at the front and rear positions on the borrowed lane are non-automatic driving vehicles, the overtaking vehicle sends a first signal to prompt the non-automatic driving vehicles that it intends to overtake and, after detecting that the speeds of the front and rear vehicles do not change or decrease, executes the first lane change. Second case: if a non-automatic driving vehicle and an automatic driving vehicle are located at the front and rear positions respectively, the method preferentially sends a second signal to prompt the automatic driving vehicle to change speed, and the overtaking vehicle executes the first lane change. Third case: if automatic driving vehicles are located at both the front and rear positions, the method detects whether the road surface at the front vehicle and the rear vehicle is an ice-film road surface, preferentially sends the second signal to prompt the automatic driving vehicle on the non-ice-film road surface to change speed, and the overtaking vehicle executes the first lane change. In all three cases, when the first or second signal is sent, the overtaking vehicle also sends a third signal to the automatic driving vehicle ahead of the front-vehicle position or behind the rear-vehicle position; after receiving the third signal, that vehicle detects its current position: if it is behind the rear-vehicle position it decelerates slowly and sends a speed-change warning to the vehicles around it, and if it is ahead of the front-vehicle position it accelerates slowly and sends a speed-change warning to the surrounding vehicles. After the overtaking vehicle executes the first lane change, the method judges whether the detected road is an ice-film road; if so, the overtaking vehicle resumes its original driving state; if not, it detects whether the states of the surrounding vehicles have changed, and if there is no change, the overtaking vehicle changes speed, passes the front vehicle in the original lane, and performs the second lane change to finish the overtaking, sending an overtaking-complete signal to the surrounding vehicles, which then resume their original driving states.\n | 2. 
The overtaking control method for an automatic driving vehicle on icy and snowy roads according to claim 1, wherein the icy road surface is judged as follows: the camera of the automatic driving vehicle is combined with its sensors for the judgment; the distances to the surrounding vehicles are measured by combining the laser radar, the millimeter-wave radar and the camera; and the ground adhesion coefficient is judged by combining the road passing state, camera sensing and sensor measurement.\n | 3. The overtaking control method for an automatic driving vehicle on icy and snowy roads according to claim 1, wherein the safe distance between the vehicle and the surrounding vehicles is calculated by a braking-distance formula in which t1 and t2 are the braking times, S is the distance traveled during braking, v is the driving speed, g is the gravitational acceleration, μ is the road adhesion coefficient, and s0 is the remaining distance to the front vehicle after braking. When the automatic driving vehicle is to perform the overtaking operation, the following conditions with respect to the vehicles on the borrowed lane must be satisfied: with respect to the rear vehicle on the borrowed lane, ΔS1 ≥ S and v ≥ v1; with respect to the front vehicle on the borrowed lane, ΔS2 ≥ S and v ≤ v2; wherein ΔS1 is the horizontal distance to the rear vehicle on the borrowed lane, v1 is the speed of the rear vehicle on the borrowed lane, ΔS2 is the horizontal distance to the front vehicle on the borrowed lane, and v2 is the speed of the front vehicle on the borrowed lane.\n | 4. The overtaking control method for an automatic driving vehicle on icy and snowy roads according to claim 1, wherein the borrowed lane can be the adjacent left or adjacent right lane.\n | 5. 
The overtaking control method for an automatic driving vehicle on icy and snowy roads according to claim 1, wherein the first signal provides lane-changing information to the surrounding vehicles in the same way as the lane-changing process of a non-automatic driving vehicle.\n | 6. The overtaking control method for an automatic driving vehicle on icy and snowy roads according to claim 1, wherein the second signal is the vehicle information exchanged between the overtaking vehicle and the automatic driving vehicle at the front or rear position on the borrowed lane, including vehicle position, speed, steering speed and steering time.\n | 7. The overtaking control method for an automatic driving vehicle on icy and snowy roads according to claim 1, wherein the third signal is the vehicle information exchanged between the overtaking vehicle and the automatic driving vehicle ahead of the front-vehicle position or behind the rear-vehicle position, including position, the speed of the overtaking vehicle, the speed at the front or rear vehicle position, and steering time.\n | 8. An overtaking control system for an automatic driving vehicle on icy and snowy roads, comprising a vehicle controller, a V2V communication unit and a combined instrument cluster; the vehicle controller is adapted to collect the position information, speed information and steering information of the vehicle; the V2V communication unit is adapted to transmit the position information, speed information and steering information; the vehicle controller is adapted to generate an overtaking intention signal according to the position information, speed information and steering information of each vehicle; and the combined instrument cluster is adapted to display the corresponding overtaking information according to the overtaking intention.\n | 9. 
An automatic driving vehicle, comprising the overtaking control system for an automatic driving vehicle on icy and snowy roads.", "output": "The self-driving car overtaking control method involves comparing the speed of the overtaking vehicle with that of the vehicle in front, overtaking if it is greater, and detecting whether the surrounding vehicles have overtaking intentions; if they do, the overtaking vehicle decelerates or keeps its original driving state. The method then detects whether the front and rear vehicles on the borrowed lane are non-autonomous or autonomous vehicles. A first signal is sent to alert the non-autonomous vehicles of the overtaking intention if the vehicles at the front and rear positions of the borrowed lane are non-autonomous. A second signal is preferred to prompt the autonomous vehicle to change speed if a non-autonomous vehicle and an autonomous vehicle are respectively located at the front and rear positions of the borrowed lane. INDEPENDENT CLAIMS are included for: (1) an overtaking control system for an automatic driving vehicle on icy and snowy roads; (2) an automatic driving vehicle. Self-driving car overtaking control method on icy and snowy roads. The method reduces the uncertainty of non-automatic driving vehicles during overtaking through the cooperative control of the automatic driving vehicle and the surrounding automatic driving vehicles, and improves the safety of overtaking on icy and snowy roads. The drawing shows a flow chart of a self-driving car overtaking control method on icy and snowy roads. 
(Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "An unmanned bus drive-by-wire chassis and automatic driving system thereof. The invention claims an unmanned bus drive-by-wire (wire control) chassis and an automatic driving system thereof, the chassis comprising a chassis main body. In this application, images and 3D laser point clouds are fused at the front end for pedestrian and lane-line detection, ensuring that the automatic driving vehicle correctly understands its surroundings and makes the corresponding decisions. Traffic lights are identified by V2X intelligent connected-vehicle technology based on 5G communication: a signal transmitter installed on the traffic light continuously broadcasts its state, and the automatic driving vehicle judges the state of the traffic light by receiving the transmitted signal. MPC trajectory tracking makes the vehicle travel along the predetermined track; the algorithm has excellent performance, exhibits stable and accurate tracking capability, and is sufficiently real-time.|1. 
An unmanned bus drive-by-wire chassis, comprising a chassis main body, wherein a 32-line laser radar providing a horizontal 360-degree and vertical 40-degree field of view is mounted on top of the front side of the chassis main body; 16-line laser radars are fixedly mounted at the four corners of the chassis main body to make up for the blind area caused by the mounting height of the 32-line laser radar and complete 360-degree monitoring coverage; two groups of blind-area auxiliary millimeter-wave radars are fixedly installed on the back side of the chassis main body, outside two groups of the 16-line laser radars; a front millimeter-wave radar is fixedly installed on the front side of the chassis main body; 12 groups of ultrasonic radars are distributed over three adjacent outer walls of the chassis main body, the ultrasonic radars on the same surface being equally spaced; a first industrial camera and a second industrial camera, used for identifying lane lines and traffic signs, are fixedly mounted on the front side of the chassis main body; an automatic driving computing platform (ADU), comprising an automatic driving computing platform controller, is fixedly mounted on the bottom of the chassis main body; a 5G host and 5G antennas are fixedly installed on the bottom of the chassis main body, and four groups of 5G cameras are fixedly installed on the four sides of the chassis main body; a combined navigation host fixedly connected with the chassis main body is provided outside one group of two groups of the 5G antennas; the 16-line laser radars, the 32-line laser radar, the front millimeter-wave radar, the blind-area auxiliary millimeter-wave radars and the ultrasonic radars can provide millions of data points per second so as to create a three-dimensional map of the surrounding objects and environment, and, with the assistance of the combined navigation host and the industrial cameras, construct the high-precision map needed for bus operation.\n | 2. An unmanned bus automatic driving system using the drive-by-wire chassis according to claim 1, comprising module software interfaces and hardware interfaces, and further comprising: a sensing algorithm module, consisting of five sub-modules: laser radar detection, camera detection, laser radar tracking, camera tracking and prediction; the camera detection mainly uses the laser radar to obtain high-quality obstacle information and detect the final result; the tracking results of each sensor are fused by a filtering algorithm, and the prediction module takes the fused tracking results as input and outputs the future trajectory of each type of road participant; a locating module, whose locating algorithm is mainly composed of three sub-modules: a laser radar odometer, combined navigation calculation and fusion; after the relative positioning information is output at a fixed frequency, the fusion module combines the positioning information of different frequencies by a filtering algorithm and finally outputs the global position at a fixed frequency, providing absolute positioning capability; a global path planning module, which responds to external routing requests and gives an optimal route from the current position to the requested end point; a planning control module, which receives external information from the sensors, obtains the states of the ego vehicle and the surrounding vehicles through the locating and sensing algorithm modules, and is responsible for autonomous vehicle motion planning and trajectory tracking control; a decision planning module, which receives real-time location and prediction information, plans according to the global path, combines obstacle avoidance and other factors, plans in real time a collision-free trajectory of the vehicle for a future period of time, and sends it to the trajectory tracking controller to be executed by the vehicle; an action prediction module, which receives the input of the sensing and positioning modules and is responsible for giving the specific motion trajectories of the other surrounding participants for the next 5s-7s, for use by the decision planning module; and a trajectory tracking control module, which, after the decision planning module gives a safe collision-free trajectory, is responsible for calculating the proper control commands according to the current vehicle state and the planned trajectory, so that the vehicle can move along the planned trajectory.\n | 3. The unmanned bus automatic driving system according to claim 2, wherein the output frequency of the positioning module is fixed and the processing is performed in the automatic driving computing platform ADU.\n | 4. The unmanned bus automatic driving system according to claim 2, wherein the hardware interfaces comprise: a communication data interface, for exchanging scheduling commands, vehicle positioning and posture; a sensor data interface, between the combined inertial navigation system IMU and the automatic driving computing platform ADU, using the USART interface of the IMU to transmit data; a multi-line laser radar interface, transmitting millions of point cloud data per second using the UDP protocol; the ultrasonic radar performs near-distance obstacle detection, its output is the obstacle distance, and the data is read through the CAN interface on the ultrasonic radar control box; and a control data interface, between the automatic driving computing platform ADU and the vehicle control chassis, transmitting via CAN.\n | 5. The unmanned bus automatic driving system according to claim 2, wherein the module software interfaces comprise: sensor abstraction layer service interfaces, providing two types of service interface, one being the information service interface of intelligent sensors and the other being the interface of other vehicle sensors.\n | 6. 
The unmanned bus automatic driving system according to claim 2, wherein the laser radar odometer in the positioning module uses GNSS data to finish initialization, the point cloud data generated by the laser radar is matched with the high-precision map, and absolute positioning information is output at a fixed frequency; the combined navigation calculation module combines GNSS data and IMU data and then outputs relative positioning information at a fixed frequency.\n | 7. The unmanned bus automatic driving system according to claim 2, wherein the radar detection and camera detection in the sensing algorithm module can be decoupled and used for tracking.", "output": "The chassis has a chassis main body whose front side is fixedly mounted with first and second industrial cameras. A bottom of the chassis main body is fixedly mounted with an Automatic Drive Unit (ADU). The bottom of the chassis main body is also fixedly installed with a fifth-generation (5G) host and 5G antennas. Four sides of the chassis main body are fixedly installed with four groups of 5G cameras. A combined navigation host fixedly connected with the chassis main body is provided outside one group of the 5G antennas. An INDEPENDENT CLAIM is included for an automatic driving system. Unmanned bus drive-by-wire chassis for an automatic driving system (claimed). The chassis enables model predictive control (MPC) trajectory tracking to make the vehicle travel along the predetermined track, thus obtaining excellent performance and stable, accurate tracking capability. The drawing shows a schematic view of an unmanned bus drive-by-wire chassis. 
(Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Lane recognition system, method and automatic driving automobile. The embodiments of the invention claim a lane recognition system, a method and an automatic driving vehicle. The lane recognition system comprises at least one camera, at least one radar, a data processing unit and a lane recognition unit; the lane recognition unit is connected with the data processing unit and determines the weight ratio between the road edge features obtained by processing the image information and those obtained by processing the radar information, and identifies the lane position according to the weight ratio, the road edge features obtained by processing the image information, the road edge features obtained by processing the radar information, and the lane line features. The technical solution of the invention fuses camera and radar data for lane recognition, so as to better deal with complex road conditions and environmental interference, avoid dangerous accidents and improve driving safety.|1. A lane recognition system, comprising: at least one camera for obtaining image information of the lane area along the vehicle's driving path; at least one radar for obtaining radar information of the area in which the vehicle is traveling; and a processing unit for: processing the image information to obtain lane line features and road edge features; processing the radar information to obtain road edge features; determining the weight ratio between the road edge features obtained by processing the image information and those obtained by processing the radar information; and identifying the lane position according to the weight ratio, the road edge features obtained by processing the image information, the road edge features obtained by processing the radar information, and the lane line features.\n | 2. 
The system according to claim 1, further comprising: at least one illumination device for obtaining illumination intensity information of the area in which the vehicle is traveling, wherein the processing unit is used for determining the weight ratio according to the illumination intensity information.\n | 3. The system according to claim 2, further comprising a vehicle-to-vehicle communication device for obtaining road traffic information and auxiliary lane information of the area in which the vehicle is traveling, wherein the processing unit is used for: determining, according to the road traffic information and the illumination information, the weight ratio of the road edge features obtained by processing the image information, the weight ratio of the road edge features obtained by processing the radar information, and the weight ratio of the auxiliary lane information; and identifying the lane position according to the weight ratios, the road edge features obtained by processing the image information, the road edge features obtained by processing the radar information, the lane line features and the auxiliary lane information.\n | 4. A lane recognition method, comprising: obtaining, according to image information, the lane line features of the lane along the vehicle's driving path and the road edge features on both sides of the lane, wherein the image information is collected by a camera installed on the vehicle; determining the weight ratio between the road edge features on both sides of the lane obtained respectively from the image information and from radar information, wherein the radar information is collected by a radar mounted on the vehicle; and identifying the lane position according to the lane line features, the weight ratio, the road edge features obtained from the image information and the road edge features obtained from the radar information.\n | 5. 
The method according to claim 4, further comprising: obtaining illumination intensity information of the area in which the vehicle is traveling, and determining, according to the illumination intensity information, the weight ratio between the road edge features obtained by processing the image information and those obtained by processing the radar information.\n | 6. The method according to claim 5, further comprising: obtaining road traffic information and auxiliary lane information of the area in which the vehicle is traveling; determining, according to the road traffic information and the illumination information, the weight ratio of the road edge features obtained by processing the image information, the weight ratio of the road edge features obtained by processing the radar information, and the weight ratio of the auxiliary lane information; and identifying the lane position according to the weight ratios, the road edge features obtained by processing the image information, the road edge features obtained by processing the radar information, the lane line features and the auxiliary lane information.\n | 7. The method according to claim 4, wherein obtaining, according to the image information, the lane line features of the lane along the vehicle's driving path and the road edge features on both sides of the lane is implemented as: performing white-balance enhancement on the image information; dividing the image information into areas according to the RGB values of the pixels; performing gray-scale processing on the image information of the divided areas; extracting the road features; and inputting the road features into a pre-trained deep learning model, which outputs the lane line features and road edge features.\n | 8. The method according to claim 4, wherein obtaining the road edge features on both sides of the lane from the radar information is implemented as: filtering the radar information and extracting the road edge features.\n | 9. 
The method according to claim 4, wherein identifying the lane position according to the lane line features, the weight ratio, the road edge features obtained from the image information and the road edge features obtained from the radar information comprises: calculating the lane width according to the lane line features; calculating the road width according to the weight ratio, the road edge features obtained from the image information and the road edge features obtained from the radar information; calculating the number of lanes according to the lane width and the road width; and identifying the lane position based on the lane width and the number of lanes.\n | 10. An automatic driving automobile, comprising a lane recognition system according to any one of claims 1~3.", "output": "The system has a camera for obtaining vehicle driving image information of the lane in a path. A radar obtains radar information of an area in which a vehicle is traveling. A processing unit processes the image information to obtain lane line features and road edge features. The processing unit processes the radar information to obtain road edge features. The processing unit determines the weight ratio between the road edge features obtained by processing the image information and those obtained by processing the radar information. The processing unit identifies the lane position by processing the road edge features obtained from the radar information and the lane line features. An INDEPENDENT CLAIM is also included for a lane recognition method. Lane recognition system. The system realizes data fusion of the camera and the radar device in lane recognition, so as to better deal with complex road conditions and environmental disturbances, avoids the occurrence of dangerous accidents, and improves the safety of driving. The drawing shows a block diagram of a lane recognition system. 
(Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "A formation control system for automatic driving vehicles. The application claims a formation control system for automatic driving vehicles, comprising a vehicle end and a road end; the vehicle end comprises an on-board unit and a domain controller, and the road end comprises a road side unit. The on-board unit obtains the sensor data of the vehicle end and the vehicle-road cooperative data of the road end via V2X communication, and sends both to the domain controller. The domain controller performs fusion processing on the received data and performs formation control of the automatic driving vehicle according to the fusion result. The road side unit obtains the vehicle-road cooperative data of the road end and sends it to the vehicle end via V2X communication. The application realizes a full-link design of automatic driving vehicle formation control based on V2X communication; by combining the sensor data of the vehicle end with the vehicle-road cooperative data of the road end, it provides a richer and more reliable basis for formation decision planning and improves formation control precision.|1. 
A formation control system for automatic driving vehicles, wherein the formation control system comprises a vehicle end and a road end, the vehicle end comprises an on-board unit and a domain controller, and the road end comprises a road side unit; the on-board unit is used for obtaining the sensor data of the vehicle end and the vehicle-road cooperative data of the road end via V2X communication, and sending the sensor data of the vehicle end and the vehicle-road cooperative data of the road end to the domain controller; the domain controller is used for performing fusion processing on the sensor data of the vehicle end and the vehicle-road cooperative data of the road end, and performing formation control of the automatic driving vehicle according to the fusion result; the road side unit is used for obtaining the vehicle-road cooperative data of the road end and sending it to the vehicle end via V2X communication.\n | 2. The formation control system for automatic driving vehicles according to claim 1, wherein the domain controller is further used for: performing target prediction according to the fusion result; and performing formation control of the automatic driving vehicle according to the target prediction result.\n | 3. The formation control system for automatic driving vehicles according to claim 1, wherein the domain controller is further used for: fusing the data of each sensor of the vehicle end to obtain fused sensor data; and integrating the fused sensor data with the vehicle-road cooperative data of the road end to obtain the final fusion result.\n | 4. 
The formation control system of automatic driving vehicle according to claim 1, wherein the domain controller is further used for: determining whether the ego vehicle can serve as a pilot vehicle according to the fusion processing result; and, when the ego vehicle can serve as a pilot vehicle, performing formation decision planning based on the ego vehicle and generating a formation decision planning task.\n | 5. The formation control system of automatic driving vehicle according to claim 4, wherein the domain controller is further used for: executing the formation decision planning task and obtaining the ego-vehicle fleet according to the execution result of the formation decision planning task; and controlling the ego-vehicle fleet to drive according to a preset fleet driving strategy.\n | 6. The formation control system of automatic driving vehicle according to claim 4, wherein the vehicle-mounted unit is further used for: sending a formation request to the surrounding vehicles of the ego vehicle through the V2X communication mode, so that a surrounding vehicle applies to join the ego-vehicle fleet according to the formation request; and updating the to-be-processed state list of the vehicle end according to the responses of the surrounding vehicles to the formation request, wherein the to-be-processed state list comprises a fleet-joining list, a member list and a fleet-leaving list.\n | 7. The formation control system of automatic driving vehicle according to claim 6, wherein the vehicle-mounted unit is further used for: determining candidate surrounding vehicles according to the responses of the surrounding vehicles to the formation request; and obtaining the vehicle information of the candidate surrounding vehicles by the V2X communication mode and sending it to the domain controller.\n | 8. 
The formation control system of automatic driving vehicle according to claim 7, wherein the domain controller is further used for: determining, according to the vehicle information of the candidate surrounding vehicles, whether a candidate surrounding vehicle satisfies the requirements for joining the fleet; and, when a candidate surrounding vehicle satisfies the requirements for joining the fleet, adding the candidate surrounding vehicle to the ego-vehicle fleet as a following vehicle.\n | 9. The formation control system of automatic driving vehicle according to claim 4, wherein the formation decision planning task comprises a fleet driving track, and the domain controller is further used for: determining the current lane where the ego vehicle is located; determining, according to the current lane of the ego vehicle and the fleet driving track, whether the ego vehicle needs to change lanes; and, when the ego vehicle needs to change lanes, generating and executing a lane-change track planning task, so that the ego vehicle changes from the current lane to the target lane.\n | 10. The formation control system of the automatic driving vehicle according to any one of claims 1 to 9, wherein the vehicle-road cooperative data is data obtained by sensing the surrounding environment within a preset range of the road side device, and the vehicle-road cooperative data comprises one or more of other traffic participant data, traffic light data and road event data.", "output": "The system has a vehicle end provided with a vehicle-mounted unit and a domain controller. A road end is provided with a road side unit. The vehicle-mounted unit is used for obtaining the sensor data of the vehicle end and the vehicle-road cooperative data of the road end by the V2X communication mode and sending the sensor data of the vehicle end and the vehicle-road cooperative data of the road end to the domain controller. 
The domain controller is used for performing fusion processing on the sensor data of the vehicle end and the data of the road end. The road side unit is used for obtaining the vehicle-road cooperative data of the road end and sending it to the vehicle end through the V2X communication mode. The domain controller is used for performing target prediction according to the fusion processing result and performing formation control of the automatic driving vehicle according to the target prediction result. Formation control system for an automatic driving vehicle e.g. automatic driving bus and automatic driving lorry. The application realizes a full-link design for automatic driving vehicle formation control based on V2X communication: by combining the vehicle-end sensor data with the road-end data, it provides a richer and more reliable basis for formation decision planning and improves formation control precision. The drawing shows a structure schematic diagram of a formation control system for an automatic driving vehicle e.g. automatic driving bus and automatic driving lorry. 
(Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "SYSTEM AND METHOD FOR LONGITUDINAL REACTION-CONTROL OF ELECTRIC AUTONOMOUS VEHICLEA PID automatic control system and method for speed control of an electric vehicle, according to an embodiment, design a commercial vehicle-based autonomous driving system and a controller for development of a semi-autonomous driving acceleration/deceleration controller, develop a driving priority determination algorithm on the basis of V2X communication, develop technologies for correcting autonomous navigation and improving position precision, develop a technology for recognizing autonomous driving road environments by using a cognitive sensor, conceive an application method of a driving habits improvement algorithm by using learning, develop a driving habits improvement algorithm by using learning, and develop an AEB function for a commercial vehicle to which a semi-autonomous driving technology is applied.|1. 
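The abstract above describes a PID speed controller with proportional (Kp), integral (Ki) and derivative (Kd) gains and an error fed back to motor control. As a minimal sketch only — the class name, gain values and time step below are invented for illustration, not taken from the patent — such a loop might look like:

```python
# Minimal PID speed-control loop sketch. Gains and time step are
# illustrative placeholders, not values from the patent.
class PIDSpeedController:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_speed, current_speed):
        error = target_speed - current_speed           # difference between target and running speed
        self.integral += error * self.dt               # accumulated error for the integral term
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Motor command: sum of proportional, integral and derivative terms.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PIDSpeedController(kp=0.8, ki=0.1, kd=0.05, dt=0.1)
command = pid.update(target_speed=60.0, current_speed=55.0)
```

The per-vehicle learning of response characteristics described in the claims would, under this sketch, amount to adapting `kp`, `ki`, `kd` online rather than keeping them fixed.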
A PID automatic control system for semi-autonomous driving acceleration/deceleration control, comprising: a communication module for communicating with nearby vehicles and a leading vehicle when a platooning group is formed around the vehicle;\na detection module for detecting obstacles on the front and rear sides of the vehicle, detecting surrounding vehicle information including the vehicle speed and driving path of the front vehicle, and detecting road information including stop lines, traffic lights, signs, and road curbs;\na semi-autonomous driving acceleration/deceleration control module for controlling the vehicle speed according to the result of V2X communication with communication objects around the vehicle together with the surrounding vehicle information and road information, calculating the amount of change in vehicle speed due to the change in pedal inclination and reflecting it in the proportional gain value and error calculation, learning acceleration control response characteristics and deceleration control response characteristics, and applying the per-vehicle response characteristic learning result to the gain value calculation;\na deceleration/acceleration sensor that detects the inclination of the pedal and the amount of change in the inclination of the vehicle brake and accelerator; and a driving habit improvement learning module that collects driver-specific driving data and vehicle control data, applies driver-specific learning data to the vehicle, identifies driving habits based on the learning data stored for each driver, and improves driving habits through semi-autonomous driving; wherein the semi-autonomous driving acceleration/deceleration control module comprises: a gain calculator for calculating the difference between the target speed and the running speed, the amount of change in the running speed, and the proportional gain (Kp), integral gain (Ki), and differential gain (Kd) for the PID calculation;\nan error amount calculator for calculating the error with respect to the target speed after controlling the motor according to the calculated gain values;\na feedback unit for feeding back the motor control by applying the calculated error to each gain value; wherein the detection module implements an autonomous driving navigation position correction algorithm that corrects the current position of the vehicle using information from sensors including a LiDAR and a camera by comparing it with global coordinates, and matches the pixel coordinates of the external parameters and internal parameters 1:1 through camera coordinate system calibration; and wherein, when a small amount of deceleration in curves or a habit of rapid acceleration when waiting at a signal is monitored, the driving habit improvement learning module feeds back to the semi-autonomous driving acceleration/deceleration control module so that the vehicle decelerates further than the amount of deceleration applied by the driver's brake. \n | 2. 
A PID automatic control method for semi-autonomous driving acceleration/deceleration control, comprising the steps of: (A) an autonomous driving vehicle communicating with a nearby vehicle and a leading vehicle when a platooning group is formed around the vehicle;\n(B) detecting obstacles on the front and rear sides of the vehicle in the autonomous vehicle and detecting surrounding vehicle information including the vehicle speed and driving path of the vehicle in front, and road information including stop lines, traffic lights, signs, and curbs; and (C) controlling the vehicle speed according to the result of V2X communication with the communication object around the vehicle in the autonomous vehicle, and information about the surrounding vehicle and the road; and (D) the driver's driving data and vehicle control data in the autonomous vehicle. It collects and applies learning data for each driver to the vehicle, identifies driving habits based on the learning data stored for each driver, and implements autonomous driving to improve driving habits through semi-autonomous driving when high-risk driving habits are identified. 
wherein the step (B) further comprises: detecting, by the deceleration/acceleration sensor, the inclination of the pedal and the amount of change in the inclination of the vehicle brake and accelerator; and calculating, in the autonomous vehicle, the change in vehicle speed due to the change in pedal inclination and reflecting it in the proportional gain value and error calculation; wherein the step (B) implements an autonomous driving navigation position correction algorithm that compares and corrects the current position of the vehicle from sensors including LiDAR and camera against global coordinates, and matches the pixel coordinates of the external parameters and internal parameters 1:1 through camera coordinate system calibration; wherein the step (C) comprises: calculating the difference between the target speed and the running speed, the amount of change in the running speed, and the proportional gain (Kp), integral gain (Ki), and differential gain (Kd) for the PID calculation;\ncalculating the error with respect to the target speed after controlling the motor according to the calculated gain values;\nfeeding back the motor control by applying the calculated error to each gain value;\nlearning acceleration control response characteristics and deceleration control response characteristics; and applying the per-vehicle response characteristic learning result to the gain value calculation; and wherein, in the step (D), when a small amount of deceleration in curves or a habit of sudden acceleration when waiting at a signal is monitored, feedback is given so that the vehicle decelerates further than the amount of deceleration applied by the driver's brake.", "output": "The system has a communication module (110) that communicates with a nearby vehicle and a leading vehicle when a platooning group is formed around the 
vehicle. A detection module (130) detects obstacles on the front and rear sides of the vehicle. The vehicle speed is controlled according to the V2X communication result with the communication object around the vehicle. A semi-autonomous driving acceleration and deceleration control module (150) learns acceleration control response characteristics and deceleration control response characteristics. A driving habit improvement learning module (170) improves driving habits through semi-autonomous driving. A feedback unit feeds back motor control by applying the calculated error to each gain value. The detection module implements autonomous navigation position correction algorithm using sensor information. An INDEPENDENT CLAIM is included for a proportional integral derivative (PID) automatic control method for semi-autonomous driving acceleration and deceleration control. PID automatic control system for semi-autonomous driving acceleration and deceleration control of 1-ton electric commercial vehicle. The fuel consumption caused by air resistance is reduced, thus improving fuel economy. The semi-autonomous speed control is more accurately performed according to the individual driving characteristics of the vehicle. The driver habit improvement module predicts collision by recognizing obstacle in front of the vehicle being driven by interlocking with the automatic emergency braking system and automatically applies the brake when the driver does not intervene to prevent collision. The drawing shows a block diagram of PID automatic control system for semi-autonomous driving acceleration and deceleration control. 
(Drawing includes non-English language text) 110 Communication module, 130 Detection module, 150 Semi-autonomous driving acceleration and deceleration control module, 170 Driving habit improvement learning module", "instruction": "Please summarize the input" }, { "input": "Interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusionThe invention claims an interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion, which mainly solves the problem that existing automatic driving decision-making has low applicability under complex road structures and traffic light conditions. The method considers a multi-lane traffic environment in the world coordinate system, containing a mixed traffic flow composed of interconnected automatic driving vehicles (CAVs) and human-driven vehicles (HDVs). Each CAV can obtain surrounding multimodal environmental features (such as lane information, HDV vehicle information, and traffic light information) through vehicle-mounted sensors and an offline high-precision map. With the help of vehicle-to-vehicle communication, each CAV can share its information and make a decision within a specified time step t. The aim of the method is to generate speed and steering angle decisions for the CAVs. With such action decisions, the automatic driving vehicle can drive safely and effectively along its specified route while greatly improving passenger comfort and reducing the influence on surrounding HDVs.|1. 
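The claims that follow encode each vehicle's dynamics into a code h_i and aggregate the codes with a graph attention layer masked by the communication link matrix M. A rough NumPy sketch of that masked-attention aggregation (function names, shapes and the LeakyReLU slope are assumptions; the patent's exact MLP layers and score formula are not reproduced):

```python
import numpy as np

def leaky_relu(x, slope=0.01):
    return np.where(x > 0, x, slope * x)

def aggregate_vehicle_codes(h, M, Wq, Wk, Wv):
    """Masked attention pooling over per-vehicle codes.

    h:  (n, d) vehicle codes h_i, one row per CAV
    M:  (n, n) 0/1 communication link matrix (assumed to include
        self-links on the diagonal, so every row has a neighbour)
    Wq, Wk, Wv: query / key / value projection matrices
    """
    q, k, v = h @ Wq.T, h @ Wk.T, h @ Wv.T
    scores = leaky_relu(q @ k.T)                 # raw pairwise attention scores
    scores = np.where(M > 0, scores, -np.inf)    # drop agents with no link this step
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)         # softmax over linked neighbours only
    return w @ v                                 # per-agent traffic-flow code E_t
```

Masking with `-np.inf` before the softmax implements the claim's filtering of agents that lost their communication link in the current time step.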
An interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion, wherein it comprises the following steps: S1, collecting the vehicle dynamics information fA of the CAV through the vehicle GNSS and IMU sensors, and detecting the vehicle dynamics information fH of the HDV through the vehicle radar of the CAV; wherein CAV denotes an interconnected automatic driving vehicle and HDV denotes a human-driven vehicle; S2, accurately locating the position and heading of the CAV, and identifying the roads and traffic lights near the CAV, so as to obtain real-time road information along the CAV's preset driving route; S3, each CAV respectively feeding its own vehicle dynamics information fA and the sensed vehicle dynamics information fH of the HDVs into an MLP, and splicing the resulting codes to form a vehicle code hi; and aggregating the CAV vehicle codes based on the CAV communication link matrix M using a graph attention layer to obtain the traffic flow information of the CAV's preset driving route; wherein MLP denotes a multilayer perceptron; S4, training a parameterized CAV action structure with the multi-agent reinforcement learning algorithm MAPAC to obtain the optimal action strategy, and adopting a stochastic Gaussian strategy to improve the search ability of the algorithm, so as to realize the action decision of the CAV; S5, setting an egocentric reward function for improving the safety, efficiency and comfort of the CAV and a social influence reward function for reducing the negative influence on surrounding HDVs, to optimize the action decision of the CAV.\n | 2. 
The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 1, wherein in the step S1, the vehicle dynamics information fA of CAV comprises vehicle speed, direction, length, width, a lane ID and an offset from the lane centre line; The vehicle dynamics information fH of the HDV includes the relative distance, speed, direction of the HDV relative to the CAV and the lane mark and lane centre line offset of the HDV.\n | 3. The interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion according to claim 2, wherein in the step S2, the CAV preset driving route is composed of multiple lanes of roads. the characteristic fL of the road is represented by the lane track point, wherein each track point comprises the lane horizontal height, namely the lane gradient change, the direction and the lane ID; the traffic light information uses the detection technology based on the camera to detect the real-time state and the distance from the vehicle; wherein red is [1, 0, 0], green is [0, 1, 0], yellow is [0, 0, 1]; if the characteristic fL of the road and the traffic light state fTL are coded by a road encoder, Er, i= sigma (phi (fL, L, fTL, L) | phi (fL, C, fTL, C) | phi (fL, R, fTL, R)), wherein Er, i represents road coding, phi represents MLP layer, sigma is ReLU activation function, fL, L, fTL, L represents the left side lane code and traffic light code of the lane where the vehicle is located, fL, C, fTL, C represents the lane code and traffic light code of the lane where the vehicle is located, fL, R, fTL, R represents the right lane code and the traffic light code of the lane where the vehicle is located, and the absolute value is the connection operation; for each intelligent agent, the attention point is only limited to the lane characteristic of the current lane, the red-green lamp and two adjacent lanes; finally connecting the traffic 
flow code Et, i and the road code Er, i to obtain the state code Es, i for the final road information code operated by the subsequent module.\n | 4. The interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion according to claim 3, wherein in the step S3, in the CAV vehicle coding based on CAV communication link matrix M, according to the attention mechanism, Each intelligent agent i in the vehicle communication network calculates the query vector qi, the key vector ki and the value vector vi are listed as follows: qi = Wqhi, ki = Wkhi, vi=Wvhi, in the formula, Wq represents query matrix, Wv represents value matrix, Wk represents key matrix, hi is vehicle code; Assuming that the intelligent agent i has Ni adjacent intelligent agents, the attention score a ij of the intelligent agent to the adjacent intelligent agent j can be calculated as: wherein sigma is the activation function ReLU; LeakyReLU represents a LeakyReLU activation function, exp represents an exponential operation operation, and l represents one of Ni adjacent intelligent bodies; Due to the change of the traffic environment, the intelligent body which lost the communication connection with it in the current time step is filtered, and the final traffic flow code Et is calculated in combination with the CAV link matrix, and i is listed as follows: wherein phi represents the MLP layer, Mi, j are the values of the link matrix, Mi, j = 0 represents that there is no connection between the agent i and the agent j in the current time step, and vice versa; Wherein, the intelligent agent is CAV.\n | 5. 
The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 4, wherein in the step S4, the multi-agent reinforcement learning algorithm MAPAC uses the actor-critic structure of reinforcement learning, wherein the actor network is used for computing the action and the critic network is used for evaluating the action through an estimated value function; a stochastic Gaussian network replaces the original deep Q network: the stochastic Gaussian network outputs a Gaussian distribution from which the agent samples to form parameterized continuous actions; wherein, in the model training process using the multi-agent reinforcement learning algorithm MAPAC, the actor network pi i of the agent i updates the network by minimizing the following objective: wherein the experience buffer D is used for storing the states and actions of all interconnected agents, and Q (beta) represents the network parameters of the critic; lambda is the regularization coefficient controlling the search performance of the algorithm, and respectively representing the state information and action information of the agent i at the time step t, and an actor network representing the parameters of the agent i; the combined set of states and actions of the interconnected agents is used as the input of the critic network, and the network then outputs the Q value of the action taken by the agent i at the time step t; the critic network is updated by minimizing the following Bellman error JQ: wherein gamma is the reward discount factor, and ri is the instant reward at time step t; two target critic networks stabilize the training process; at execution time, each agent runs its own copy of the actor network and critic network, i.e. distributed execution; the agent i only needs to obtain the observed traffic environment 
information and performs information enhancement through the shared information from the interconnected intelligent agent, and then calculates the final parameterized action based on the fused information; finally, selecting the action with the maximum Q value as the actually executed action; All CAVs in the Internet follow the process described above to generate their respective action decisions.\n | 6. The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 5, wherein in the step S5, the self-central reward function and the social influence reward function form a mixed reward function. The expression is as follows: In the formula, rego represents a self-centered reward, and rsoc represents a social impact reward; it is used for quantifying the cooperation degree between the automatic driving vehicle union and the human driving vehicle.\n | 7. The interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion according to claim 6, wherein the expression of the social influence reward function is as follows: FORMULA. 
In the formula, rsoc = rsoc,1 + rsoc,2, wherein rsoc,1 is used to quantify the incidence of a CAV braking suddenly or cutting into the corresponding lane, and its expression is as follows: In the formula, represents the speed of the HDV at the time step t, thrvel is a threshold value of the speed change for determining whether the CAV causes a hard braking action of the HDV, and thracc * delta t is the speed change threshold between two consecutive time steps; the reward takes effect only when the HDV deceleration is greater than thrvel; rsoc,2 is used to quantify the incidence of the CAV adjusting its speed or position so as to reserve lane-change space for the HDV, and its expression is as follows: wherein is the adjacent HDV of the agent i at the time step t; when the vehicle in the adjacent lane is ahead of the CAV, or behind the CAV at a safe lane-change distance, is set to 1; in other cases, is set to 0.\n | 8. The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 7, wherein the expression of the egocentric reward function is as follows: FORMULA. rego = rsaf + reff + rcom, in the formula, rsaf is the safety reward, reff is the efficiency reward, and rcom is the passenger comfort reward; wherein the safety reward is the sum of rewards for CAV unsafe behaviour and traffic rule compliance; the car-following safety uses the predicted time-to-collision TTC to ensure a safe following distance for the CAV; the TTC calculation formula is as follows: In the formula, fA.vel and fH.vel represent the speeds of the CAV and HDV, respectively, and dis (A, H) represents the Euclidean distance between A and H; the car-following safety reward rsaf,1 is calculated as follows: wherein is a threshold value of TTC; secondly, the lane-keeping safety reward keeps the CAV travelling in the centre of the lane, and is calculated as follows: wherein dis (wp, A) measures the distance between the CAV and the centre point of the lane, and d is half of the lane width; the emergency safety term penalizes CAV collisions, road-departure actions or traffic light violations, and is 0 otherwise; efficiency reward: the efficiency of the multi-lane task is the sum of the speed control efficiency and the lane-change efficiency; the speed control efficiency reff,1 encourages the automatic driving vehicle to keep a safe driving speed, and is calculated as follows: wherein fA.vel represents the speed of the automatic driving vehicle and velmax is the maximum driving speed set for the vehicle; the lane-change reward reff,2 encourages overtaking and obstacle avoidance; after a lane change is completed it is calculated as follows: wherein dis (Htar, A) and dis (Hprev, A) represent the distance of the vehicle from an obstacle or a preceding vehicle on the target lane and the previous lane, respectively; passenger comfort reward: measured using the vehicle jerk (the rate of change of acceleration); Jerk is calculated as follows: wherein acct is the acceleration of the vehicle at the time step t and delta t is the length of each time step; rcom is calculated from Jerk: wherein thracc is the maximum allowed Jerk value.", "output": "The method involves collecting vehicle dynamics information of a vehicle through a vehicle global navigation satellite system (GNSS) and an inertial measurement unit (IMU) sensor. 
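The car-following safety term in the claims above divides the CAV-HDV gap by their speed difference to get a predicted time-to-collision (TTC) and penalizes values under a threshold. A small sketch under those definitions — the threshold value and the exact penalty shape are illustrative assumptions, not the patent's formula:

```python
import math

def time_to_collision(pos_cav, pos_hdv, v_cav, v_hdv):
    """Predicted time-to-collision between a following CAV and the HDV
    ahead of it: Euclidean gap divided by closing speed. Returns
    math.inf when the CAV is not closing on the HDV."""
    gap = math.dist(pos_cav, pos_hdv)
    closing = v_cav - v_hdv
    return gap / closing if closing > 0 else math.inf

def following_safety_reward(ttc, ttc_threshold=4.0):
    # Illustrative shape only: zero when TTC is above the threshold,
    # an increasing penalty as TTC shrinks below it.
    return 0.0 if ttc >= ttc_threshold else -(ttc_threshold - ttc) / ttc_threshold
```

For example, a CAV 20 m behind an HDV and closing at 5 m/s gets a TTC of 4 s, right at the assumed threshold, so no penalty applies yet.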
A self-central reward function is set for improving safety, efficiency and comfort of a CAV and a social influence reward function for reducing negative influence to a surrounding HDV to optimize an action decision of the CAV. Cooperative sensing and self-adaptive information fusion based interconnected automatic driving decision method for automatic driving vehicles e.g. human driving vehicle (HDV) and inter-connected auto-driving vehicle (CAV). The automatic driving vehicle can safely and effectively drive according to the special route, at the same time, the comfort level of the passenger is greatly improved and the influence to the surrounding HDV is reduced. The drawing shows an overall frame diagram of a cooperative sensing and self-adaptive information fusion based interconnected automatic driving decision method. (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Intelligent cruise robot reverse searching system and method based on cloud wireless transmissionThe invention relates to the technology field of reverse searching in parking lot, specifically to an intelligent cruise robot reverse searching system and method based on cloud wireless transmission. The system comprises four parts of an intelligent cruise robot terminal, a parking space identification terminal, a cloud terminal and a reverse searching inquiry terminal. The system is a full-automatic intelligent robot device with vehicle position identification, license plate automatic identification and mobile walking functions. The searching robot device integrates automatic control, video recognition analysis, autonomous driving, wireless communication and algorithm based on artificial intelligence, realizing fast and accurate recognition of license plate and parking space and issuing the information for searching vehicle position.|1. 
An intelligent cruise robot reverse searching system based on cloud wireless transmission, wherein the system comprises the following four parts: an intelligent cruise robot end, which comprises one or more intelligent cruise robots; each intelligent cruise robot comprises a high-definition camera with license plate number recognition software, an automatic cruise control module, a universal high-performance wireless communication transmission module, a server module and a battery drive module; the high-definition camera and license plate number recognition software use a video analysis algorithm and computer vision detection to recognize the license plate number of a vehicle and associate it with its parking space; the automatic cruise control module controls the walking of the intelligent cruise robot and its obstacle recognition and avoidance function; using the two technologies of ultrasonic radar plus video vision detection, the module can detect the size and distance of obstacles with the ultrasonic radar while walking, and after calculation control the robot to perform the corresponding avoidance, while video vision detection of special marks (such as white lines) allows the intelligent cruise robot to walk automatically along the preset route; the wireless communication module adopts 4G Internet-of-Things technology and is responsible for sending the recognized license plate numbers and the associated parking space information to the cloud; the universal high-performance server module provides hardware computing and memory functions for the license plate number recognition software, the automatic cruise control module and the wireless communication transmission module; the battery drive module provides driving power for the intelligent cruise robot and charges the battery when the battery is low; a parking space identification end, which is formed by identification devices that associate the license plate of a parked vehicle with its parking space, wherein the identification devices adopt two technical methods for parking space recognition, and these two technical means also determine the working mode and working state of the intelligent cruise robot: A, smart tag technology, in which an intelligent vehicle tag mounted below the parking space integrates an automatic induction device, a wireless communication transmission device and a parking state display device; the automatic induction device adopts low-power communication technology, and when the intelligent vehicle tag senses a parked vehicle, the parking state display device changes from a green no-vehicle light to a red occupied light and the wireless communication transmission device sends a wireless signal to the intelligent cruise robot; in this case the working mode of the intelligent cruise robot is passive: the robot is parked at an appointed place, and after receiving the wireless signal sent by one or more intelligent vehicle tags, it quickly travels along the white-line route, bypassing areas with no vehicle change, to the parking space before 
collecting the vehicle license plate number, B adopts traditional ground printed number technology using traditional printing mode, printing numbers on each parking space, the intelligent cruise robot working mode is active the cruise. Under this working mode, intelligent cruise robot according to the set time interval, driving the all vehicle in the area which is responsible for identifying one identifying vehicle license plate at the same time, the identification number of the ground printed. the two kinds of number associated to the cloud for issuance, cloud the cloud comprising one or more management servers, one or more indoor LED guide screen and one or more outdoor LED guide screen, the server cluster working mode, can support large data storage operation, providing the original learning data for future data mining and artificial intelligence algorithms, the server providing a plurality of external interfaces, which can remotely control one or more indoor LED guide screen, one or more outdoor LED guide screen. communicated through the local area network or the Internet server and the guide screen for multiple parking the parking information, receiving parking and vehicle information transmitted by multiple intelligent cruise robot, providing for indoor and outdoor LED guide screen to issue to realize centralized management and resource sharing of the vehicle information; reverse the searching inquiry user through reverse searching inquiry APP software or Minicell public number and cloud communication end, can query the vehicle position after inputting the inquiry condition.\n | 2. A searching method based on the reverse searching system according to claim 1, wherein, when the vehicle identification end is a traditional ground printing vehicle number, the method adopts the intelligent cruise robot via the driving mode to the parking space recognition, specifically comprising the following steps: S1. 
the parking spaces of parking lot A and parking lot B are printed with traditional ground parking space numbers, and a walking route for the cruise robot is marked with white lines on the lane; S2. at a preset time interval (for example, every 5 or 10 minutes), the intelligent cruise robot drives and cruises over the predetermined area and route, analyzing license plates and parking space numbers through video analysis and recognition while walking; S3. the recognized license plate and parking space numbers are uploaded to the cloud through the robot's wireless communication module; S4. after statistical analysis, the cloud transmits the parking and vehicle position information over the network to the indoor and outdoor LED guide screens to guide vehicles into position; S5. when an owner needs to find a vehicle, the owner queries the vehicle's position through the reverse-searching APP or WeChat public account inquiry end, so as to be guided to it quickly and conveniently.\n | 3. The searching method based on the reverse searching system according to claim 1, wherein, when the vehicle identification end uses intelligent tags, the intelligent cruise robot recognizes parking spaces in passive cruise mode, the method specifically comprising the following steps: S1. the parking spaces of parking lot A and parking lot B are each equipped with an intelligent vehicle tag integrating an automatic sensing device, a wireless communication transmission device, and a parking state display device, and the tag information is associated with the parking space; S2. when a vehicle parks in a space, the automatic sensing device in the intelligent vehicle tag, using photoelectric or electromagnetic induction technology, senses the parked vehicle, the tag's parking state display light changes from green (vacant) to red (occupied), the one or more intelligent vehicle tags communicate with the respective intelligent cruise robot through their wireless communication transmission devices, and on receiving the information the robot quickly reaches the changed parking area; S3. after statistical analysis, the cloud transmits the parking and vehicle position information over the network to the indoor and outdoor LED guide screens to guide vehicles into position; S4. when an owner needs to find a vehicle, the owner queries the vehicle's position through the reverse-searching APP or WeChat public account inquiry end, so as to be guided to it quickly and conveniently.", "output": "The system has an intelligent cruise robot comprising a high-definition camera, license plate recognition software, a universal high-performance wireless communication transmission module, a server module and a battery drive module. An automatic cruise control module controls the intelligent cruise robot's walking and its obstacle recognition and avoidance function. A server is connected with external interfaces to control an indoor LED guide screen. A cloud communication end determines a vehicle position by using reverse searching inquiry application (APP) software or a public account after an inquiry condition is input. An INDEPENDENT CLAIM is also included for a cloud wireless transmission based intelligent cruise robot reverse searching method. Cloud wireless transmission based intelligent cruise robot reverse searching system.
The system automatically combines video recognition analysis, autonomous driving, wireless communication and artificial-intelligence-based algorithms, and associates the license plate number with a parking space so that the vehicle's position can later be retrieved. The drawing shows a block diagram of a cloud wireless transmission based intelligent cruise robot reverse searching system. '(Drawing includes non-English language text)'", "instruction": "Please summarize the input" }, { "input": "Method for determining reliability of received dataThe invention relates to a computer-implemented method for determining the reliability level of data received by an autonomous vehicle from a target vehicle, the target vehicle being different from the autonomous vehicle. The invention also relates to a corresponding control system and a computer program product.|1. A computer-implemented method for determining the reliability level of data received by an autonomous vehicle from a target vehicle, wherein the target vehicle is different from the autonomous vehicle and is arranged near the autonomous vehicle, the autonomous vehicle comprising a control unit, wherein the method comprises the following steps: receiving, at the control unit and using wireless communication, a first set of operation data associated with the target vehicle, and determining, by the control unit, the reliability level based on the first set of operation data and a predetermined model indicative of an expected behaviour of the first set of operation data.\n | 2. The method according to claim 1, wherein the predetermined model is further dependent on the target vehicle.\n | 3. The method according to any one of claims 1 and 2, wherein the predetermined model further depends on an expected change of the first set of operation data over time.\n | 4.
The method according to claim 1, further comprising the following steps: determining, by the control unit, a second set of operational data associated with the target vehicle using a first sensor included in the autonomous vehicle, and determining, by the control unit, a difference between the first set of operational data and the second set, wherein the determination of the reliability level is further based on the determined difference.\n | 5. The method according to claim 4, wherein the predetermined model further indicates an expected behaviour of the second set of operational data.\n | 6. The method according to claim 4, wherein the predetermined model further indicates an expected behaviour of the difference between the first set of operational data and the second set.\n | 7. The method according to any one of the preceding claims, further comprising the following step: defining the operation data from the target vehicle as reliable only when the reliability level is higher than a first predetermined threshold value.\n | 8. The method according to any one of the preceding claims, further comprising the following step: defining the operation data from the target vehicle as unreliable if the reliability level is below a second predetermined threshold value.\n | 9. The method according to claim 1, wherein the first set of operating data relates to at least one of a speed of the target vehicle, an acceleration, a deceleration, and the like.\n | 10. The method according to claim 2, further comprising the following step: determining an identifier of the target vehicle using a second sensor included in the autonomous vehicle.\n | 11. The method according to claim 4, wherein the predetermined model represents a statistical behaviour of the set of operational data.\n | 12.
The method according to any one of the preceding claims, further comprising the following steps: establishing a network connection with a server disposed outside the autonomous vehicle, and requesting the predetermined model from the remote server.\n | 13. The method according to claim 12, further comprising the following step: providing an updated model if the reliability level is higher than a third predetermined threshold value.\n | 14. The method according to claim 4, wherein the first sensor is at least one of a radar, a lidar sensor, or a camera.\n | 15. The method according to any one of the preceding claims, wherein the operation data is vehicle-to-vehicle (V2V) data.\n | 16. A control system comprised with an autonomous vehicle, the control system comprising a control unit adapted to determine the reliability level of data received by the autonomous vehicle from a target vehicle, wherein the target vehicle is different from the autonomous vehicle and is arranged near the autonomous vehicle, and wherein the control unit is adapted to: receive a first set of operational data associated with the target vehicle using wireless communication, and determine the reliability level based on the first set of operational data and a predetermined model indicative of an expected behaviour of the first set of operational data.\n | 17. The system according to claim 16, wherein the predetermined model is further dependent on the target vehicle.\n | 18. The system according to any one of claims 16 and 17, wherein the predetermined model further depends on an expected change over time of the first set of operation data.\n | 19.
The system according to claim 16, wherein the control unit is further adapted to: determine a second set of operational data associated with the target vehicle using a first sensor included in the autonomous vehicle, and determine a difference between the first set of operational data and the second set, wherein the determination of the reliability level is further based on the determined difference.\n | 20. The system according to claim 19, wherein the predetermined model further indicates an expected behaviour of the second set of operational data.\n | 21. The system according to claim 19, wherein the predetermined model further indicates an expected behaviour of the difference between the first set of operational data and the second set.\n | 22. The system according to any one of claims 16 to 21, wherein the control unit is further adapted to: define the operation data from the target vehicle as reliable only when the reliability level is higher than a first predetermined threshold value.\n | 23. The system according to any one of claims 16 to 22, wherein the control unit is further adapted to: define the operation data from the target vehicle as unreliable if the reliability level is below a second predetermined threshold value.\n | 24. A vehicle comprising the control system according to any one of claims 16 to 23.\n | 25. The vehicle according to claim 24, wherein the vehicle is a truck, a car, a bus, or a working machine.\n | 26.
A computer program product comprising a non-transitory computer readable medium storing a computer program for operating a control system comprised with an autonomous vehicle, the control system being adapted to determine the reliability level of data received by the autonomous vehicle from a target vehicle, wherein the target vehicle is different from the autonomous vehicle and is arranged near the autonomous vehicle, and the control system comprises a control unit, wherein the computer program product comprises: code for receiving, at the control unit, a first set of operational data associated with the target vehicle using wireless communication, and code for determining, by the control unit, the reliability level based on the first set of operational data and a predetermined model indicative of an expected behaviour of the first set of operational data.", "output": "The method involves receiving a first set of operational data relating to the target vehicle, at the control unit and using wireless communication. The reliability level, based on the first set of operational data and a predetermined model indicative of an expected behavior of the first set of operational data, is determined by the control unit. The predetermined model is further dependent on the target vehicle. The predetermined model is dependent on an expected variation over time of the first set of operational data. The method comprises determining, by the control unit, a second set of operational data related to the target vehicle using a first sensor comprised with the ego vehicle. INDEPENDENT CLAIMS are included for the following: a control system comprised with an ego vehicle; a computer program product; and a vehicle comprising a control system. Method for use in determining a reliability level of data received by an ego vehicle from a target vehicle being different from the ego vehicle.
Greatly improves the determination of the reliability level for the received data. The drawing shows a schematic view of a conceptual control system. 200 Control system, 202 Control unit, 204 Radar, 206 Lidar sensor arrangement, 208 Camera", "instruction": "Please summarize the input" }, { "input": "Method and system of determining trajectory for an autonomous vehicleA method of determining a navigation trajectory for an autonomous ground vehicle (AGV) is disclosed. The method may include receiving first Region of Interest (ROI) data associated with an upcoming trajectory path, and receiving predicted attributes associated with a future navigation trajectory for the upcoming trajectory path. The predicted attributes are derived based on a map of the upcoming trajectory path. The method may further include modifying the predicted attributes based on environmental attributes extracted from the first ROI data to generate modified attributes, and dynamically receiving second ROI data associated with the upcoming trajectory path upon reaching the upcoming trajectory path. The method may further include predicting dynamic attributes associated with an imminent navigation trajectory for the upcoming trajectory path based on the second ROI data, and refining the modified attributes based on the one or more dynamic attributes to generate a final navigation trajectory.What is claimed is: \n | 1.
A method of determining a navigation trajectory for an autonomous ground vehicle (AGV), the method comprising: \nreceiving, by a navigation device, first Region of Interest (ROI) data associated with an upcoming trajectory path, \nwherein the first ROI data is captured using a set of vision sensors installed on one or more road side infrastructures and received over a vehicle-to-infrastructure (V2I) communication network, and \nwherein the first ROI data is environmental data indicative of an obstacle present in a predetermined ROI along the upcoming trajectory path from an anticipated future location of the AGV; \nderiving, by the navigation device, one or more environmental attributes associated with the predetermined ROI based on the first ROI data; \nreceiving, by the navigation device, one or more predicted attributes associated with a static map of the predetermined ROI, from a cloud, \nwherein the one or more predicted attributes are predicted by performing at least one of: a semantic segmentation, an object detection, and a lane detection on map data associated with the predetermined ROI, using a first artificial intelligence (AI) prediction model deployed on a cloud-based computing device; \nmodifying, by the navigation device, the one or more predicted attributes associated with the static map of the predetermined ROI, based on the one or more environmental attributes associated with the predetermined ROI, to generate one or more modified attributes associated with the static map of the predetermined ROI along the future navigation trajectory, wherein the one or more modified attributes are generated by: \nextracting environmental information from the one or more environmental attributes, wherein the environmental information indicates obstacles present in the predetermined ROI; \nreceiving predictions from a cloud server; \nmerging the environmental information and the predictions received from the cloud server; and \ngenerating the one or more modified attributes
based on the merging of the environmental information and the predictions received from the cloud server; \nreceiving, by the navigation device, second ROI data associated with the predetermined ROI along the upcoming trajectory path upon reaching the anticipated future location, \nwherein the second ROI is captured by a current field-of-view (FOV) of the camera sensor mounted on the AGV; \npredicting, by the navigation device, one or more dynamic attributes associated with the predetermined ROI along the upcoming trajectory path based on the second ROI data using a second AI prediction model deployed on the navigation device; \ndetermining, by the navigation device, an error based on a comparison between the one or more modified attributes and the one or more dynamic attributes; \ngenerating, by the navigation device, one or more refined attributes by correcting the one or more modified attributes based on the error; \nupdating, by the navigation device, the future navigation trajectory based on the one or more refined attributes to generate a final navigation trajectory, wherein the one or more modified attributes are refined based on the one or more dynamic attributes to generate the final navigation trajectory; and \ncontrolling, by the navigation device, the AGV to follow the final navigation trajectory. \n | 2. The method of claim 1, wherein the one or more dynamic attributes associated with the predetermined ROI are predicted by the second AI prediction model by performing at least one of: the semantic segmentation, the object detection, and the lane detection.\n | 3. The method of claim 1, wherein the first ROI data is further captured using a set of vision sensors installed on other AGVs, and wherein the first ROI data is provided to the navigation device over a vehicle-to-vehicle (V2V) communication network.\n | 4.
The method of claim 1, wherein the one or more environmental attributes comprise at least one of a type of an obstacle present in the upcoming trajectory path, and a location of the obstacle.\n | 5. A navigation device for determining a navigation trajectory for an autonomous ground vehicle (AGV), the navigation device comprising: \na processor; and \na memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to: \nreceive first Region of Interest (ROI) data associated with an upcoming trajectory path, \nwherein the first ROI data is captured using a set of vision sensors installed on one or more road side infrastructures and received over a vehicle-to-infrastructure (V2I) communication network, \nwherein the first ROI data is environmental data indicative of an obstacle present in a predetermined ROI along the upcoming trajectory path from an anticipated future location of the AGV; \nderive one or more environmental attributes associated with the predetermined ROI based on the first ROI data; \nreceive one or more predicted attributes associated with a static map of the predetermined ROI, from a cloud, \nwherein the one or more predicted attributes are predicted by performing at least one of: a semantic segmentation, an object detection, and a lane detection on map data associated with the predetermined ROI using a first artificial intelligence (AI) prediction model deployed on a cloud-based computing device; \nmodify the one or more predicted attributes associated with the static map of the predetermined ROI, based on the one or more environmental attributes associated with the predetermined ROI, to generate one or more modified attributes associated with the static map of the predetermined ROI along the future navigation trajectory, wherein the one or more modified attributes are generated by: \nextracting environmental information from the one or more environmental attributes, wherein
the environmental information indicates obstacles present in the predetermined ROI, \nreceiving predictions from a cloud server, \nmerging the environmental information and the predictions received from the cloud server, and \ngenerating the one or more modified attributes based on the merging of the environmental information and the predictions received from the cloud server, \nreceive second ROI data associated with the predetermined ROI along the upcoming trajectory path upon reaching the anticipated future location, \nwherein the second ROI is captured by a current field-of-view (FOV) of the camera sensor mounted on the AGV; \npredict one or more dynamic attributes associated with the predetermined ROI along the upcoming trajectory path based on the second ROI data using a second AI prediction model deployed on the navigation device; \ndetermine an error based on a comparison between the one or more modified attributes and the one or more dynamic attributes; \ngenerate one or more refined attributes by correcting the one or more modified attributes based on the error; \nupdate the future navigation trajectory based on the one or more refined attributes to generate a final navigation trajectory, wherein the one or more modified attributes are refined based on the one or more dynamic attributes to generate the final navigation trajectory; and \ncontrol the AGV to follow the final navigation trajectory. \n | 6. The navigation device of claim 5, wherein the one or more dynamic attributes associated with the predetermined ROI are predicted by the second AI prediction model by performing at least one of: the semantic segmentation, the object detection, and the lane detection.\n | 7.
The navigation device of claim 5, wherein the first ROI data is captured using a set of vision sensors installed on other AGVs, and wherein the first ROI data is provided to the navigation device over a vehicle-to-vehicle (V2V) communication network.\n | 8. The navigation device of claim 5, wherein the one or more environmental attributes comprise at least one of a type of an obstacle present in the upcoming trajectory path, and a location of the obstacle.\n | 9. A non-transitory computer-readable storage medium having stored thereon a set of computer-executable instructions causing a computer comprising one or more processors to perform steps comprising: \nreceiving first Region of Interest (ROI) data associated with an upcoming trajectory path, \nwherein the first ROI data is captured using a set of vision sensors installed on one or more road side infrastructures, \nwherein the first ROI data is environmental data indicative of an obstacle present in a predetermined ROI along the upcoming trajectory path from an anticipated future location of an autonomous ground vehicle (AGV); \nderiving one or more environmental attributes associated with the predetermined ROI based on the first ROI data; \nreceiving one or more predicted attributes associated with a static map of the predetermined ROI, from a cloud, \nwherein the one or more predicted attributes are predicted by performing at least one of: a semantic segmentation, an object detection, and a lane detection on map data associated with the predetermined ROI using a first artificial intelligence (AI) prediction model deployed on a cloud-based computing device; \nmodifying the one or more predicted attributes associated with the static map of the predetermined ROI, based on the one or more environmental attributes associated with the predetermined ROI to generate one or more modified attributes associated with the static map of the predetermined ROI along the future navigation trajectory, wherein the one or
more modified attributes are generated by: \nextracting environmental information from the one or more environmental attributes, wherein the environmental information indicates obstacles present in the predetermined ROI, \nreceiving predictions from a cloud server, \nmerging the environmental information and the predictions received from the cloud server, and \ngenerating the one or more modified attributes based on the merging of the environmental information and the predictions received from the cloud server, \nreceiving second ROI data associated with the predetermined ROI along the upcoming trajectory path upon reaching the anticipated future location, \nwherein the second ROI is captured by a current field-of-view (FOV) of the camera sensor mounted on the AGV; \npredicting one or more dynamic attributes associated with the predetermined ROI along the upcoming trajectory path based on the second ROI data using a second AI prediction model deployed on the navigation device; \ndetermining an error based on a comparison between the one or more modified attributes and the one or more dynamic attributes; \ngenerating one or more refined attributes by correcting the one or more modified attributes based on the error; \nupdating the future navigation trajectory based on the one or more refined attributes to generate a final navigation trajectory, wherein the one or more modified attributes are refined based on the one or more dynamic attributes to generate the final navigation trajectory; and \ncontrolling the AGV to follow the final navigation trajectory. \n | 10. The non-transitory computer-readable storage medium of claim 9, wherein the one or more dynamic attributes associated with the predetermined ROI are predicted by the second AI prediction model by performing at least one of: a semantic segmentation, an object detection, and a lane detection.\n | 11.
The non-transitory computer-readable storage medium of claim 9, wherein the first ROI data is captured using a set of vision sensors installed on other AGVs, and wherein the first ROI data is provided to the navigation device over a vehicle-to-vehicle (V2V) communication network.\n | 12. The non-transitory computer-readable storage medium of claim 9, wherein the one or more environmental attributes comprise at least one of a type of an obstacle present in the upcoming trajectory path, and a location of the obstacle.", "output": "The method involves receiving region of interest (ROI) data associated with an upcoming trajectory path by a navigation device, where the ROI data is captured while approaching the trajectory path from an anticipated future location of an autonomous ground vehicle (AGV). A set of predicted attributes associated with a future navigation trajectory for the trajectory path is received by the navigation device. The attributes are modified based on a set of environmental attributes to generate modified attributes. The modified attributes are refined based on the dynamic attributes by the device to generate a final navigation trajectory by using an artificial intelligence (AI) prediction model. INDEPENDENT CLAIMS are included for the following: a navigation device for determining a navigation trajectory for an AGV; and a non-transitory computer-readable storage medium storing a program for determining a navigation trajectory for an AGV. Method for determining a navigation trajectory for an AGV, i.e. an autonomous ground vehicle, in indoor and outdoor settings to facilitate efficient transportation. The AGV is capable of sensing the dynamically changing environment and of accurately navigating without any human intervention. The method provides automatic classification of the objects detected in an environment and enhances mapping for the AGVs.
The drawing shows a schematic view of the system for determining a navigation trajectory. 602 Computing system, 604 Processor, 608 Input device, 610 Output device, 626 RAM", "instruction": "Please summarize the input" }, { "input": "MODIFYING BEHAVIOR OF AUTONOMOUS VEHICLE BASED ON PREDICTED BEHAVIOR OF OTHER VEHICLESA vehicle configured to operate in an autonomous mode could determine a current state of the vehicle and the current state of the environment of the vehicle. The environment of the vehicle includes at least one other vehicle. A predicted behavior of the at least one other vehicle could be determined based on the current state of the vehicle and the current state of the environment of the vehicle. A confidence level could also be determined based on the predicted behavior, the current state of the vehicle, and the current state of the environment of the vehicle. In some embodiments, the confidence level may be related to the likelihood of the at least one other vehicle to perform the predicted behavior. The vehicle in the autonomous mode could be controlled based on the predicted behavior, the confidence level, and the current state of the vehicle and its environment.|1.
A method, comprising: \n * determining, using a computer system (112), a current state of a vehicle (308), wherein the vehicle is configured to operate in an autonomous mode;\n * determining, using the computer system, a current state of an environment of the vehicle (308), wherein the environment of the vehicle (308) comprises at least one other vehicle (312, 314);\n * determining, using the computer system, a predicted behavior of the at least one other vehicle (312, 314) based on at least the current state of the vehicle (308) and the current state of the environment of the vehicle (308);\n * determining, using the computer system, a confidence level, wherein the confidence level comprises a likelihood of the at least one other vehicle (312, 314) to perform the predicted behavior, and wherein the confidence level is based on at least the predicted behavior, the current state of the vehicle (308), and the current state of the environment of the vehicle (308); and\n * controlling, using the computer system, the vehicle (308) in the autonomous mode based on the predicted behavior, the confidence level, the current state of the vehicle (308), and the current state of the environment of the vehicle (308).\n | 2. The method of claim 1, wherein determining the current state of the vehicle comprises determining at least one of a current speed of the vehicle, a current heading of the vehicle, a current position of the vehicle, and a current lane of the vehicle.\n | 3. The method of claim 1, wherein determining the current state of the environment of the vehicle comprises determining at least one of a respective position of the at least one other vehicle, a respective speed of the at least one other vehicle, and a position of an obstacle.\n | 4. 
The method of claim 1, wherein controlling the vehicle comprises at least one of controlling the vehicle to accelerate, controlling the vehicle to decelerate, controlling the vehicle to change heading, controlling the vehicle to change lanes, controlling the vehicle to shift within the current lane and controlling the vehicle to provide a warning notification.\n | 5. The method of claim 1, wherein the predicted behavior is determined by obtaining a match or near match between the current states of the vehicle and the environment of the vehicle and predetermined scenarios.\n | 6. The method of claim 4, wherein the warning notification comprises at least one of a horn signal, a light signal, and a vehicle-to-vehicle communication message transmission and optionally wherein the vehicle-to-vehicle communication message transmission is transmitted using a dedicated short range communications (DSRC) device.\n | 7. A vehicle (308), comprising: \n * at least one sensor (310), wherein the at least one sensor is configured to acquire: \n * i) vehicle state information; and\n * ii) environment state information;\n * wherein the vehicle state information comprises information about a current state of the vehicle (308), wherein the environment state information comprises information about a current state of an environment of the vehicle (308), wherein the environment of the vehicle comprises at least one other vehicle (312, 314); and\n * a computer system configured to: \n * i) determine the current state of the vehicle (308) based on the vehicle state information;\n * ii) determine the current state of the environment of the vehicle (308) based on the environment state information;\n * iii) determine a predicted behavior of the at least one other vehicle (312, 314) based on at least the current state of the vehicle (308) and the current state of the environment of the vehicle (308);\n * iv) determine a confidence level, wherein the confidence level comprises a likelihood of 
the at least one other vehicle (312, 314) to perform the predicted behavior, and wherein the confidence level is based on at least the predicted behavior, the current state of the vehicle (308), and the current state of the environment of the vehicle (308); and\n * v) control the vehicle (308) in the autonomous mode based on the predicted behavior, the confidence level, the current state of the vehicle (308), and the current state of the environment of the vehicle (308).\n | 8. The vehicle of claim 7, wherein the at least one sensor comprises at least one of a camera, a radar system, a lidar system, a global positioning system, and an inertial measurement unit.\n | 9. The vehicle of claim 7, wherein the computer system is further configured to determine the current state of the vehicle based on at least one of a current speed of the vehicle, a current heading of the vehicle, a current position of the vehicle, and a current lane of the vehicle.\n | 10. The vehicle of claim 7, wherein the computer system is further configured to determine the current state of the environment of the vehicle based on at least one of a respective position of the at least one other vehicle, a respective speed of the at least one other vehicle, a position of an obstacle, and a map of the roadway.\n | 11. The vehicle of claim 7, wherein the computer system is further configured to cause at least one of accelerating the vehicle, decelerating the vehicle, changing a heading of the vehicle, changing a lane of the vehicle, shifting a position of the vehicle within a current lane, and providing a warning notification.\n | 12. The vehicle of claim 7, wherein the warning notification comprises at least one of a horn signal, a light signal, and a vehicle-to-vehicle communication message transmission.\n | 13. The vehicle of claim 7, wherein the vehicle-to-vehicle communication message transmission is transmitted using a dedicated short range communications (DSRC) device.\n | 14. 
A non-transitory computer readable medium having stored therein instructions executable by a computer system to cause the computer system to perform functions comprising: \n * determining a current state of a vehicle (308), wherein the vehicle is configured to operate in an autonomous mode;\n * determining a current state of an environment of the vehicle (308), wherein the environment of the vehicle comprises at least one other vehicle (312, 314);\n * determining a predicted behavior of the at least one other vehicle (312, 314) based on at least the current state of the vehicle (308) and the current state of the environment of the vehicle (308);\n * determining a confidence level, wherein the confidence level comprises a likelihood of the at least one other vehicle (312, 314) to perform the predicted behavior, and wherein the confidence level is based on at least the predicted behavior, the current state of the vehicle (308) and the current state of the environment of the vehicle (308); and\n * controlling the vehicle (308) in the autonomous mode based on the predicted behavior, the confidence level, the current state of the vehicle (308), and the current state of the environment of the vehicle (308).\n | 15. A computer program to be executed by the computer system of the vehicle claimed in any one of claims 7 to 13 to perform a method as claimed in any one of claims 1 to 6.", "output": "The behavior modification method involves determining confidence level which comprises the likelihood of at least one other vehicle (314,316) to perform a predicted behavior including acceleration, deceleration, change heading, change lanes, and leaving roadway. The confidence level is determined based on predicted behavior of other vehicle, current vehicle state, and current vehicle environment state. Own vehicle (308) is controlled in autonomous mode based on the predicted behavior, confidence level, current vehicle state, and current vehicle environment state. 
INDEPENDENT CLAIMS are included for the following: a vehicle; and a non-transitory computer readable medium for storing instructions executable by computer system. Behavior modification method for vehicle (claimed) e.g. truck based on predicted behavior of other vehicle. Interaction between vehicles is allowed through the peripherals. Safety is improved through the computer system that causes the vehicle to slow down slightly by reducing the throttle. The drawing shows the top view of the autonomous vehicle operating scenario. 302Left-most lane304Center lane306Right-most lane308Own vehicle314,316Other vehicle320Scenario
The portable controllers comprise a central control unit (CCU), and a plurality of sensors and processors, and a plurality of data storages, and a plurality of data links, and a Global Positioning System (GPS). The portable controllers have standardized interfaces that match with that of the communication ports. The invention disclosed herein enables all automobiles to be ready for autonomous driving with minimal cost, provided that the said communication ports are adapted to the automobiles. The said portable controllers integrate all the hardware and software relevant to autonomous driving as standalone devices which can share the components, simplify the systems, reduce parasitic material and components, and most importantly, will be safer when multiple sensors and processors that are based on different physics are grouped together to detect objects and environment conditions. A method of compound sensor clustering (CSC) is introduced herein. The CSC method makes the sensors and processors self-organize and address real-world driving conditions. The portable controllers can be mass-produced as standard consumer electronics at lower cost. The portable controllers can also be more easily updated with the latest technologies since they are standalone devices, which would be otherwise hard to achieve when the hardware and software are built-in permanently as part of the automobiles. The invention disclosed herein is more efficient, since the portable controllers can be plugged-in to the automobiles when there are needs for autonomous driving, compared with current methods of integrating autonomous driving control hardware and software that are built-in to automobiles permanently, which may not be used for autonomous driving frequently. The system also decouples the liability from automotive manufacturers in case of accidents. 
The portable controllers can be insured by insurance companies independently, much like insuring human drivers.I claim: \n | 1. An autonomous driving system for an automobile, comprising: \na) one or more common electronic communication ports ( 100) for autonomous driving, wherein the communication ports are built-in on the automobile;\nb) one or more universal autonomous driving portable controllers, wherein said portable controllers are plugged in to the exterior of the automobile via the communication ports to detect a driving environment and to control the automobile for autonomous driving; wherein the communication ports and portable controllers share common interfaces; \nc) said one or more communication ports having a primary high speed control area network wherein said primary high speed control area network providing communication between said one or more portable controllers via said one or more communication ports, and at least one electronic control unit, further wherein said at least one electronic control unit configured to control at least one of steering, braking, and acceleration; \nd) said one or more communication ports having a secondary control area network, said secondary control area network configured to provide electronic communication, via said one or more communication ports, between said one or more portable controllers and secondary electronic control units, said secondary electronic control units configured to control at least one of turn signals, brake lights, emergency lights, head lamps and tail lamps, fog lamps, windshield wipers, defrosters, defogs, window regulators, and door locks; \ne) said one or more communication ports having a tertiary control area network configured to electronically communicate at least one feedback parameter to said one or more portable controllers, via said one or more communication ports, said at least one feedback parameter comprised one or more of velocity, acceleration, ABS activation, airbag 
deployment, and traction control activation; \nf) said one or more communication ports having a quaternary control area network configured to electronically communicate at least one status parameter to said one or more portable controllers via said one or more communication ports, said at least one status parameter comprised of one or more of fuel level, battery charge, tire pressure, engine oil level, coolant temperature, and windshield washer level; \ng) said one or more communication ports having a buffer memory controller that provides emergency control instruction for emergency stops of the automobiles in the event of loss of electronic connection with the portable controller due to accidents or other failure conditions; \nh) said one or more communication ports having electronic connections to the portable controllers and adapted to take at least one of the methods of: wired pin connections, wireless connections, or combinations of wired pin and wireless connections; \ni) said one or more portable controllers adapted for mounting locations and anchorages for the portable controllers, which match with the configurations of the portable controllers; \nj) a driver interface, said driver interface positioned to enable the driver to provide driving instructions to said one or more portable controllers; \nk) said one or more portable controllers having a plurality of sensors, said plurality of sensors comprising: \ni. one or more digital color cameras that detect optical information; \nii. one or more LIDARs that detect geometrical information; \niii. task specific sensors, including one or more ultrasonic sensors to detect near distance objects; \niv. one or more RADARs to detect median and far distance objects; \nv. one or more thermal imaging cameras or passive infrared sensors to detect objects that have heat emissions; \nvi. one or more three dimensional accelerometers to detect acceleration and vibration in vertical, lateral, and fore/aft directions; \nvii. 
one or more gyroscopes to detect inclination angles; \nviii. one or more physical-chemical sensors which adapted to detect specific air contents; \nix. one or more sound sensors to detect human languages or warning sirens; \nx. one or more water sensors for detecting rain and rain intensity; \nxi. one or more temperature sensors adapted for detecting temperature at the vicinity of the automobiles; \nl) said one or more portable controllers having a plurality of processors comprising: \ni. one or more processors for the digital color cameras; \nii. one or more processors for the LIDARs; \niii. one or more processors for the ultrasonic sensors; \niv. one or more processors for the RADARs; \nv. one or more processors for the thermal imaging cameras or passive infrared sensors; \nvi. one or more processors for the one or more three dimensional accelerometers; \nvii. and one or more processors for the gyroscopes; \nviii. one or more processors for the physical-chemical sensors; \nix. one or more processors for the sound sensors; \nx. one or more processors for the water sensors; \nxi. 
one or more processors for the temperature sensors; \nm) said one or more portable controllers programmed to generate driving instructions based on information from said plurality of processors; said processors of the plurality of processors programmed to generate queries addressing specific driving conditions, said specific driving conditions being determined by pre-defined criteria, wherein said queries include queries between the processors of said plurality of processors, said queries programmed in the processors; \nn) said one or more portable controllers having a Central Control Unit to direct the operations of the processors; \no) said one or more portable controllers having a plurality of communication links to send and/or receive data, said communication links including vehicle-to-vehicle and vehicle-to-infrastructure links; \np) said one or more portable controllers having a global positioning system to identify the locations of the automobiles to which the portable controllers are plugged-in; \nq) said one or more universal autonomous driving portable controllers are compatible with said communication ports. \n | 2. The autonomous driving system of claim 1 wherein, \na. the processors of said plurality of processors are integrated into said one or more portable controllers, \nb. the sensors of said plurality of sensors are each integrated with at least one of the processors. \n | 3. The autonomous driving system of claim 2 wherein the sensors of said plurality of sensors are built on one or more common substrates and/or integrated circuit boards.\n | 4. The autonomous driving system of claim 2 further comprising wherein querying sensors are dynamically organized as clusters to function as groups such that sensors and processors communicate with each other to validate sensed information pertaining to specific driving conditions.\n | 5. 
The autonomous driving system of claim 2 further comprising wherein queries function to detect mismatches between information between sensors and alert the Central Control Unit when mismatches are found.\n | 6. The autonomous driving system of claim 5 wherein a mismatch between LIDARs and RADARs generates an alert to the central Control Unit, thereby enabling the Central Control Unit to respond to potential hazards.\n | 7. The autonomous driving system of claim 5 wherein information derived from queries from the temperature sensors and water sensors is used to jointly determine a potential freezing rain condition.\n | 8. The autonomous driving system of claim 5 wherein the queries for detection of said potential freezing rain condition include detection of rain, and/or ice, and/or snow using captured images and pattern recognition.\n | 9. The autonomous driving system of claim 5 wherein detection of smoke by said physical-chemical sensors is used to query the thermal imaging cameras or passive infrared sensors to determine if there is a hazardous fire condition.\n | 10. The autonomous driving system of claim 5 wherein road curvatures are detected by the cameras and/or LIDARs when lateral acceleration is detected by combined information from said gyroscopes and accelerometers to inform the central control unit of lateral stability status.\n | 11. The autonomous driving system of claim 8 wherein ABS activation feedback triggers querying the water and temperature sensors.\n | 12. The autonomous driving system of claim 2 wherein the cameras are queried to identify icy road surfaces, thereby generating a categorized information of low coefficient of friction road surface to the Central Control Unit.\n | 13. The autonomous driving system of claim 5 wherein information derived from queries from the cameras and thermal sensors is used to jointly verify an existence of pedestrians.\n | 14. The autonomous driving system of claim 5 wherein, \na. 
said thermal sensors are queried to detect a human heat signature, and if the human heat signature is detected, then, \nb. the thermal sensor's processor queries object detection sensors for the presence of a human, said object sensors comprising the cameras, LIDARs, RADAR and/or the ultrasonic sensors. \n | 15. The autonomous driving system of claim 5 wherein information derived from the RADARs and/or the ultrasonic sensors detection of a potential road sign generates a query to the cameras, thereby reducing likelihood of missing or misidentifying road signs.\n | 16. The autonomous driving system of claim 15 wherein the queries from a RADAR are generated for detection of a road sign not identified or misidentified by camera captured images and pattern recognition.\n | 17. The autonomous driving system of claim 3 further comprising wherein querying sensors are dynamically organized as clusters to function as groups such that sensors and processors communicate with each other to validate sensed information pertaining to specific driving conditions.\n | 18. The autonomous driving system of claim 17, further comprising wherein queries function to detect mismatches between information between sensors and alert the Central Control Unit when mismatches are found.\n | 19. The autonomous driving system of claim 14, wherein the sensors of said plurality of sensors are built on one or more common substrates and/or integrated circuit boards.\n | 20. 
The autonomous driving system of claim 3, wherein one or more of the queries from at least one of said one or more RADARs are generated for detection of road signs not identified or misidentified by camera captured images and pattern recognition.", "output": "The computerized control system has common electronic communication ports (100) that are built-in on each of automobiles, and one or more universal autonomous driving portable controllers (200) that can be attached to the automobiles via the communication ports to accomplish the computerized control or autonomous driving. INDEPENDENT CLAIMS are included for the following: a design for buffer memory controller (BMC); a design of location of interface of communication ports; a design of the communication ports; a design of universal autonomous driving portable controllers; a sensor; a compound sensor clustering method; a back-up safety mechanism of interacting with the buffer memory controller; and manufacturing rights of the electronic communication port of autonomous driving. Computerized control system or autonomous driving for automobiles. Ensures that computerized control or autonomous driving is much more efficient, since the portable controllers can be plugged-in to any of the automobiles that are equipped with the communication ports when there are needs for autonomous driving. The drawing shows the design of universal autonomous driving portable controller and its relation to common electronic communication port of autonomous driving. 100Common electronic communication ports160Mounting fixtures on automobiles200Autonomous driving portable controllers221Data storages230Central control unit241Data links260Wired or wireless user interface", "instruction": "Please summarize the input" }, { "input": "PREDICTING REALISTIC TIME OF ARRIVAL FOR QUEUE PRIORITY ADJUSTMENTA queue prioritization system and method for predicting a realistic time of arrival for performing a queue priority adjustment is provided. 
The method includes the steps of determining an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority, tracking a current location and predicting a route to be taken to arrive at the destination from the current location, detecting a schedule-altering event of the first user by analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle, and reprioritizing the queue priority database, in response to calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the schedule-altering event of the first user.CLAIMS \n | 1. A method for predicting a realistic time of arrival for performing a queue priority adjustment, the method comprising: \n determining, by a processor of a computing system, an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority of the first user and a queue priority of the second user, in a queue priority database, the queue priority of the first user being higher than the queue priority of the second user; \n tracking, by the processor, a current location of the first user and a current location of the second user, during transit to the destination; \n predicting, by the processor, a route to be taken to arrive at the destination from the current location of the first user and the current location of the second user, respectively; detecting, by the processor, a schedule-altering event of the first user by analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle; and \n reprioritizing, by the processor, the queue priority database, in response to calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the 
schedule-altering event of the first user. \n | 2. The method of claim 1, wherein determining the estimated initial arrival time of the first user and the second user includes: \n receiving, by the processor, a customer pick-up order and current GPS location information of the first user and the second user, the GPS location information obtained from a mobile device of the first user and the second user; \n reviewing, by the processor, historical user data, including a historical path taken by the first user and the second user to the destination; \n evaluating, by the processor, a complexity of the customer pick-up order to determine an earliest store pick-up time; and \n comparing, by the processor, the earliest store pick-up time, the current GPS location of the first user and the second user, and the historical path taken by the first user and the second user to the destination, to determine the estimated initial arrival time, in response to: (i) prompting the first user and the second user to depart for the destination, or (ii) receiving confirmation from the first user and the second user that the first user and the second user have departed for the destination. \n | 3. The method of claim 2, wherein prompting the first user and the second user to depart for the destination includes providing, by the processor, a suggested departure time based on at least one of: the earliest store pick-up time, current traffic conditions, current location of the first user and the second user, and historical traffic patterns of the first user and the second user.\n | 4. 
The method of claim 1, wherein determining the estimated initial arrival time includes: receiving, by the processor, scheduled delivery information and current GPS location information of the first user and the second user; \n reviewing, by the processor, historical delivery pattern data, based on previous \n deliveries to the destination; \n evaluating, by the processor, the scheduled delivery information, the current GPS location information of the first user and the second user, and the historical delivery pattern data, to determine an earliest delivery arrival time. \n | 5. The method of claim 1, wherein the detecting of the schedule-altering event includes receiving data from a plurality of data sources, the plurality of data sources including a current GPS location of the user received from a mobile device of the user, a real-time traffic data received from the mobile device of the user, a real-time traffic data received from a third party application server, a weather data received from the mobile device of the user, a weather data retrieved from a third party application server, a historical traffic pattern information of the user, a sensor data received from one or more sensors associated with the user, a vehicle and traffic information received from a vehicle-to-vehicle communication network, and a combination thereof.\n | 6. The method of claim 1, wherein the schedule-altering event is at least one of: a delay, a traffic jam, a traffic accident, a vehicle failure, a weather occurrence, an intervening stop by the first user, a wrong turn of the user, an alternative route taken by the user, a predicted traffic delay of the first user, and a predicted weather delay of the first user.\n | 7. The method of claim 1, wherein predicting the route of the first user includes analyzing the current location of the first user, current traffic data, construction data, historical routes to the destination taken by the first user, and map data.\n | 8. 
The method of claim 1, wherein reprioritizing the queue priority database causes: (i) an in-store pickup order associated with the second user to be available for pickup when the second user arrives at the destination, and before an in-store pickup order associated with the first user is available for pickup, or (ii) a delivery vehicle operated by the first user to be assigned to an available unloading location at the destination, when the first user arrives at the destination.\n | 9. The method of claim 1, wherein the first user and the second user is a customer, a delivery truck driver, an autonomous vehicle, or an unmanned drone.\n | 10. A computer system, comprising: \n a processor; \n a memory device coupled to the processor; and \n a computer-readable storage device coupled to the processor, wherein the storage device contains program code executable by the processor via the memory device to implement a method for predicting a realistic time of arrival for performing a queue priority adjustment, the method comprising: determining, by a processor of a computing system, an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority of the first user and a queue priority of the second user, in a queue priority database, the queue priority of the first user being higher than the queue priority of the second user; \n tracking, by the processor, a current location of the first user and a current location of the second user, during transit to the destination; \n predicting, by the processor, a route to be taken to arrive at the destination from the current location of the first user and the current location of the second user, respectively; \n detecting, by the processor, a schedule-altering event of the first user by \n analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle; and \n reprioritizing, by the processor, the queue 
priority database, in response to \n calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the schedule-altering event of the first user. \n | 11. The computer system of claim 10, wherein determining the estimated initial arrival time of the first user and the second user includes: \n receiving, by the processor, a customer pick-up order and current GPS location information of the first user and the second user, the GPS location information obtained from a mobile device of the first user and the second user; \n reviewing, by the processor, historical user data, including a historical path taken by the first user and the second user to the destination; \n evaluating, by the processor, a complexity of the customer pick-up order to determine an earliest store pick-up time; and \n comparing, by the processor, the earliest store pick-up time, the current GPS location of the first user and the second user, and the historical path taken by the first user and the second user to the destination, to determine the estimated initial arrival time, in response to: (i) prompting the first user and the second user to depart for the destination, or (ii) receiving confirmation from the first user and the second user that the first user and the second user have departed for the destination. \n | 12. The computer system of claim 11, wherein prompting the first user and the second user to depart for the destination includes providing, by the processor, a suggested departure time based on at least one of: the earliest store pick-up time, current traffic conditions, current location of the first user and the second user, and historical traffic patterns of the first user and the second user. \n | 13. 
The computer system of claim 10, wherein determining the estimated initial arrival time includes: receiving, by the processor, a scheduled delivery information and a current GPS location information of the first user and the second user; \n reviewing, by the processor, a historical delivery pattern data, based on previous deliveries to the destination; \n evaluating, by the processor, the scheduled delivery information, the current GPS location information of the first user and the second user, and the historical delivery pattern data, to determine an earliest delivery arrival time. \n | 14. The computer system of claim 10, wherein the detecting of the schedule-altering event includes receiving data from a plurality of data sources, the plurality of data sources including a current GPS location of the user received from a mobile device of the user, a real-time traffic data received from the mobile device of the user, a real-time traffic data received from a third party application server, a weather data received from the mobile device of the user, a weather data retrieved from a third party application server, a historical traffic pattern information of the user, a sensor data received from one or more sensors associated with the user, a vehicle and traffic information received from a vehicle-to-vehicle communication network, and a combination thereof.\n | 15. The computer system of claim 10, wherein the schedule-altering event is at least one of: a delay, a traffic jam, a traffic accident, a vehicle failure, a weather occurrence, an intervening stop by the first user, a wrong turn of the user, an alternative route taken by the user, a predicted traffic delay of the first user, and a predicted weather delay of the first user.\n | 16. 
The computer system of claim 10, wherein predicting the route of the first user includes analyzing the current location of the first user, current traffic data, construction data, historical routes to the destination taken by the first user, and map data.\n | 17. The computer system of claim 10, wherein reprioritizing the queue priority database causes: (i) an in-store pickup order associated with the second user to be available for pickup when the second user arrives at the destination, and before an in-store pickup order associated with the first user is available for pickup, or (ii) a delivery vehicle operated by the first user to be assigned to an available unloading location at the destination, when the first user arrives at the destination.\n | 18. The computer system of claim 10, wherein the first user and the second user is a customer, a delivery truck driver, an autonomous vehicle, or an unmanned drone.\n | 19. A computer program product, comprising a computer-readable hardware storage device storing a computer-readable program code, the computer-readable program code comprising an algorithm that when executed by a computer processor of a computing system implements a method for predicting a realistic time of arrival for performing a queue priority adjustment, the method comprising: \n determining, by a processor of a computing system, an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority of the first user and a queue priority of the second user, in a queue priority database, the queue priority of the first user being higher than the queue priority of the second user; \n tracking, by the processor, a current location of the first user and a current location of the second user, during transit to the destination; \n predicting, by the processor, a route to be taken to arrive at the destination from the current location of the first user and the current 
location of the second user, respectively; detecting, by the processor, a schedule-altering event of the first user by analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle; and \n reprioritizing, by the processor, the queue priority database, in response to calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the schedule-altering event of the first user. \n | 20. The computer program product of claim 19, wherein determining the estimated initial arrival time of the first user and the second user includes: \n receiving, by the processor, a customer pick-up order and a current GPS location information of the first user and the second user, the GPS location information obtained from a mobile device of the first user and the second user; \n reviewing, by the processor, historical user data, including a historical path taken by the first user and the second user to the destination; \n evaluating, by the processor, a complexity of the customer pick-up order to determine an earliest store pick-up time; and \n comparing, by the processor, the earliest store pick-up time, the current GPS location of the first user and the second user, and the historical path taken by the first user and the second user to the destination, to determine the estimated initial arrival time, in response to: (i) prompting the first user and the second user to depart for the destination, or (ii) receiving confirmation from the first user and the second user that the first user and the second user have departed for the destination. \n | 21. 
The computer program product of claim 20, wherein prompting the first user and the second user to depart for the destination includes providing, by the processor, a suggested departure time based on at least one of: the earliest store pick-up time, current traffic conditions, current location of the first user and the second user, and historical traffic patterns of the first user and the second user.\n | 22. The computer program product of claim 19, wherein determining the estimated initial arrival time includes: \n receiving, by the processor, a scheduled delivery information and current GPS \n location information of the first user and the second user; \n reviewing, by the processor, a historical delivery pattern data, based on previous deliveries to the destination; and evaluating, by the processor, the scheduled delivery information, the current GPS location information of the first user and the second user, and the historical delivery pattern data, to determine an earliest delivery arrival time. \n | 23. The computer program product of claim 19, wherein the detecting of the schedule-altering event includes receiving data from a plurality of data sources, the plurality of data sources including a current GPS location of the user received from a mobile device of the user, a real-time traffic data received from the mobile device of the user, a real-time traffic data received from a third party application server, a weather data received from the mobile device of the user, a weather data retrieved from a third party application server, a historical traffic pattern information of the user, a sensor data received from one or more sensors associated with the user, a vehicle and traffic information received from a vehicle-to-vehicle communication network, and a combination thereof.\n | 24. 
The computer program product of claim 19, wherein the schedule-altering event is at least one of: a delay, a traffic jam, a traffic accident, a vehicle failure, a weather occurrence, an intervening stop by the first user, a wrong turn of the user, an alternative route taken by the user, a predicted traffic delay of the first user, and a predicted weather delay of the first user.\n | 25. The computer program product of claim 19, wherein predicting the route of the first user includes analyzing the current location of the first user, current traffic data, construction data, historical routes to the destination taken by the first user, and map data.\n | 26. The computer program product of claim 19, wherein reprioritizing the queue priority database causes: (i) an in-store pickup order associated with the second user to be available for pickup when the second user arrives at the destination, and before an in-store pickup order associated with the first user is available for pickup, or (ii) a delivery vehicle operated by the first user to be assigned to an available unloading location at the destination, when the first user arrives at the destination.\n | 27. The computer program product of claim 19, wherein the first user and the second user is a customer, a delivery truck driver, an autonomous vehicle, or an unmanned drone.", "output": "The method involves tracking a current location of a first user and a current location of a second user during transit to a destination by a processor (141). A route to be taken to arrive at the destination from the current location of the first user and the second user is predicted by the processor. A schedule-altering event of the first user is detected by the processor by analyzing the predicted route of the first user, or a current state of a vehicle. 
A queue priority database (114) is reprioritized by the processor in response to calculating an updated queue priority of the first user that is lower than the queue priority of the second user based on the detection of the schedule-altering event of the first user. INDEPENDENT CLAIMS are also included for the following: a computer system; and a computer program product comprising a set of instructions for predicting a realistic time of arrival for performing a queue priority adjustment for a user at a retail location by a computer system. Method for predicting a realistic time of arrival for performing a queue priority adjustment for a user e.g. customer, delivery lorry driver, autonomous vehicle or unmanned drone, at a retail location by a computer system (all claimed). The method enables allowing goods to be retrieved in response to the predicted time of arrival based on user input information on a purchase order or to be automatically generated based on complexity of the purchase order. The method enables allowing the customers to pick up purchased items at the retail location selected by the customer by maintaining the queue priority such that the purchased items are ready for the customer when the customer arrives at the retail location. The drawing shows a schematic block diagram of a queue prioritization system. 113Customer database114Queue priority database120Computer system141Processor142Memory", "instruction": "Please summarize the input" }, { "input": "Method for controlling an autonomously operated vehicle and error control moduleThe invention relates to a method for controlling an autonomously operated vehicle (1) in the event of an error (Fm; m = 1,2, ... 
N0), with at least the following steps: \n- Detection of a fault (Fm) in the vehicle (1); \n- Evaluating the detected error (Fm) and, as a function thereof, outputting an error assessment result (EF), the error assessment result (EF) indicating a significance of the detected error (Fm); \n- Determination of a local environment (Uk; k = 1,2, ..., N3) in which the vehicle (1) is located; \n- Selecting and activating an emergency operating mode as a function of the determined local environment (Uk) as well as the output evaluation result (EF) and / or the significance of the detected error (Fm); and \n- Autonomous control of the vehicle (1) as a function of the selected and activated emergency operating mode using at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1).|1. Method for controlling an autonomously operated vehicle (1) in the event of an error (Fm; m = 1,2, ... N0), with at least the following steps:\n- Detection of a fault (Fm) in the vehicle (1) (ST1); \n- Assessment of the detected error (Fm) (ST2) and, as a function thereof, output of an error evaluation result (EF) (ST3), the error evaluation result (EF) having a significance (Bi; i = 1,2 ... N1) of the detected fault (Fm); \n- Determination of a local environment (Uk; k = 1,2, ..., N3) in which the vehicle (1) is located (ST4); \n- Selecting and activating an emergency operating mode (NBnk; n = 1,2, ..., N2) depending on the determined local environment (Uk) and the output evaluation result (EF) and / or the significance (Bi) of the detected error (Fm) (ST5); and \n- Autonomous control of the vehicle (1) as a function of the selected and activated emergency operating mode (NBnk) using at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305 ) in the vehicle (1) (ST6). \n | 2. 
Procedure according to Claim 1, characterized in that the error (Fm) is derived from status signals (So; o = 1, 2, ..., N4) transmitted in the vehicle (1) via a vehicle-internal data transmission system (30), for example a CAN bus (31), or transmitted directly.\n | 3. Procedure according to Claim 2, characterized in that the status signals (So) originate from the at least one movement system (100; 101, 102, 103) of the vehicle (1) and / or environment detection system (300; 301, 302, 303, 304, 305) of the vehicle (1) and / or from further sources (3) of the vehicle (1), for example a V2X module (3), the status signals (So) indicating whether the respective movement system (100; 101, 102, 103) and / or the environment detection system (300; 301, 302, 303, 304, 305) has an error (Fm).\n | 4. Method according to one of the preceding claims, characterized in that as an error (Fm) at least\n- an elementary, fatal error (F1) or \n- a moderate error (F2) or \n- a non-safety-critical error (F3) can be detected. \n | 5. Method according to one of the preceding claims, characterized in that the error evaluation result (EF) output is at least that there is an error (Fm) with high importance (B1) or medium importance (B2) or low importance (B3).\n | 6. 
Procedure according to Claim 4 or 5, characterized in that, in the presence of\n- a fault (Fm) with high significance (B1) and / or an elementary, serious fault (F1), depending on the local environment (Uk), a first emergency operating mode (NB1k) is activated in order to bring the vehicle (1) to a standstill (H) autonomously using the at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1), or \n- an error (Fm) with medium significance (B2) and / or a moderately serious error (F2), depending on the local environment (Uk), a second emergency operating mode (NB2k) is activated in order to move the vehicle (1) autonomously to a stopping area (HB) using the at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1), or \n- an error (Fm) of little importance (B3) and / or a non-safety-critical error (F3), depending on the local environment (Uk), a third emergency operating mode (NB3k) is activated in order to continue the autonomous driving of the vehicle (1) using the at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1). \n | 7. Procedure according to Claim 6, characterized in that in the first emergency operating mode (NB1k) at least one brake system (101) and / or a drive system (102) for reducing the engine power (ML) is controlled autonomously in order to bring the vehicle (1) to a standstill (H), and, as a function of the local environment (Uk), a steering system (103) is additionally controlled autonomously in order to enable an evasive maneuver to a secured area (BS), for example an emergency lane (BS1), before the standstill (H) is reached.\n | 8. 
Procedure according to Claim 6 or 7, characterized in that in the second emergency operating mode (NB2k) a drive system (102) and / or a braking system (101) and / or a steering system (103) of the vehicle (1) are controlled autonomously in such a way that the vehicle (1) moves autonomously with reduced speed (vred) and / or with reduced engine power (ML) along a defined driving trajectory (T) to the stopping area (HB).\n | 9. Procedure according to Claim 8, characterized in that the stopping area (HB) and / or the reduced speed (vred) and / or the engine power (ML) is selected as a function of the local environment (Uk).\n | 10. Method according to one of the Claims 6 to 9, characterized in that the first emergency operating mode (NB1k) is fixed and unchangeable and / or the second emergency operating mode (NB2k) can be expanded and / or the third and / or further emergency operating modes (NBnk, for n >= 3) can be changed and / or expanded.\n | 11. Method according to one of the preceding claims, characterized in that a motorway (U1), a country road (U2), an urban environment (U3), a depot (U4), a construction site (U5) or a port area (U6) are determined as the local environment (Uk).\n | 12. Method according to one of the preceding claims, characterized in that the local environment (Uk) is an environment with a public traffic area (?) or an environment with an enclosed area (G).\n | 13. Method according to one of the preceding claims, characterized in that the local environment (Uk) is determined as a function of position information (PI) and / or environment information (UI), the position information (PI) being based on automatically provided position data (DPa) and / or manually entered position data (DPm), and the environment information (UI) being based on provided environment data (DU) that are output, for example, by environment detection systems (300; 301, 302, 303, 304, 305) in the vehicle (1).\n | 14. 
Procedure according to Claim 13, characterized in that the automatically provided position data (DPa) are output by a position detection device (70) and contain a global position (Pg) of the vehicle (1) and / or are output by a telematics system (400), the telematics system (400) accessing external information (IX) which is transmitted via a local data interface (8).\n | 15. Procedure according to Claim 13 or 14, characterized in that the local environment (Uk) is extracted from the automatically provided position data (DPa) via map data (KD), in particular of a navigation system (7).\n | 16. Method according to one of the preceding claims, characterized in that the autonomously and / or driverlessly controlled vehicle (1) is controlled according to an autonomy level (AS) equal to three or higher.\n | 17. Method according to one of the preceding claims, characterized in that the error (Fm) and / or the error evaluation result (EF), after the detection (ST1) and the evaluation (ST2, ST3) of the error (Fm), is issued via a communication module (50) and / or a V2X module (3), for example to a vehicle operator (K1), a dispatcher (K2), to yard staff (K3) of a depot (U4) and / or to other people (K4) and / or to another vehicle (2) and / or to infrastructure facilities (200).\n | 18. Error control module (60) for autonomous control of a vehicle (1) in the event of an error (Fm), in particular according to a method according to one of the preceding claims, the error control module (60) being designed\n- to select and activate an emergency operating mode (NBnk; n = 1,2, ..., 
N2) as a function of a local environment (Uk), determined from position information (PI) and / or environment information (UI), in which the vehicle (1) is located, and of an evaluation result (EF) output by an error evaluation module (40) and / or a significance (Bi) of an error (Fm) detected by an error detection module (20), and \n- to control the vehicle (1) autonomously, depending on the selected and activated emergency operating mode (NBnk), using at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1). \n | 19. Vehicle (1) with a movement coordination module (10) for coordinating and controlling movement systems (100; 101, 102, 103) and / or surrounding systems (300; 301, 302, 303, 304, 305) in the vehicle (1) for autonomous control of the vehicle (1), an error detection module (20) for detecting an error (Fm) in the vehicle (1), an error evaluation module (40) for evaluating the detected error (Fm) and for outputting an evaluation result (EF), and with an error control module (60) according to Claim 18 for autonomous control of the vehicle (1), in particular via the movement coordination module (10), in the event of an error (Fm).", "output": "The method involves detecting a fault (Fm) in vehicle (1). The assessment of the detected fault is performed. An error evaluation result (EF) is outputted, where the error evaluation result has a significance of detected fault. A local environment in which the vehicle is located is determined. An emergency operating mode is selected and activated as a function of a determined local environment and the evaluation result and/or the significance of the detected fault. The autonomous control of the vehicle is performed as a function of the selected and activated emergency operating mode using one movement system (100-103) and/or environment detection system (300-304) in the vehicle. 
INDEPENDENT CLAIMS are included for the following: an error control module for autonomous control of a vehicle in the presence of an error; and a vehicle with a movement coordination module for coordinating and controlling movement systems and/or surrounding systems in the vehicle. Method for controlling autonomously operated vehicle e.g. truck in event of fault, using error control module (claimed). The autonomous driving operation is ensured safely and efficiently in the event of a fault. The vehicle is autonomously controlled as a function of the local environment when the error occurs, where the error can affect the own vehicle and/or the environment around the own vehicle to a certain extent. The autonomous vehicles are operated in a single local environment, and in different local environments. The efficiency and the possibility of reacting to an error can turn out to be different depending on the local environment, so that the different emergency operating modes are selected or activated depending on the environment. The warning is effectively issued to external persons or vehicles in addition to the autonomous control, when the vehicle is driving without any occupants. The drawing shows a schematic view of the autonomously operated vehicle in local environment with public traffic area. 1Vehicle100-103Movement system300-304Environment detection systemEFError evaluation resultFmFault", "instruction": "Please summarize the input" }, { "input": "METHOD FOR TRANSFORMING BETWEEN A LONG VEHICLE COMBINATION AND A PLATOON ON THE MOVEThe invention relates to a method for transforming between a long vehicle combination (10) and a platoon (12) on the move. The present invention also relates to vehicles (14a-b; 14b-c) for such a method.|1. 
A method for transforming between a long vehicle combination (10) and a platoon (12) on the move, wherein the long vehicle combination (10) comprises a plurality of vehicles (14a-c) mechanically coupled together one after the other, which method comprises the steps of: \n * detecting (S2) that the long vehicle combination (10) is approaching a first road section (58) ahead, by means of a navigation system (22) or by means of wireless vehicle-to-infrastructure (V2I) communication, which first road section (58) stipulates decoupling the vehicles (14a-c) of the long vehicle combination (10) to form the platoon (12);\n * automatically decoupling (S4) the vehicles (14a-c) from each other while the vehicles (14a-c) are in motion to form the platoon (12) before reaching the first road section (58);\n * the platoon (12) driving (S5) through the first road section (58);\n * detecting (S6) a second road section (62), by means of a navigation system (22) or by means of wireless vehicle-to-infrastructure (V2I) communication, which stipulates coupling together the vehicles (14a-c) of the platoon (12) to form the long vehicle combination (10);\n * a vehicle (14a-b) in the platoon (12) immediately ahead of a following vehicle (14b-c) of said platoon (12) sending (S7) information (64) to the following vehicle (14b-c) via wireless vehicle-to-vehicle communication, which information (64) indicates the position and speed of a rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead;\n * based at least on the position and speed indicated in the sent information (64), autonomously driving (S8) the following vehicle (14b-c) so that the rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead of the following vehicle (14b-c) gets within an operational range (66) of a front coupling element (32) of the following vehicle (14b-c);\n * while in motion and when the rear automatic coupling device (18) is within the operational range (66), the 
following vehicle (14b-c) automatically adjusting (S9) a front coupling device (30) including said front coupling element (32) so that the position of the front coupling element (32) matches the position of the rear automatic coupling device (18) as indicated in the sent information (64); and\n * automatically coupling (S10) together the following vehicle (14b-c) and the vehicle (14a-b) immediately ahead while the vehicles (14a-c) are in motion to form at least a part of the long vehicle combination (10),\n wherein each vehicle (14a-b) immediately ahead is adapted to estimate the position of its rear automatic coupling device (18) based on \n * the heading of the vehicle (14a-b) immediately ahead,\n * the position of a part of the vehicle (14a-b) immediately ahead as determined by a navigation system (22) of the vehicle immediately ahead,\n * a vehicle model representing the vehicle (14a-b) immediately ahead,\n * the height of the rear automatic coupling device (18), and\n * in case the vehicle (14a-b) immediately ahead is an articulated vehicle, at least one articulation angle of the vehicle immediately ahead as detected by at least one articulation angle detection means (28) on the vehicle immediately ahead.\n | 2. A method according to claim 1, wherein each following vehicle (14b-c) comprises actuator means (48a-b) adapted to adjust the front coupling device (30).\n | 3. A method according to claim 2, wherein the actuator means (48a-b) is adapted to laterally adjust the front coupling device (30).\n | 4. A method according to claim 2 or 3, wherein the actuator means (48a-b) is adapted to vertically adjust the front coupling device (30).\n | 5. A method according to any preceding claim, wherein each following vehicle (14b-c) comprises means (54) adapted to adjust the length of the front coupling device (30).\n | 6. 
A method according to claim 5, further comprising the step of: shortening (S1) the length of the front coupling device (30) while driving as the long vehicle combination.\n | 7. A method according to any preceding claim, wherein each following vehicle (14b-c) is adapted to estimate the position of its front coupling element (32) based on \n * the heading of the following vehicle (14b-c),\n * the position of a part of the following vehicle (14b-c) as determined by a navigation system (36) of the following vehicle (14b-c),\n * a vehicle model representing the following vehicle (14b-c),\n * a first angle representing a lateral adjustment of the front coupling device (30),\n * a second angle representing any vertical adjustment of the front coupling device (30),\n * the length of the front coupling device (30), and\n * a height related to the front coupling device (30).\n | 8. A method according to any preceding claim, wherein each vehicle immediately ahead (14a-b) comprises at least two independent means (21, 22) for determining its speed.\n | 9. A method according to any preceding claim, further comprising the step of: a leading vehicle of the platoon sending an acceleration or deceleration request (63) to the following vehicles (14b-c) of the platoon (12) via wireless vehicle-to-vehicle communication.\n | 10. A method according to any preceding claim, wherein the information (64) sent from the vehicle (14a-b) immediately ahead to the following vehicle (14b-c) includes the heading of the rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead.\n | 11. A method according to any preceding claim, wherein the first road section (58) is at least one of a bridge, a roundabout, and a turn.\n | 12. 
A method according to any preceding claim, further comprising the step of planning (S3) an inter-vehicle distance (60) between subsequent vehicles based on the first road section (58) ahead, wherein the platoon (12) is driven through the first road section (58) with the planned inter-vehicle distance(s) (60).\n | 13. A method according to any preceding claim, wherein at least one of the automatic decoupling and the automatic coupling is performed while driving at a safety speed.\n | 14. A method according to any preceding claim, wherein the automatic coupling is performed while driving on a straight road.\n | 15. A method according to any preceding claim, wherein the automatic coupling starts with the vehicle (14b) immediately behind the leading vehicle (14a) of the platoon (12) coupling to the leading vehicle (14a) of the platoon (12).\n | 16. A method according to any preceding claim, wherein the automatic decoupling starts with the last vehicle (14c) of the long vehicle combination (10) decoupling from the vehicle immediately ahead (14b).\n | 17. A method according to any preceding claim, wherein each vehicle (14b-c) after the leading vehicle (14a) of the long vehicle combination (10) or platoon (12) is an autonomous vehicle.\n | 18. A method according to any preceding claim, wherein at least one vehicle (14b-c) after the leading vehicle (14a) of the long vehicle combination (10) or platoon (12) is an autonomous dolly (16) and semi-trailer combination.\n | 19. 
A vehicle (14a-b) comprising: \n * a rear automatic coupling device (18); means (21) for speed determination;\n * a control unit (20) adapted to estimate the position of the rear automatic coupling device (18) while the vehicle (14a-b) is in motion based on the heading of the vehicle (14a-b), the position of a part of the vehicle (14a-b) as determined by a navigation system (22) of the vehicle, a vehicle model representing the vehicle (14a-b), the height of the rear automatic coupling device (18) as determined by a height level sensor (24), and in case the vehicle (14a-b) is an articulated vehicle, at least one articulation angle of the vehicle as detected by at least one articulation angle detection means (28) on the vehicle; and\n * communication means (26) adapted to wirelessly send information (64) indicating the estimated position and the speed of the rear automatic coupling device (18) to a following vehicle (14b-c).\n | 20. A vehicle (14b-c) comprising: \n * a front coupling device (30) including a front coupling element (32);\n * a control unit (34) adapted to estimate the position of the front coupling element (32) while the vehicle (14b-c) is in motion;\n * a navigation system (36) and a height level sensor (38);\n * communication means (40) adapted to wirelessly receive information (64) from a vehicle (14a-b) immediately ahead, which information (64) indicates the position and speed of a rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead;\n * autonomous driving means (42) adapted to drive the vehicle (14b-c) based at least on the position and speed in the received information (64) so that the rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead gets within an operational range (66) of the front coupling element (32); and\n * means (48a-b, 54) adapted to automatically adjust the front coupling device (30), while in motion and when the rear automatic coupling device (18) is within the operational range 
(66), so that the position of the front coupling element (32) matches the position of the rear automatic coupling device (18) as indicated in the received information (64).", "output": "The method involves detecting (S6) second road section which stipulates coupling together the vehicles of the platoon to form the long vehicle combination. The information is sent (S7) to the following vehicle through wireless vehicle-to-vehicle communication. The information indicates the position and speed of rear automatic coupling device of the vehicle immediately ahead. The following vehicle is automatically driven (S8) so that the rear automatic coupling device of the vehicle immediately ahead of the following vehicle gets within an operational range of front coupling element of the following vehicle. The front coupling device is automatically adjusted (S9) so that the position of the front coupling element matches the position of the rear automatic coupling device as indicated in the sent information. The following vehicle and the vehicle immediately ahead are automatically coupled (S10) together while the vehicles are in motion to form portion of the long vehicle combination. An INDEPENDENT CLAIM is included for a vehicle. Method for transforming between long vehicle combination and platoon on move, for heavy duty vehicle e.g. truck. The following vehicle can receive the correct speed allowing to safely drive so that the rear automatic coupling device of the vehicle immediately ahead gets within the operational range, even if one of the systems fails. The acceleration or deceleration request sent through wireless vehicle-to-vehicle communication can allow following vehicle to safely drive within the operational range, even if the operational range results in relatively short headway between the following vehicle and the vehicle immediately ahead and even if the speed is relatively high. 
The long vehicle combination can be automatically re-formed in motion after the roundabout or turn, to improve fuel efficiency. The front coupling device is automatically adjusted so that the position of the front coupling element matches the position of the rear automatic coupling device as indicated in the sent information. The drawing shows a flowchart illustrating the method for transforming between long vehicle combination and platoon on move. S6Step for detecting second road section which stipulates coupling together the vehicles of the platoon to form the long vehicle combinationS7Step for sending information to the following vehicle through wireless vehicle-to-vehicle communicationS8Step for automatically driving following vehicle so that the rear automatic coupling device of the vehicle immediately ahead of the following vehicle gets within an operational range of front coupling element of the following vehicleS9Step for automatically adjusting front coupling device so that the position of the front coupling element matches the position of the rear automatic coupling device as indicated in the sent informationS10Step for automatically coupling together the following vehicle and the vehicle immediately ahead while the vehicles are in motion to form portion of the long vehicle combination", "instruction": "Please summarize the input" }, { "input": "A METHOD FOR PROVIDING A POSITIVE DECISION SIGNAL FOR A VEHICLEA method for providing a positive decision signal for a vehicle which is about to perform a traffic scenario action. 
The method includes receiving information about at least one surrounding road user, which information is indicative of distance to the surrounding road user with respect to the vehicle and at least one of speed and acceleration of the surrounding road user; calculating a value based on the received information; providing the positive decision signal to perform the traffic scenario action when the calculated value is fulfilling a predetermined condition. The value is calculated based on an assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration.|1. A method for providing a positive decision signal for a vehicle which is about to perform a traffic scenario action, such as entering a crossing, entering a highway and/or changing lanes, the method comprising: \nreceiving information about at least one surrounding road user, which information is indicative of distance to the surrounding road user with respect to the vehicle and at least one of speed and acceleration of the surrounding road user; \ncalculating a value based on the received information; \nproviding the positive decision signal to perform the traffic scenario action when the calculated value is fulfilling a predetermined condition, wherein the value is calculated based on an assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration, characterized in that, the surrounding road user is a predefined virtual surrounding road user and the predetermined condition is defined by a threshold value which is indicative of an acceleration limit for the surrounding road user. \n | 2. The method according to claim 1, wherein the value is further calculated based on the assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration after a reaction time.\n | 3. 
The method according to claim 1, wherein the value is further calculated based on the assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration to an acceleration profile having a constant acceleration.\n | 4. The method according to claim 1, wherein the value is further calculated based on the assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration to an acceleration profile having a variable acceleration.\n | 5. The method according to claim 1, further comprising providing a negative decision signal not to perform the traffic scenario action when the calculated value is not fulfilling the predetermined condition.\n | 6. The method according to claim 1, wherein the threshold value is variable depending on at least one factor, such as any one of speed of the surrounding road user, type of surrounding road user, ambient weather conditions with respect to the vehicle and a state of the surrounding road user, such as a state where a turning indicator is active.\n | 7. The method according to claim 1, wherein the information about the at least one surrounding road user is received by any one of a perception sensor of the vehicle, a V2X communication interface and a remote perception sensor which is in communicative contact with the vehicle.\n | 8. The method according to claim 1, wherein the method is used as a safety control method for an autonomous vehicle, wherein the autonomous vehicle is primarily performing traffic scenario actions by use of a primary autonomous vehicle control method, and wherein a traffic scenario action permitted to be performed by the primary autonomous vehicle control method is allowed to be performed if also the positive decision signal is provided.\n | 9. 
The method according to claim 1, wherein the calculated value is further based on auxiliary information relating to the traffic scenario action, such as any one of shape and/or dimension(s) of a crossing, a road lane and a neighboring road lane.\n | 10. A method for automatically performing a traffic scenario action of a vehicle, comprising: \nproviding a positive decision signal to perform the traffic scenario action, which positive decision signal has been provided according to the method of claim 1; and\nautomatically performing the traffic scenario action. \n | 11. A method for automatically avoiding performing a traffic scenario action of a vehicle, comprising: \nproviding a negative decision signal not to perform the traffic scenario action, which negative decision signal has been provided according to the method of claim 5; and\nautomatically avoiding performing the traffic scenario action. \n | 12. A control unit for a vehicle which is configured to perform the steps of claim 1.\n | 13. A vehicle comprising the control unit according to claim 12.\n | 14. The vehicle according to claim 13, wherein the vehicle is a fully autonomous or semiautonomous vehicle.\n | 15. The vehicle according to claim 13, wherein the vehicle is a road vehicle, such as a public road vehicle, for example a truck, a bus and a construction equipment vehicle adapted to be driven on a road.\n | 16. The vehicle according to claim 13, wherein the vehicle is a heavy-duty vehicle which has a minimum weight of at least 5000 kg, such as 30.000 kg.\n | 17. A computer program comprising program code means for performing the steps of claim 1, when said program is run on a computer.\n | 18. 
A computer readable medium carrying a computer program comprising program code means for performing the steps of claim 1, when said program product is run on a computer.", "output": "The method involves receiving information about the surrounding road user (2), which information is indicative of distance to the surrounding road user with respect to the vehicle and the speed and acceleration of the surrounding road user. A value is calculated based on the received information. The positive decision signal is provided to perform the traffic scenario action when the calculated value fulfils a predetermined condition. The value is calculated based on an assumption that the surrounding road user will react to the traffic scenario action by changing its acceleration. INDEPENDENT CLAIMS are included for the following:a vehicle;a computer program for providing a positive decision signal for a vehicle; anda computer readable medium carrying a computer program for providing a positive decision signal for a vehicle. Method for providing a positive decision signal for a vehicle (claimed). The surrounding road user increases or reduces its speed when the vehicle initiates a lane change to the nearby lane, avoiding the risk of collision. Improved and cost-efficient redundancy for the autonomous vehicle is provided. The drawing shows a schematic view of the traffic scenario. 1Heavy-duty truck2Surrounding road user", "instruction": "Please summarize the input" }, { "input": "Method for controlling inter-vehicle gaps between vehicles in a platoonThe invention claims a method for controlling inter-vehicle gaps (24a-24c) between vehicles in a platoon (10), the platoon (10) comprising a lead vehicle (12) and one or more following vehicles (14a-14c).
The method comprises the following steps: obtaining an indication parameter (27) of a potential collision threat (26) identified by the autonomous emergency braking system (16) of the lead vehicle, wherein the autonomous emergency braking system of the lead vehicle comprises a plurality of predefined control phases (28a-28c), and wherein the indication parameter at least partially determines the current control phase of the autonomous emergency braking system, and transmitting the obtained indication parameter to the one or more following vehicles.|1. A method for controlling inter-vehicle gaps (24a-24c) between vehicles in a platoon (10), the platoon (10) comprising a lead vehicle (12) and one or more following vehicles (14a-14c), characterized in that the method comprises the following steps: obtaining an indication parameter (27) of a potential collision threat (26) identified by the autonomous emergency braking system (16) of the lead vehicle, wherein the autonomous emergency braking system of the lead vehicle comprises a plurality of predefined control phases (28a-28c), and wherein the indication parameter at least partially determines the current control phase of the autonomous emergency braking system, and transmitting the obtained indication parameter to the one or more following vehicles.\n | 2. The method according to claim 1, further comprising the following steps: receiving the indication parameter in the one or more following vehicles, and automatically adjusting the inter-vehicle gaps based on the received indication parameter.\n | 3. The method according to claim 1 or 2, wherein the indication parameter is a time to collision.\n | 4.
The method according to claim 2 or 3, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indication parameter comprises: a following vehicle (14c) of the one or more following vehicles subtracting a predetermined time from the received time to collision according to the position of the following vehicle in the platoon so as to generate a reduced time to collision (TTC14C), and the following vehicle adjusting the gap (24c) between the following vehicle and the preceding vehicle (14b) based on the reduced time to collision.\n | 5. The method according to claim 2, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indication parameter starts at the last vehicle (14c) of the platoon so as to increase the gap (24c) between the last vehicle (14c) and the preceding vehicle (14b).\n | 6. The method according to claim 2, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indication parameter is started before a full braking phase (28c) of the autonomous emergency braking system of the lead vehicle.\n | 7. The method according to claim 2, further comprising: indicating to the driver of the last vehicle (14c) how to adjust the gap (24c) between the last vehicle (14c) and the preceding vehicle (14b).\n | 8. The method according to claim 1, wherein the transmission of the indication parameter is performed using a vehicle-to-vehicle communication device (18).\n | 9. The method according to claim 2, wherein the reception of the indication parameter is performed using vehicle-to-vehicle communication devices (32a-32c).\n | 10. The method according to any one of the preceding claims, further comprising: determining a friction-based estimate of the deceleration capacity of the lead vehicle.\n | 11.
The method according to claims 2 and 10, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indication parameter also takes the deceleration capacity into account.\n | 12. A method for controlling inter-vehicle gaps (24a-24c) between vehicles in a platoon (10), the platoon (10) comprising a lead vehicle (12) and one or more following vehicles (14a-14c), characterized in that the method comprises the following steps: receiving, in the one or more following vehicles, an indication parameter (27) of a potential collision threat (26) identified by the autonomous emergency braking system (16) of the lead vehicle, wherein the autonomous emergency braking system of the lead vehicle comprises a plurality of predefined control phases (28a-28c), and wherein the indication parameter at least partially determines the current control phase of the autonomous emergency braking system, and automatically adjusting the inter-vehicle gaps based on the received indication parameter.\n | 13. A computer program comprising program code for performing the steps of the method according to any one of claims 1-12 when said program is run on a computer.\n | 14. A computer readable medium carrying a computer program comprising program code for performing the steps of the method according to any one of claims 1-12 when said program product is run on a computer.\n | 15. A control unit (22, 34a-34c) for controlling inter-vehicle gaps between vehicles in a platoon, the control unit being configured to perform the steps of the method according to any one of claims 1-12.\n | 16.
A vehicle (12; 14a-14c) configured to perform the steps of the method according to any one of claims 1-12.", "output": "The method involves obtaining an indicator (27) of potential collision threat (26) identified by an autonomous emergency braking system (16) of a lead vehicle, where the braking system of the lead vehicle comprises pre-defined control phases, and the indicator determines current control phase of the braking system. The obtained indicator is sent to following vehicles. The indicator is received in the following vehicles. Inter-vehicle gaps (24a-24c) are automatically adjusted based on the received indicator. INDEPENDENT CLAIMS are also included for the following:a computer program comprising a set of instructions for controlling inter-vehicle gaps in a platoona computer readable medium comprising a set of instructions for controlling inter-vehicle gaps in a platoona control unit for controlling inter-vehicle gaps in a platoona vehicle. Method for controlling inter-vehicle gaps between vehicles (claimed), e.g. lorries, buses and passenger cars, in a platoon. The method allows the lead vehicle to remain predictable for the following vehicles even if a slippery or low friction road reduces deceleration capacity and calls for earlier braking, and builds buffer distance to mitigate effects of different braking capacity of the vehicles in the platoon. The method enables exploring possibilities to drive road vehicles in platoons or road trains with small time gaps so as to save fuel and decrease driver workload and road footprint in an effective manner. The drawing shows a schematic view of a platoon. 
10Platoon16Autonomous emergency braking system24a-24cInter-vehicle gaps26Potential collision threat27Indicator", "instruction": "Please summarize the input" }, { "input": "A METHOD FOR FORMING A VEHICLE COMBINATIONThe present disclosure relates to a method for selecting and identifying a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers, the method being implemented by one or more processors of a wireless control system, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic, the method comprising: receiving (S10) a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination; evaluating (S20) the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic; selecting (S30) a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation; locating (S40) the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and communicating (S50) the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to primary vehicle.|1. 
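The position-staggered gap adjustment summarized in the platoon record above (each follower subtracts a predetermined time from the broadcast time to collision according to its position, so adjustment starts at the rearmost vehicle first) can be sketched as follows. All names, the per-position stagger, and the TTC threshold are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the staggered TTC-based gap control: the lead
# vehicle broadcasts a time-to-collision (TTC) indicator, and each
# follower reduces it according to its 1-based platoon position, so the
# last vehicle crosses the threshold (and opens its gap) earliest.

def follower_ttc(broadcast_ttc_s: float, position: int,
                 stagger_s: float = 0.5) -> float:
    """Reduced TTC for the follower at the given 1-based platoon position."""
    return broadcast_ttc_s - position * stagger_s

def should_open_gap(broadcast_ttc_s: float, position: int,
                    ttc_threshold_s: float = 3.0) -> bool:
    """Open the gap to the vehicle ahead once the reduced TTC falls
    below the threshold; rearmost followers trigger first."""
    return follower_ttc(broadcast_ttc_s, position) < ttc_threshold_s
```

With a broadcast TTC of 4.5 s and a 0.5 s stagger, the fourth follower (reduced TTC 2.5 s) starts opening its gap while the first follower (4.0 s) does not yet react, matching the back-to-front ordering described in the summary.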
A method for selecting and identifying a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers, the method being implemented by one or more processors of a wireless control system, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic, the method comprising: \n * - receiving (S10) a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination;\n * - evaluating (S20) the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic;\n * - selecting (S30) a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation;\n * - locating (S40) the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and\n * - communicating (S50) the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to the primary vehicle.\n | 2. Method according to claim 1, further comprising selecting one or more trailers among a group of trailers based at least in part on the mission-characteristic for the vehicle combination and the selected powered dolly vehicle; communicating the location of the selected one or more trailers to the powered dolly vehicle, or to the primary vehicle; and operating the powered dolly vehicle to couple with the one or more trailers.\n | 3. 
Method according to any one of the preceding claims, wherein the mission-characteristic of the vehicle combination comprises any one of an assignment instruction for the vehicle combination, a cargo space-requirement for the vehicle combination, a pick-up location of the cargo, a pick-up time for the cargo, a delivery time for the cargo, a delivery location of the cargo, and data indicating type of cargo.\n | 4. Method according to any one of the preceding claims, wherein the mission-characteristic of the vehicle combination comprises data indicating type of primary vehicle.\n | 5. Method according to any one of the preceding claims, further comprising receiving data relating to environmental conditions.\n | 6. Method according to any one of the preceding claims, wherein the associated operational characteristic comprises data indicating any one of a brake capacity of the powered dolly vehicle, energy storage system capacity of the powered dolly vehicle, and state of charge of the energy storage system of the powered dolly vehicle.\n | 7. Method according to any one of the preceding claims, wherein the associated operational characteristic comprises data indicating type of powered dolly vehicle.\n | 8. Method according any one of the preceding claims, wherein the associated distinguishable identification information comprises an identification component configurable to be updated by the one or more processors of a wireless control system.\n | 9. Method according to any one of the preceding claims, wherein the evaluating comprises determining if at least one operational characteristic of at least one of the powered dolly vehicles fulfils the mission-characteristic, or is at least sufficient for fulfilling the mission-characteristic.\n | 10. 
Method according any one of the preceding claims, wherein the request is received at a remote-control source from the primary vehicle, the remote-control source comprising a transceiver for receiving the request from the autonomous vehicle.\n | 11. Method according to claim 10, wherein the remote-control source comprises a memory configurable to contain and store the associated distinguishable identification information and operational characteristic of each one of the powered dolly vehicles.\n | 12. Method according any one of the claims 10 to 11, comprising receiving the request from the primary vehicle at the remote-control source when the primary vehicle arrives at the geographical area.\n | 13. Method according any one of the preceding claims, wherein the primary vehicle comprises a memory configurable to contain any one of the associated distinguishable identification information and operational characteristic of each one of the powered dolly vehicles.\n | 14. Method according to any one of the preceding claims, comprising obtaining the operational characteristic directly from powered dolly vehicles.\n | 15. Method according to any one of the preceding claims, comprising controlling any one of the primary vehicle and the selected powered dolly vehicle to couple to each other so as to form the vehicle combination.\n | 16. 
A computer program comprising instructions, which when executed by one or more processors of a wireless control system, cause the one or more processors to perform operations comprising: receiving a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination; evaluating the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic; selecting a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation; locating the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and communicating the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to primary vehicle.\n | 17. A non-transitory computer-readable medium comprising instructions, which when executed by one or more processors of a control system, cause the one or more processors to perform operations comprising: receiving a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination; evaluating the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic; selecting a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation; locating the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and communicating the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to primary vehicle.\n | 18. 
A wireless control system for identifying and selecting a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic, the system comprising a memory that stores a set of instructions and one or more processors which use the instructions from the set of instructions to: \n * - receive a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination;\n * - evaluate the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic;\n * - select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation;\n * - locate the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and\n * - communicate the location of the selected powered dolly vehicle to the primary vehicle or operate the powered dolly vehicle to the primary vehicle.\n | 19. Wireless control system according to claim 18, further comprising a communication interface operably coupled to the one or more processors for receiving instructions and for transmitting the location of the selected powered dolly vehicle to the primary vehicle.\n | 20. 
A vehicle for forming a vehicle combination with a powered dolly vehicle and one or more trailers, comprising a memory that stores a mission-characteristic for the vehicle combination and one or more processors which use the mission-characteristic to: \n * - select a powered dolly vehicle among a group of powered dolly vehicles based at least in part on the mission-characteristic for the vehicle combination, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic;\n * - evaluate the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic;\n * - select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation;\n * - locate the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and\n * - communicate the location of the selected powered dolly vehicle to the primary vehicle or operate the powered dolly vehicle to the vehicle.", "output": "The method involves receiving (S10) a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based on a mission-characteristic for the vehicle combination. The operational characteristic of each one of the powered dolly vehicles is evaluated (S20) based on the mission-characteristic. A powered dolly vehicle is selected (S30) among the group of powered dolly vehicles based on the evaluation. The selected powered dolly vehicle is located (S40) in the geographical area based on the identification information. The location of the selected powered dolly vehicle is communicated (S50) to the primary vehicle or the powered dolly vehicle is operated to the primary vehicle. 
INDEPENDENT CLAIMS are included for the following:a computer program for selecting and identifying a powered dolly vehicle;a non-transitory computer-readable medium storing a program for selecting and identifying a powered dolly vehicle;a wireless control system for identifying and selecting a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers; anda vehicle for forming a vehicle combination with a powered dolly vehicle and one or more trailers. Method for selecting and identifying a powered dolly vehicle such as an electric-powered or steerable dolly vehicle. The method enables providing more efficient transportation vehicle systems that are fully, or partially, autonomous, thus increasing the operational capacity of heavy-duty vehicles through vehicle combinations with multiple vehicle units in the form of trailer units. The drawing shows a flowchart illustrating the method for selecting and identifying a powered dolly vehicle. S10Step for receiving a request from the primary vehicle to select a powered dolly vehicleS20Step for evaluating the operational characteristic of each one of the powered dolly vehiclesS30Step for selecting a powered dolly vehicleS40Step for locating the selected powered dolly vehicle in the geographical areaS50Step for communicating the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to the primary vehicle", "instruction": "Please summarize the input" }, { "input": "Arrangement for providing support to a vehicle driver, vehicle and methodAn arrangement (1), a vehicle (2) and a method (100) for providing support to a vehicle driver during manual or semi-autonomous driving in a possible overtaking situation. 
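The dolly-selection flow (S10-S50) summarized above amounts to evaluating each dolly's operational characteristic against the mission characteristic and returning the identity and location of a sufficient candidate. A minimal sketch follows; the `Dolly` fields, thresholds, and function names are assumptions for illustration, not the patent's data model.

```python
# Illustrative sketch of steps S20-S40: evaluate each powered dolly's
# operational characteristic against the mission characteristic, select
# a sufficient candidate, and report its id and location.

from dataclasses import dataclass

@dataclass
class Dolly:
    dolly_id: str
    location: tuple          # (x, y) position in the geographical area
    brake_capacity: float    # assumed operational characteristic
    state_of_charge: float   # 0.0 .. 1.0, assumed characteristic

def select_dolly(dollies, min_brake_capacity, min_soc):
    """Return (id, location) of the first dolly whose characteristics are
    at least sufficient for the mission, or None if no dolly qualifies."""
    for d in dollies:  # S20: evaluate each candidate in the group
        if d.brake_capacity >= min_brake_capacity and d.state_of_charge >= min_soc:
            return d.dolly_id, d.location  # S30/S40: select and locate
    return None  # no dolly fulfils the mission characteristic
```

A caller acting as the wireless control system would then communicate the returned location to the primary vehicle (S50), e.g. `select_dolly(fleet, min_brake_capacity=6.0, min_soc=0.5)`.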
The arrangement (1) comprises a detector (4); a communication unit (5) arranged to receive, from a preceding vehicle (3), probable route information based on at least one of historical route information and navigation system information of the preceding vehicle (3); and a processing unit (6) arranged to determine, for one or more routes ahead of the host vehicle (2), the probability that the preceding vehicle (3) will travel along the one or more routes, to process the received information into a format suitable for display, and to display the processed information, thereby providing support to the vehicle driver.|1. An arrangement (1) for providing support to a driver of a vehicle (2) during manual or semi-autonomous driving in a possible overtaking situation, wherein said arrangement (1) comprises: a detector (4) arranged to detect at least one of a closing velocity between the vehicle (2) and a preceding vehicle (3) and a distance between the vehicle (2) and the preceding vehicle (3); a communication unit (5) arranged to receive, from the preceding vehicle (3), probable route information based on at least one of historical route information and navigation system information of the preceding vehicle (3); a processing unit (6) arranged to receive from the detector (4) information indicating at least one of the closing velocity between the vehicle (2) and the preceding vehicle (3) and the distance between the vehicle (2) and the preceding vehicle (3), to trigger the communication unit (5) to receive the information about the probable route of the preceding vehicle (3) if the detected closing velocity between the vehicle (2) and the preceding vehicle (3) is greater than a closing velocity threshold or if the detected distance between the vehicle (2) and the preceding vehicle (3) is less than a distance threshold, and to determine, for one or more routes ahead of the vehicle (2), the probability that the preceding vehicle (3) will travel along the one or more routes,
the determination being based on at least one of the received historical route information and the received navigation system information, and to process the received information into a format suitable for display; and one or more display units (7) arranged to receive the determined route information from the processing unit (6) and to display the processed information, thereby providing support to the driver of the vehicle.\n | 2. The arrangement (1) according to claim 1, wherein probable routes of two or more preceding vehicles (3) are determined.\n | 3. The arrangement (1) according to claim 1 or 2, wherein the communication unit (5) is arranged to send a request to one or more preceding vehicles (3) having one or more probable routes in order to receive information related to said one or more preceding vehicles (3).\n | 4. The arrangement (1) according to claim 1 or 2, wherein the communication unit (5) is arranged to communicate with other vehicles via vehicle-to-vehicle communication (V2V).\n | 5. The arrangement (1) according to claim 1 or 2, wherein the communication unit (5) is arranged to communicate with other vehicles via vehicle-to-infrastructure-to-vehicle communication (V2I2V).\n | 6. The arrangement (1) according to claim 1 or 2, wherein the one or more display units (7) are arranged to display the processed information only when the distance between the vehicle (2) and the preceding vehicle (3) detected by the detector (4) is less than the distance threshold.\n | 7. The arrangement (1) according to claim 1, wherein the arrangement (1) comprises a positioning system (8) connected to a map database (9) and arranged to continuously determine the position of the vehicle (2).\n | 8. 
The device (1) according to claim 1, wherein the one or more display units (7) are configured to display the likelihood information graphically, a first graphic element (10a) representing the likelihood that the preceding vehicle (3) travels along a first route (11a) and a second graphic element (10b) representing the likelihood that the preceding vehicle (3) travels along a second route (11b).\n | 9. The device (1) according to claim 7 or 8, wherein the positioning system (8) connected to the map database (9) is configured to receive a host-vehicle route, the processing unit (6) is arranged to compare the host-vehicle route with the determined route of the one or more preceding vehicles (3), and the one or more display units (7) are arranged to display information indicating the distance over which two or more routes overlap.\n | 10. The device (1) according to claim 7 or 8, wherein the positioning system (8) connected to the map database (9) is configured to receive the host-vehicle route, the processing unit (6) is arranged to compare the host-vehicle route with the determined route of the one or more preceding vehicles (3), and the one or more display units (7) are arranged to display information indicating the likelihood that the one or more preceding vehicles (3) will travel a selectable distance along the host-vehicle route.\n | 11. A vehicle (2), wherein the vehicle (2) comprises the device (1) according to any one of the preceding claims.\n | 12.
A method (100) for providing support to a vehicle driver during manual or semi-autonomous driving in a potential overtaking situation, wherein the method (100) comprises: detecting (101) at least one of a closing velocity and a distance between the vehicle and the preceding vehicle; receiving (102), at a processing unit from the detector, information indicating at least one of the closing velocity and the distance between the vehicle and the preceding vehicle; if the detected closing velocity between the vehicle and the preceding vehicle is greater than a closing-velocity threshold or if the detected distance is less than a distance threshold, triggering (103), by the processing unit, a communication unit to receive from the preceding vehicle probable route information based on at least one of historical route information and car navigation system information of the preceding vehicle; determining (104), by the processing unit, the probability that the preceding vehicle will travel along one or more routes ahead of the vehicle, said determining being based on at least one of the received historical route information and the received car navigation system information; processing (105) the received information into a format that is suitable for display; and displaying (106) the processed information by one or more display units.", "output": "The arrangement (1) has a detector (4) to detect closing velocity and/or distance between a host vehicle (2) and a preceding vehicle (3). A processing unit (6) triggers a communication unit (5) to receive information on a probable route of preceding vehicle when a detected closing velocity is above threshold velocity or distance is below threshold distance. The probability that preceding vehicle will drive along routes is determined for route ahead of host vehicle. The received information is processed for display (7a).
An INDEPENDENT CLAIM is included for a method for providing vehicle driver support for a driver of a host vehicle during manual or semi-autonomous driving in a potential overtake scenario. Arrangement in host vehicle (claimed) for providing vehicle driver support during manual or semi-autonomous driving. The vehicle driver is supported such that unnecessary overtaking is avoided, since the probability that the preceding vehicle will drive along the one or more route ahead of the host vehicle is determined and arranged to be displayed. The vehicles are enabled to share information between them in an easy, reliable and cost efficient manner. The drawing shows a schematic view of the vehicle and the arrangement in the vehicle for providing vehicle driver support during manual or semi-autonomous driving in the potential overtake scenario. 1Arrangement2Host vehicle3Preceding vehicle4Detector5Communication unit6Processing unit7aDisplay", "instruction": "Please summarize the input" }, { "input": "Apparatus and method for prediction of time available for autonomous driving, in a vehicle having autonomous driving capabilitiesProvided are a method and an apparatus (1) for prediction of time available for autonomous driving, in a vehicle (2) having autonomous driving capabilities and comprising remote sensors (3) arranged to acquire vehicle surrounding information (4) and vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6), as well as a vehicle (2) comprising such an apparatus (1). At least one of a positioning arrangement (7) that provides map data with associated information; a route planning arrangement (8) that enables route planning; and a real time information acquiring arrangement, that acquires at least one of real time traffic information (9a) and real time weather information (9b).
The time available is calculated based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated information, real time traffic information (9a) and real time weather information (9b), for the planned route. The calculated time is output to a human machine interface (11) arranged in a vehicle (2).|1. An apparatus (1) for prediction of time available for autonomous driving, in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * remote sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * the apparatus further comprising: \n * at least one of: a positioning arrangement (7) arranged to provide map data with associated speed limit and road infrastructure information; a route planning arrangement (8); and an arrangement for acquiring real time information (9), including at least one of real time traffic information (9a) and real time weather information (9b), and further\n * a processor (10) arranged to calculate a time available for autonomous driving based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated speed limit and infrastructure information, real time traffic information (9a) and real time weather information (9b),\n * associated with the planned route; and\n * a human machine interface (11) arranged to output to a vehicle (2) passenger compartment (12) the calculated time available for autonomous driving along the planned route characterized in that the processor (10) further is arranged to calculate a hand over time, required for hand over from autonomous driving to manual driving, and to include this calculated hand over time in the calculation of time available for autonomous driving.\n | 2. 
An apparatus (1) according to claim 1, characterized in that the processor (10) is arranged to calculate the time available for autonomous driving based on at least road infrastructure information, real time traffic information (9a) and real time weather information (9b).\n | 3. An apparatus (1) according to any one of claims 1 to 2, characterized in that the processor (10) further is arranged to calculate the time available for autonomous driving based on certified road sections allowed to drive autonomously on.\n | 4. An apparatus (1) according to any one of claims 1 to 3, characterized in that the arrangement for acquiring real time information (9), when present, comprises an interface for communication via one or more portable communication devices of vehicle occupants for acquiring the real time information.\n | 5. An apparatus (1) according to any one of claims 1 to 3, characterized in that the arrangement for acquiring real time information (9), when present, comprises an interface for performing at least one of vehicle-to-vehicle and vehicle-to-infrastructure communication for acquiring the real time information.\n | 6. 
A method for prediction of time available for autonomous driving, in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * remote sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * the method comprising at least one of the steps of: \n * providing map data with associated speed limit and road infrastructure information using a positioning arrangement (7);\n * performing route planning using a route planning arrangement (8); and\n * acquiring real time information, including at least one of real time traffic information (9a) and real time weather information (9b), and the steps of: \n * calculating, using a processor (10), a time available for autonomous driving based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated speed limit and infrastructure information,\n * real time traffic information (9a) and real time weather information (9b), associated with the planned route, and calculating a hand over time, required for hand over from autonomous driving to manual driving, and including this calculated hand over time in the calculation of time available for autonomous driving; and\n * outputting, to a human machine interface (11) arranged in a vehicle (2) passenger compartment (12), the calculated time available for autonomous driving along the planned route.\n | 7. 
An automotive vehicle (2) having autonomous driving capabilities characterized in that it comprises an apparatus (1) for prediction of time available for autonomous driving according to any one of claims 1 to 5.", "output": "The apparatus (1) has a processor (10) arranged to calculate time available for autonomous driving based on a planned route and one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated speed limit and infrastructure information, real time traffic information (9a) and real time weather information (9b) associated with the planned route. A human machine interface (11) is arranged to output calculated time available for autonomous driving along the planned route to a vehicle and passenger compartment (12). Apparatus for prediction of time available for autonomous driving in an automotive vehicle (claimed). The apparatus calculates hand over time in calculation of time available for autonomous driving, thus ensuring that time available for autonomous driving is not less than time required for hand over from autonomous driving to manual driving so as to ensure that a vehicle driver does not suffer stressful and potentially dangerous transition to manual driving. The apparatus ensures that provision of an interface for communication through portable communication devices of vehicle occupants for acquiring real time information enables realization of less complex and cost effective apparatus or alternatively provision of a redundant back-up channel for acquiring real time information. The drawing shows a schematic view of an apparatus for prediction of time available for autonomous driving in a vehicle with autonomous driving capabilities. 
1Apparatus for prediction of time available for autonomous driving in automotive vehicle4Vehicle surrounding information6Vehicle dynamics parameters9aReal time traffic information9bReal time weather information10Processor11Human machine interface12Vehicle and passenger compartment", "instruction": "Please summarize the input" }, { "input": "DEVICE AND METHOD FOR SAFETY STOPPAGE OF AN AUTONOMOUS ROAD VEHICLEDevice and method for safety stoppage of an autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode thereof. Processing means (7) continuously: predict where a drivable space (8) exists; calculate and store a safe trajectory (10) to a stop within the drivable space (8); determine a current traffic situation; determine any disturbances in sensor data, vehicle systems or components enabling the autonomous drive mode. If an incapacitating disturbance is determined, a request for a driver to take over control is signaled and determined if a driver has assumed control within a pre-determined time. If not, the vehicle (2) is controlled to follow the most recent safe trajectory (10) to a stop in a safe stoppage maneuver during which, or after stopping, one or more risk mitigation actions adapted to the determined current traffic situation are performed.|1. 
A safety stoppage device (1) of an autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the autonomous road vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode of the autonomous road vehicle (2) by an autonomous drive control unit (6) thereof,\ncharacterized in that it comprises processing means (7) arranged to continuously: \n * predict where a drivable space (8) exists, based on data from the sensors (4);\n * calculate and store to memory means (9) of the autonomous drive control unit (6) a safe trajectory (10) to a stop within the drivable space (8);\n * determine from at least the localization system (3) and the sensors (4) a current traffic situation;\n * determine any disturbances in sensor data, vehicle systems or components enabling the autonomous drive mode of the autonomous road vehicle (2); and\n * if a disturbance is determined, such that the autonomous drive mode is incapacitated, signal to a driver environment of the autonomous road vehicle (2) a request for a driver to take over control of the autonomous road vehicle (2), determine if control of the autonomous road vehicle (2) has been assumed by a driver thereof within a pre-determined time, and, upon a negative determination, control the autonomous vehicle (2) by the autonomous drive control unit (6) to follow the most recently calculated safe trajectory (10) to a stop within the drivable space (8) in a safe stoppage maneuver, wherein, during performance of such a safe stoppage maneuver or after the autonomous road vehicle (2) has stopped, the safety stoppage device (1) further is arranged to perform one or more risk mitigation actions adapted to the determined current traffic situation.\n | 2.
The safety stoppage device (1) according to claim 1, characterized in that the processing means (7) further are arranged to continuously estimate a risk associated with performing the safe stoppage maneuver in the determined current traffic situation and to adapt the one or more risk mitigation actions to the estimated risk.\n | 3. The safety stoppage device (1) according to claim 2, characterized in that the processing means (7) further are arranged to adapt at least one of timing and intensity of the one or more risk mitigation actions to the estimated risk.\n | 4. The safety stoppage device (1) according to any one of claims 1 to 3, characterized in that the processing means (7) further are arranged to signal the request to take over control of the autonomous road vehicle (2) to a driver environment of the autonomous road vehicle (2) using means (11, 12, 13) for visual, audible or haptic communication, or any combination thereof.\n | 5. The safety stoppage device (1) according to any one of claims 1 to 4, characterized in that the one or more risk mitigation actions comprise at least one of: increasing the magnitude of the request for a driver to take over control of the autonomous road vehicle (2); activating hazard lights (14) of the autonomous road vehicle (2); activating a horn (15) of the autonomous road vehicle (2); warning or informing other traffic participants through vehicle-to-vehicle communication (16); notifying a traffic control center (17) that a safe stoppage maneuver is in progress or completed; warning trailing vehicles (18) by blinking tail or brake lights (19) of the autonomous road vehicle (2).\n | 6. The safety stoppage device (1) according to any one of claims 1 to 5, characterized in that the safety stoppage device (1) further is arranged to activate the one or more risk mitigation actions a predetermined time period after the autonomous road vehicle (2) has come to a stop.\n | 7.
The safety stoppage device (1) according to any one of claims 1 to 5, characterized in that the safety stoppage device (1) further is arranged to activate the one or more risk mitigation actions during performance of the safe stoppage maneuver.\n | 8. The safety stoppage device (1) according to any one of claims 1 to 5, characterized in that the safety stoppage device (1) further is arranged to activate the one or more risk mitigation actions after the autonomous vehicle (2) has stopped.\n | 9. The safety stoppage device (1) according to any one of claims 1 to 8, characterized in that it further comprises driver monitoring means (20) for determining a physical state of a driver of the autonomous road vehicle (2) and that the safety stoppage device (1) further is arranged to adapt the one or more risk mitigation actions to the monitored physical state of a driver of the autonomous road vehicle (2).\n | 10. The safety stoppage device (1) according to claim 9, characterized in that the safety stoppage device (1) further is arranged to adapt the one or more risk mitigation actions to be performed earlier when the monitored physical state of a driver of the autonomous road vehicle (2) indicates an incapacitated driver.\n | 11. The safety stoppage device (1) according to any one of claims 9 to 10, characterized in that it further is arranged to monitor and store to the memory means (9) data related to safe stoppage maneuver incidents where a monitored physical state of a driver of the autonomous road vehicle (2) indicates these safe stoppage maneuver incidents to be caused by a reckless driver and to deactivate the autonomous drive mode of the autonomous road vehicle (2) after a predetermined number (n) of such incidents.\n | 12. 
The safety stoppage device (1) according to any one of claims 1 to 11, characterized in that it further comprises communication means (21) for communicating with a traffic control center (17), such that the traffic control center (17) is allowed to monitor the position of the autonomous road vehicle (2) and trigger the safety stoppage device (1) to perform the one or more risk mitigation actions when the monitored position of the autonomous road vehicle (2) indicates that it is stationary in a potentially unsafe position.\n | 13. A method for safety stoppage of an autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the autonomous road vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode of the autonomous road vehicle (2) by an autonomous drive control unit (6) thereof,\ncharacterized in that it comprises using processing means (7) for continuously: \n * predicting where a drivable space (8) exists, based on data from the sensors (4);\n * calculating and storing to memory means (9) of the autonomous drive control unit (6) a safe trajectory (10) to a stop within the drivable space (8);\n * determining from at least the localization system (3) and the sensors (4) a current traffic situation;\n * determining any disturbances in sensor data, vehicle systems or components enabling the autonomous drive mode of the autonomous road vehicle (2); and\n * if a disturbance is determined, such that the autonomous drive mode is incapacitated, signaling to a driver environment of the autonomous road vehicle (2) a request for a driver to take over control of the autonomous road vehicle (2), determining if control of the autonomous road vehicle (2) has been assumed by a driver thereof within a pre-determined time, and, upon a negative determination, controlling the autonomous vehicle (2) by the autonomous drive control unit (6) to follow the most recently
calculated safe trajectory (10) to a stop within the drivable space (8) in a safe stoppage maneuver,\n * and, during performance of such a safe stoppage maneuver or after the autonomous road vehicle (2) has stopped, performing one or more risk mitigation actions adapted to the determined current traffic situation.\n | 14. An autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the autonomous road vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode of the autonomous road vehicle (2) by an autonomous drive control unit (6) of the autonomous road vehicle (2), characterized in that it comprises a safety stoppage device (1) according to any one of claims 1 to 12.", "output": "The device (1) has a processing unit (7) for determining whether control of an autonomous road vehicle (2) has been assumed by a driver within pre-determined time and following calculated safe trajectory to a stop within drivable space in a safe stoppage maneuver based on negative determination to control the autonomous road vehicle by an autonomous drive control unit (6), where the device performs risk mitigation actions adapted to determined current traffic situation during performance of the safe stoppage maneuver or after stopping the autonomous road vehicle. An INDEPENDENT CLAIM is also included for a method for facilitating safety stoppage of an autonomous road vehicle. Safety stoppage device for an autonomous road vehicle. The device continuously estimates risk associated with performing the safe stoppage maneuver in the determined current traffic situation, and adapts the risk mitigation actions to the estimated risk for providing reduced risk of accident when the autonomous vehicle is to be stopped, and the driver is not capable of taking control over the vehicle. The drawing shows a schematic view of an autonomous road vehicle comprising a safety stoppage device. 
1Safety stoppage device2Autonomous road vehicle4Sensors6Autonomous drive control unit7Processing unit", "instruction": "Please summarize the input" }, { "input": "Apparatus and method for continuously establishing a boundary for autonomous driving availability and an automotive vehicle comprising such an apparatusProvided are a method and an apparatus (1) for continuously establishing a boundary for autonomous driving availability, in a vehicle (2) having autonomous driving capabilities and comprising remote sensors (3) for acquiring vehicle surrounding information (4) and vehicle dynamics sensors (5) for determining vehicle dynamics parameters (6), as well as a vehicle (2) comprising such an apparatus (1). At least one of a positioning arrangement (7) that provides map data with associated information; a route planning arrangement (8) that enables route planning; a vehicle driver monitoring arrangement (9) that provides driver monitoring information (10); and a real time information acquiring arrangement, that acquires at least one of traffic information (11 a) and weather information (11 b). The boundary is calculated based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), driver monitoring information (10), map data, traffic information (11a) and weather information (11b), for the planned route. Changes in the calculated boundary are output to a human machine interface (13) arranged in the vehicle (2).|1. 
An apparatus (1) for continuously establishing a boundary for autonomous driving availability in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * further comprising: \n * at least one of: a positioning arrangement (7) arranged to provide map data with associated speed limit and road infrastructure information; a route planning arrangement (8); a vehicle driver monitoring arrangement (9) arranged to provide vehicle driver monitoring information (10); and an arrangement for acquiring real time information (11), including at least one of real time traffic information (11a) and real time weather information (11b), and\n * a processor (12) arranged to continuously calculate a boundary for autonomous driving availability based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), vehicle driver monitoring information (10), map data with associated speed limit and infrastructure information, real time traffic information (11a) and real time weather information (11b), associated with the planned route, and\n * characterized in that it further comprises: \n * a human machine interface (13) arranged to output to a vehicle (2) passenger compartment (14) information on any changes in the calculated boundary for autonomous driving availability along the planned route,\n * wherein the human machine interface (13) is arranged to present the information graphically to a display indicating a distance to the calculated boundary for autonomous driving availability.\n | 2. 
An apparatus (1) according to claim 1, characterized in that it further comprises an interface for communicating the information on any changes in the calculated boundary for autonomous driving availability along the planned route to an autonomous drive control unit of the vehicle.\n | 3. An apparatus (1) according to any one of claims 1 to 2, characterized in that the human machine interface (13) further is arranged to output to the vehicle (2) passenger compartment (14) information relating to changes in automation level available with the current calculated boundary for autonomous driving availability.\n | 4. An apparatus (1) according to any one of claims 1 to 3, characterized in that the arrangement for acquiring real time information (11), when present, comprises an interface for communication via one or more portable communication devices of vehicle (2) occupants for acquiring the real time information.\n | 5. An apparatus (1) according to any one of claims 1 to 4, characterized in that the arrangement for acquiring real time information (11), when present, comprises an interface for performing at least one of vehicle-to-vehicle and vehicle-to-infrastructure communication for acquiring the real time information.\n | 6. An apparatus (1) according to any one of claims 1 to 5, characterized in that it further comprises an interface for communicating the information on any changes in the calculated boundary for autonomous driving availability along the planned route externally of the vehicle (2) using at least one or more portable communication devices of vehicle (2) occupants, vehicle-to-vehicle communication and vehicle-to-infrastructure communication.\n | 7. 
A method for continuously establishing a boundary for autonomous driving availability in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * the method comprising at least one of the steps of: \n * providing map data with associated speed limit and road infrastructure information using a positioning arrangement (7);\n * performing route planning using a route planning arrangement (8);\n * monitoring a vehicle driver and providing vehicle driver monitoring information (10) using a vehicle driver monitoring arrangement (9); and\n * acquiring real time information, including at least one of real time traffic information (11a) and real time weather information (11b), and the step of:\n continuously calculating, using a processor (12), a boundary for autonomous driving availability based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), vehicle driver monitoring information (10), map data with associated speed limit and infrastructure information, real time traffic information (11a) and real time weather information (11b), associated with the planned route, \n * characterized in that the method further comprises:\n outputting, to a human machine interface (13) arranged in a vehicle (2) passenger compartment (14), information on any changes in the calculated boundary for autonomous driving availability along the planned route, \n * wherein the human machine interface (13) is arranged to present the information graphically to a display indicating a distance to the calculated boundary for autonomous driving availability.\n | 8.
An automotive vehicle (2) having autonomous driving capabilities characterized in that it comprises an apparatus (1) for continuously establishing a boundary for autonomous driving availability according to any one of claims 1 to 6.", "output": "The apparatus (1) has a processor (12) continuously calculating a boundary for autonomous driving availability based on a planned route and one of vehicle surrounding information (4), vehicle dynamics parameters (6), vehicle driver monitoring information (10), map data with associated speed limit and infrastructure information, real time traffic information (11a), and real time weather information (11b) associated with the planned route. A human-machine interface (13) outputs information on any changes in the calculated boundary along the planned route to the passenger compartment (14). An INDEPENDENT CLAIM is also included for a method for continuously establishing a boundary for autonomous driving availability in a vehicle having autonomous driving capabilities. Apparatus for continuously establishing boundary for autonomous driving availability in vehicle having autonomous driving capabilities. The provision of a continuously calculated boundary for autonomous driving availability and a human machine interface arranged to output to a vehicle passenger compartment information on any changes in the calculated boundary for autonomous driving availability along the planned route promotes the driver's trust in the autonomous driving capabilities of the vehicle as well as increases the driver's readiness to assume manual control of the vehicle if so required. The provision of a request for hand over from autonomous driving to manual driving provides sufficient time for safe hand over from autonomous driving to manual driving, thus ensuring that the vehicle driver does not suffer a stressful and potentially dangerous transition to manual driving. 
The provision of communicating the information on any changes in the calculated boundary for autonomous driving availability along the planned route to an autonomous drive control unit of the vehicle enables vehicle systems to perform adaptations in dependence upon the available degree of automation indicated by the calculated boundary for autonomous driving availability. The provision of a human machine interface arranged to output to the vehicle passenger compartment information relating to changes in automation level available with the current calculated boundary for autonomous driving availability enables a driver of the vehicle to become aware of why adaptations in the autonomous drive, e.g. towards a higher or lower degree of automation, are made and also enables the driver to continuously monitor the autonomous drive while retaining a feeling of control. The provision of outputting the information to the vehicle passenger compartment through one of a graphical, an audio or a tactile output arrangement provides options for ensuring that the information reaches the vehicle driver, irrespective of his/her current focus. The provision of an interface for communication via one or more portable communication devices of vehicle occupants for acquiring the real time information enables either the realization of a less complex and more cost effective apparatus or alternatively the provision of a redundant back-up channel for acquiring the real time information. The provision of an interface for performing one of vehicle-to-vehicle and vehicle-to-infrastructure communication for acquiring the real time information enables the realization of an effective apparatus for acquiring real time information which is highly relevant for the current surroundings. The drawing shows the schematic diagram of an apparatus for continuously establishing a boundary for autonomous driving availability, in a vehicle having autonomous driving capabilities.
1 Apparatus; 4 Vehicle surrounding information; 6 Vehicle dynamics parameters; 10 Vehicle driver monitoring information; 11a Real time traffic information; 11b Real time weather information; 12 Processor; 13 Human-machine interface; 14 Passenger compartment", "instruction": "Please summarize the input" }, { "input": "CONCEPT OF COORDINATING AN EMERGENCY BRAKING OF A PLATOON OF COMMUNICATIVELY COUPLED VEHICLESThe present invention provides a concept of coordinating emergency braking of a platoon (100) of communicatively connected vehicles (110). In response to the emergency situation (120), individual braking control settings are centrally determined for one or more vehicles (110) of the platoon (100) by the management entity (110-3) managing the platoon (100) (230). The individual braking control settings are communicated from the management entity (110-3) to the one or more vehicles (110) of the platoon (100). One or more vehicles (110) of the platoon (100) brake (250) according to each individual brake control setting received from the management entity (110-3).|1. A method (200) for adjusting emergency braking of a platoon (100) of communicatively connected vehicles (110), comprising: detecting (210) an emergency (120) by a vehicle (110-1) of the platoon;\nbroadcasting (220) an emergency message from the vehicle (110-1) to other vehicles (110-2, 110-3) of the platoon in response to the detection of the emergency situation (120); \nin response to receiving the emergency message from the vehicle (110-1), forming a braking pressure of the other vehicles (110-2, 110-3) of the platoon; \ndetermining (230), after generating the braking pressure in response to the emergency situation (120), individual braking control settings for one or more vehicles (110) of the platoon (100) by the management entity (110-3) managing the platoon (100); \ntransmitting (240) the individual braking control settings from the management entity (110-3) to one or more vehicles (110) of the platoon (100); \nand braking (250) one or more vehicles (110) of the platoon (100) according to respective individual braking control settings received from the management entity (110-3). \n | 2. The method (200) of claim 1, wherein the individual braking control settings indicate respective braking forces to be applied.\n | 3. The method (200) of claim 1 or 2, further comprising notifying the management body (110-3) about at least one of individual characteristics and current conditions of each vehicle of the platoon (100).\n | 4. A method (200) according to claim 3, wherein determining the individual braking control settings of the vehicle (110) is based on at least one of an individual characteristic and a current condition of each vehicle.\n | 5. The method (200) according to claim 3, wherein at least one of the individual characteristics and the current state comprises at least one of weight, braking force, inter-vehicle distance, speed, and tire condition.\n | 6. Method (200) according to claim 1 or 2, wherein the plurality of vehicles (110) of the platoon (100) are each an autonomous vehicle or an at least partially autonomous vehicle.\n | 7. A method (200) according to claim 1 or 2, wherein the plurality of vehicles (110) in the platoon communicate via a vehicle-to-vehicle communication system.\n | 8. The method (200) of claim 1 or 2, wherein the management body (110-3) is a vehicle of the platoon (100) acting as a master vehicle with respect to other vehicles of the platoon acting as slave vehicles.\n | 9. 
In a platoon (100) of a plurality of communicatively connected vehicles (110), a first vehicle (110-1) configured to detect an emergency situation and broadcast an emergency message to other vehicles in the platoon in response to the detected emergency situation, wherein in response to receiving the emergency message from the first vehicle (110-1), a braking pressure of another vehicle in the platoon is established;\na second vehicle (110-3) configured to determine a respective brake control parameter for each vehicle of the platoon in response to the emergency message and send the respective individual brake control parameter to each vehicle of the platoon; \na third vehicle (110-2) configured to adjust its braking setting according to its respective individual braking control parameter, wherein, prior to reception of the individual braking control parameter from the second vehicle (110-3), a braking pressure is established in the second vehicle (110-3) and the third vehicle (110-2) in response to receiving the emergency message from the first vehicle (110-1).", "output": "The method (200) involves determining (230) individual braking control settings for one or more vehicles of the platoon by a managing entity managing the platoon in response to an emergency situation. The individual braking control settings from the managing entity are communicated (240) to one or more vehicles of the platoon. One or more vehicles of the platoon are braked (250) in accordance with the respective individual braking control settings received from the managing entity. INDEPENDENT CLAIMS are included for the following: a system of several communicatively coupled vehicles; and a vehicle. Method for coordinating emergency braking of platoon of communicatively coupled vehicle (claimed). 
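The central determination of individual braking control settings can be sketched as below. The claims name weight, braking force, inter-vehicle distance, speed and tire condition as possible inputs; this sketch uses only the reported weight and a uniform-deceleration rule, which is an illustrative assumption rather than the claimed algorithm.

```python
# Hypothetical sketch: the managing entity derives a per-vehicle braking force
# from each vehicle's reported mass and a common target deceleration, so the
# platoon decelerates uniformly and inter-vehicle distances stay constant.

def braking_settings(vehicles, target_decel=6.0):
    """vehicles: list of dicts with 'id' and 'mass_kg' (reported to the
    managing entity, as in claim 3). target_decel: m/s^2, common to all.

    Returns {vehicle_id: braking_force_newton}, using F = m * a.
    """
    return {v["id"]: v["mass_kg"] * target_decel for v in vehicles}

platoon = [
    {"id": "110-1", "mass_kg": 7000},   # vehicle that detected the emergency
    {"id": "110-2", "mass_kg": 12000},
    {"id": "110-3", "mass_kg": 9000},   # managing (master) vehicle
]
settings = braking_settings(platoon)
```

Uniform deceleration is the simplest gap-preserving policy; a fuller model would also cap each force by the vehicle's available braking force and tire condition.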
The managing entity can be kept up to date with respect to current vehicle parameters, leading to more accurate individual braking control settings. The accurate prediction and coordination of the individual emergency braking maneuvers can be allowed. The drawing shows a flowchart illustrating the process for coordinating emergency braking of platoon of communicatively coupled vehicle. 200 Method for coordinating emergency braking of platoon of communicatively coupled vehicle; 210 Step for detecting emergency situation by vehicle of platoon; 230 Step for determining individual braking control settings for one or more vehicles of platoon; 240 Step for communicating individual braking control settings from managing entity to one or more vehicles of platoon; 250 Step for braking one or more vehicles of platoon", "instruction": "Please summarize the input" }, { "input": "Method for providing travel route presettingThe invention relates to a method for providing travel route presetting (100) for a travel route system of a vehicle (300), the method comprises the following steps: providing a plurality of detected tracks (101) of other vehicles (320) in the route section (150) to be travelled, determining track preset (102) from the detected tracks (101), determining a deviation area (110) according to the detected tracks (101), wherein the deviation area (110) is determined according to the deviation between at least each detected track (101) and the track preset (102), and the travel route preset (100) is determined at least according to the track preset (102) and the deviation area (110). 
In addition, the present invention relates to a travel route system for a vehicle (300), comprising: a receiving module (301) for receiving the track (101) detected in the route section (150) to be travelled, and a calculating unit (302), to which the receiving module (301) sends the detected track (101), the computing unit being adapted to determine a trajectory preset (102) from the detected trajectory (101) and to determine a deviation area (110) based on at least the deviation of each detected trajectory (101) from the trajectory preset (102), and to determine a travel route preset (100) based at least on the trajectory preset (102) and the deviation area (110).|1. A method for providing a travel route presetting (100) for a travel route system of a vehicle (300), the method comprising the steps of: providing a plurality of detected tracks (101) of other vehicles (320) in the route section (150) to be travelled, determining track presets (102) from the detected tracks (101), determining a deviation area (110) according to the detected trajectory (101), wherein the deviation area (110) is determined according to the deviation between at least each detected trajectory (101) and the trajectory preset (102), wherein the deviation area (110) is set such that the deviation area (110) surrounds the track presetting (102), and thus forms the following area in the route section (150) to be travelled: when the vehicle (300) should move from the first route point (30) to the second route point (40), the vehicle (300) can preferentially stay in the area, wherein the detected track (101) and the track preset (102) extend between the first route point (30) and the second route point (40), determining the travel route preset (100) at least according to the track preset (102) and the deviation area (110).\n | 2. 
The method according to claim 1, wherein the detected trajectory (101) is transmitted by moving data and/or by vehicle-to-vehicle communication and/or between the vehicle (300) and/or infrastructure (400).\n | 3. The method according to claim 1 or 2, wherein the track presetting (102) is the average track of the detected track (101), wherein the deviation area is calculated by the standard deviation of the detected track (101).\n | 4. The method according to claim 1 or 2, wherein the following steps are further executed: detecting the sensor information about the environment of the vehicle (300) in the driving route, determining the track preset (103) based on the sensor according to the sensor information, and adapting the travel route presetting (100) according to the sensor-based trajectory presetting (103) and the deviation area (110).\n | 5. The method according to claim 1 or 2, wherein the weighting factor is considered when determining the track presetting (102) and/or the deviation area (110). wherein the weighting factor takes into account the time between determining the detected trajectory (101) and the time at which the vehicle (300) plans to travel through the route section (150), and/or the type of the other vehicle (320) of the detected trajectory (101).\n | 6. The method according to claim 1 or 2, wherein determining the interval of the travel route presetting (100) depends on the travel speed of the vehicle (300).\n | 7. The method according to claim 1 or 2, wherein the vehicle (300) is pre-set (100) by the at least partially autonomous vehicle controller using the travel route so as to guide the vehicle (300) in the transverse and/or longitudinal direction.\n | 8. The method according to claim 3, wherein the mean trajectory is determined by averaging through an arithmetic mean value of the detected trajectory (101).\n | 9. 
The method according to claim 4, wherein the travel route presetting (100) is adapted when the sensor information indicates an obstacle (50) in a route section (150) to be travelled by the vehicle (300).\n | 10. A travel route system for a vehicle (300), the travel route system comprising: a receiving module (301) for receiving the track (101) detected in the route section (150) to be travelled, and a calculating unit (302), to which the receiving module (301) sends the detected track (101), the calculation unit being adapted to determine a trajectory preset (102) from the detected trajectory (101) and determine a deviation area (110) based on at least a deviation of each detected trajectory (101) from the trajectory preset (102), and to determine the driving route preset (100) at least according to the track preset (102) and the deviation area (110), wherein the deviation area (110) is set such that the deviation area (110) surrounds the track preset (102), and thereby forms the following areas in the route section (150) to be travelled: when the vehicle (300) should move from the first route point (30) to the second route point (40), the vehicle (300) can preferentially stay in the area, wherein the detected trajectory (101) and the trajectory presetting (102) extend between the first route point (30) and the second route point (40).\n | 11. The travel route system according to claim 10, wherein the sensor (303) suitable for detecting the environment around the vehicle (300) is connected with the calculating unit (302) so as to send the sensor information to the calculating unit (302), wherein the calculating unit (302) is adapted to determine a sensor-based trajectory presetting (103) based on the sensor information so as to additionally determine a travel route presetting (100) based on the sensor-based trajectory presetting (103).\n | 12. 
The travel route system according to claim 10 or 11, wherein the travel route system is adapted to perform the method for providing travel route presetting (100) according to any one of claims 1 to 8.", "output": "The method involves providing detected trajectories of vehicles in a path section to be traveled. A trajectory specification is determined from the detected trajectories. A deviation zone is determined from the detected trajectories. The deviation zone is determined on the basis of deviation of individual detected trajectories from the trajectory specification. The route specification is determined based on the trajectory specification and the deviation zone. An INDEPENDENT CLAIM is included for route system of vehicle. Method for providing route specification for route system (claimed) of vehicle e.g. car. The significance and the benefit of the trajectory specification are improved. The provision of a particularly safe and precise route specification is enabled. The quality of the route specification and the safety are increased. The unnecessary arithmetic operations are avoided in sections in which few changes are required. The unnecessary adaptation of the route is avoided. The drawing shows a schematic view of the route section to be traveled with different trajectories. 10, 20 First and second axes; 30 First waypoint", "instruction": "Please summarize the input" }, { "input": "Method for the autonomous or partly autonomous execution of a cooperative driving maneuverA method for autonomously or semi-autonomously carrying out a cooperative driving maneuver and a vehicle. 
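The trajectory averaging of the route-presetting method above (its claims describe the track preset as the arithmetic mean of the detected tracks, with the deviation area derived from their standard deviation) can be sketched in plain Python. Representing each trajectory as aligned lateral offsets at common waypoints is an assumption made for illustration.

```python
# Hypothetical sketch: compute the track preset (102) as the per-waypoint
# arithmetic mean of the detected tracks (101), and the deviation area (110)
# as the per-waypoint standard deviation around that mean.

def trajectory_preset(tracks):
    """tracks: list of trajectories, each a list of lateral offsets (m)
    sampled at the same waypoints between the first and second route point.

    Returns (mean_track, std_track): the track preset and the per-waypoint
    standard deviation spanning the deviation area around it.
    """
    n = len(tracks)
    mean_track, std_track = [], []
    for samples in zip(*tracks):             # one tuple per waypoint
        m = sum(samples) / n
        var = sum((s - m) ** 2 for s in samples) / n
        mean_track.append(m)
        std_track.append(var ** 0.5)
    return mean_track, std_track
```

The deviation area at each waypoint would then be the band mean plus/minus a multiple of the standard deviation, within which the vehicle preferentially stays.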
Provision is made for a maneuvering vehicle which plans the execution of a driving maneuver to determine a maneuvering area of a road in which the driving maneuver is potentially executed, to communicate with one or more vehicles via vehicle-to-vehicle communication to detect one or more cooperation vehicles which will presumably be inside the maneuvering area during the execution of the driving maneuver, and to adapt its own driving behavior to the presumable driving behavior of the one or more cooperation vehicles to execute the planned driving maneuver. The disclosure provides a possibility which, by vehicle-to-vehicle communication, allows vehicles for jointly carrying out a cooperative driving maneuver to be identified and then allows the cooperative driving maneuver to be executed.The invention claimed is: \n | 1. A method for autonomously or semi-autonomously carrying out a cooperative driving maneuver, wherein a maneuvering vehicle plans execution of a driving maneuver, wherein during the method the maneuvering vehicle executes operations for planning and execution of a single driving maneuver comprising: \ndetermining a maneuvering area of a road in which the driving maneuver is potentially executed; \ncommunicating with other vehicles via vehicle-to-vehicle communication during the planning of the execution of the cooperative driving maneuver; \nfiltering communications received by the maneuvering vehicle via vehicle-to-vehicle communication from the vehicles, wherein the filtering is performed to determine which vehicles are relevant to carrying out the maneuvering vehicle's planned driving maneuver to detect cooperation vehicles which are presumed to be inside the maneuvering area during the execution of the driving maneuver; \ndetermining message formats of the messages received by the maneuvering vehicle via vehicle-to-vehicle communication; \ndetermining potential cooperation vehicles based on the message formats transmitted by the vehicles, and 
\nadapting the maneuvering vehicle's own driving behavior to presumable driving behavior of the one or more cooperation vehicles of the potential cooperation vehicles to execute the planned driving maneuver, \nwherein, in response to a determination that transmitted message formats are Environmental Perception Messages from the potential cooperation vehicles, wherein the Environmental Perception Messages contain information about free areas between the potential cooperation vehicles from the potential cooperation vehicles received by the maneuvering vehicle, the potential cooperation vehicles are determined to be one or more cooperation vehicles. \n | 2. The method of claim 1, wherein the maneuvering vehicle determines an approach period in which the maneuvering vehicle is presumed to reach the maneuvering area.\n | 3. The method of claim 1, wherein the maneuvering vehicle executes at least one of the following operations to detect the cooperation vehicles: \ndetermining approach areas of the road, from which the maneuvering area is theoretically reached within the approach period; \ndetermining potential cooperation vehicles which are in the approach areas based on the data received via vehicle-to-vehicle communication, predicting the driving behavior of each potential cooperation vehicle based on the data received via vehicle-to-vehicle communication; \npredicting its own driving behavior; and \ncomparing the predicted driving behavior of the one or more potential cooperation vehicles with the maneuvering vehicle's own predicted driving behavior to determine the plurality of cooperation vehicles. \n | 4. 
The method of claim 1, wherein the maneuvering vehicle continuously detects and evaluates the development of free areas between vehicles and, for this purpose, executes the following operations: \ndetermining the minimum size of a free area for carrying out the driving maneuver, the vehicle dimensions of the maneuvering vehicle and/or a safety distance; \ncomparing the size of newly detected free areas with the determined minimum size; and \nselecting a suitable free area presumed to be inside the maneuvering area during the planned execution of the maneuver and to have at least the minimum size for executing the driving maneuver. \n | 5. The method of claim 4, wherein the adaptation of the driving behavior of the maneuvering vehicle to the presumable driving behavior of the cooperation vehicles comprises adapting the trajectory of the maneuvering vehicle to reach the selected free area in response to the selected free area being inside the maneuvering area, wherein executability being cyclically checked as the free area is approached.\n | 6. The method of claim 1, further comprising: \ndetermining a maneuvering area of a road in which a driving maneuver of a vehicle is expected; \ncommunicating between the cooperation vehicle and one or more vehicles via vehicle-to-vehicle communication to detect a maneuvering vehicle which plans the execution of a driving maneuver and is presumed to be inside the maneuvering area during the execution of the driving maneuver; and \nadapting the driving behavior of the cooperation vehicle to the presumable driving behavior of the maneuvering vehicle to assist with the planned driving maneuver of the maneuvering vehicle. \n | 7. The method of claim 6, wherein the cooperation vehicle determines the maneuvering area and/or an approach period in which it will presumably reach the maneuvering area.\n | 8. 
The method of claim 6, wherein the cooperation vehicle executes at least one of the following operations to detect the maneuvering vehicle: \ndetermining the message formats of the messages received via vehicle-to-vehicle communication; \ndetermining one or more potential maneuvering vehicles based on the message format transmitted by these vehicles; \ndetermining approach areas of the road, from which the maneuvering area is theoretically reached within the approach period; \ndetermining one or more potential maneuvering vehicles which are in the approach areas by the data received via vehicle-to-vehicle communication; \npredicting the driving behavior of each potential maneuvering vehicle by the data received via vehicle-to-vehicle communication; \npredicting its own driving behavior; and \ncomparing the predicted driving behavior of the one or more potential maneuvering vehicles with the maneuvering vehicle's own predicted driving behavior to determine one or more maneuvering vehicles. \n | 9. The method of claim 6, wherein the adaptation of the driving behavior of the cooperation vehicle to the presumable driving behavior of the maneuvering vehicle comprises the following operations: \nadapting the trajectory of the cooperation vehicle to enlarge a free area in which the driving maneuver of the maneuvering vehicle is executed inside the maneuvering area. \n | 10. 
A transportation vehicle comprising: \na communication device for communicating with other transportation vehicles by vehicle-to-vehicle communication; and \nthe vehicle being configured to, operate as a maneuvering vehicle and/or as a cooperation vehicle, autonomously or semi-autonomously carrying out a cooperative driving maneuver, wherein in operating as a maneuvering vehicle, which plans the execution of a driving maneuver, the vehicle executes the following operations: \ndetermining a maneuvering area of a road in which the driving maneuver is potentially executed; \ncommunicating with vehicles via vehicle-to-vehicle communication with each of a plurality of other vehicles that are presumed to be inside the maneuvering area of the road during execution of the driving maneuver, filtering the communications received via vehicle-to-vehicle communication from the other vehicles according to vehicles which are relevant to carrying out the planned driving maneuver to detect cooperation vehicles; \ndetermining message formats of the messages received via vehicle-to-vehicle communication from these vehicles; \ndetermining potential cooperation vehicles based on the message formats transmitted by these vehicles; and \nadapting the maneuvering vehicle's own driving behavior to the presumable driving behavior of the cooperation vehicles to execute the planned driving maneuver, \nwherein in response to a determination that transmitted message formats are Environmental Perception Messages from the potential cooperation vehicles containing information about free areas between the potential cooperation vehicles, the potential cooperation vehicles are determined to be cooperation vehicles for the maneuvering vehicle. \n | 11. The method of claim 1, wherein information of each Environmental Perception Message is determined by sensors in each potential cooperation vehicle.\n | 12. 
The method of claim 11, wherein the sensors comprise radar sensors and the Environmental Perception Messages are transmitted several times a second.\n | 13. The vehicle of claim 10, wherein information of each Environmental Perception Message is determined by sensors in each potential cooperation vehicle.\n | 14. The vehicle of claim 13, wherein the sensors comprise radar sensors and the Environmental Perception Messages are transmitted several times a second.\n | 15. The method of claim 1, wherein the driving maneuver comprises merging into a flow of vehicles, wherein the communicating with the other vehicles includes communicating with a plurality of vehicles presumed to be in a maneuvering area during the execution of the merging.\n | 16. The transportation vehicle of claim 10, wherein the driving maneuver comprises merging into a flow of vehicles, wherein the communicating with the other vehicles includes communicating with a plurality of vehicles presumed to be in a merging area during the execution of the merging.", "output": "The method involves determining a maneuvering area (28) of a road, in which the driving maneuver is potentially executed, and communicating with the vehicles by vehicle-to-vehicle communication. The filtering is carried out according to vehicles, which are relevant to carrying out the planned driving maneuver to detect the cooperation vehicles (26). The maneuvering vehicle own driving behavior is adapted to the presumable driving behavior of the cooperation vehicles to execute the planned driving maneuver. Method for autonomously or semi-autonomously carrying out a cooperative driving maneuver for vehicle (Claimed). The method involves determining a maneuvering area of a road, in which the driving maneuver is potentially executed, and communicating with the vehicles by vehicle-to-vehicle communication, and hence ensures safe and reliable driving maneuver carrying out method. 
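The message-format filtering at the core of the method above can be sketched as follows. The dictionary message layout, the "EPM" format tag, and the minimum-gap rule (vehicle length plus safety margins, echoing claim 4) are illustrative assumptions, not the patented data structures.

```python
# Hypothetical sketch: filter received V2V messages by message format and keep
# senders of Environmental Perception Messages (EPMs) whose reported free
# areas are large enough for the planned maneuver (claim 4's minimum size).

def find_cooperation_vehicles(messages, vehicle_length_m, safety_margin_m):
    """messages: list of dicts with 'sender', 'format' and, for EPMs,
    'free_area_m' (length of the reported free gap between vehicles).

    Returns the senders qualifying as cooperation vehicles.
    """
    # minimum free-area size: own length plus a margin on either side
    min_gap = vehicle_length_m + 2 * safety_margin_m
    return [m["sender"] for m in messages
            if m["format"] == "EPM" and m.get("free_area_m", 0.0) >= min_gap]
```

Senders of other formats (e.g. plain status messages) are dropped at this stage; the maneuvering vehicle then adapts its trajectory toward a suitable free area and re-checks executability cyclically as it approaches.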
The drawing shows a schematic representation of traffic situation. 26 Cooperation vehicles; 28 Maneuvering area; 30 Road; 34 Maneuvering vehicle; 36 Approach areas", "instruction": "Please summarize the input" }, { "input": "Method and control system for determining a traffic gap between two vehicles for changing lanes of a vehicleThe present invention relates to a vehicle-to-vehicle communication system and a method for determining a traffic gap between two vehicles for changing lanes of a vehicle. The method includes identifying (110) a traffic gap based on a first detection and based on a second detection. The first detection is based on at least one vehicle-to-vehicle status message of at least one other vehicle (200). The second detection is based on the on-board sensor system of the vehicle (100). \n\n|1. A method for determining a traffic gap between two vehicles for lane change of a vehicle (100), the method comprising: a step of identifying (110) a traffic gap based on a first detection and based on a second detection, wherein the first detection is based on at least one vehicle-to-vehicle status message of at least one other vehicle (200), and the second detection is based on an on-board sensor system of the vehicle (100);\ndetecting (155) that the identifying (110) did not identify a traffic gap; and transmitting (160) a driving intention message based on the step (155) of detecting that the identifying step (110) does not identify a traffic gap, the driving intention message including information on a request for a future lane change of the vehicle (100), wherein a rough detection of the traffic gap is performed in the first detection, and a precise detection of the traffic gap detected in the first detection is performed in the second detection. \n | 2. 
The method according to claim 1, wherein the at least one vehicle-to-vehicle status message includes information on at least one of a location and a trajectory of the at least one other vehicle (200), and the first detection is based on the information about at least one of the location and the trajectory of the at least one other vehicle (200).\n | 3. The method according to claim 1 or 2, wherein the identifying step (110) is further based on a third detection based on a vehicle-to-vehicle message comprising environmental information of the at least one other vehicle (200), and the environmental information is based on a sensor record of the environment of the at least one other vehicle (200) by at least one on-board sensor of the at least one other vehicle (200).\n | 4. The method of claim 1 or 2, further comprising: longitudinally adjusting (120) the vehicle parallel to the identified traffic gap; and transversely adjusting (130) the vehicle by changing lanes parallel to the identified traffic gap.\n | 5. The method of claim 4, wherein the longitudinal adjustment step (120) corresponds to the adjustment of the speed or position of the vehicle (100) in the driving direction, or the longitudinal adjustment step (120) includes providing a speed-time curve for an adaptive cruise control system, or the longitudinal adjustment step (120) includes displaying a longitudinal adjustment aid for the driver of the vehicle (100), or the vehicle (100) corresponds to an autonomous driving vehicle (100) and the longitudinal adjustment step (120) corresponds to the longitudinal control of the autonomous driving vehicle (100) based on the identified traffic gap.\n | 6. 
The method of claim 4, wherein the transverse adjustment step (130) corresponds to the adjustment of the position of the vehicle (100) in the horizontal direction with respect to the driving direction, or the lateral adjustment step (130) is performed when the position of the vehicle (100) has been set parallel to the identified traffic gap by the longitudinal adjustment step (120), or the lateral adjustment step (130) includes a driver-led automatic lane change, or the lateral adjustment step (130) includes the step of displaying a lateral direction adjustment assistance means for a driver of the vehicle (100), or the vehicle (100) corresponds to an autonomous vehicle (100) and the lateral adjustment step (130) corresponds to the lateral control of the self-driving vehicle (100).\n | 7. A method according to any of the preceding claims, further comprising the step (150) of determining the driving intention of a driver of the vehicle with respect to a lane change.\n | 8. The method of claim 7, further comprising transmitting (160) a driving intention message based on the step of determining the driving intention (150).\n | 9. A method for a vehicle (205), the method comprising: receiving (210) a driving intention message including a lane change request from an inquiry vehicle (100);\ndetecting (220) information on cooperation for cooperative driving control with the inquiry vehicle (100), the information on cooperation indicating, based on the driving intention message, whether the vehicle (205) is possible as a cooperation partner and whether a cooperative maneuver is possible in consideration of the traffic conditions; \nTo make it possible to calculate whether the interruption request can be met within the range of possible cooperation, information about at least one gap for at least one of the front vehicle and the rear vehicle is detected (232), and the driving control is performed. 
Determining (230) information about the driving control by detecting (234) the execution of driving control based on the information on the at least one interval, the speed of the vehicle (205) and the distance to the possible cooperation range, and by calculating (236) whether driving control is possible in consideration of the traffic situation; and providing (240) a driving assistance for executing driving control, wherein the method further comprises exchanging a vehicle-to-vehicle adjustment message for coordinating cooperative driving control with at least one other vehicle (200), and wherein, upon receipt of a message regarding acceptance of a lane change request from the at least one other vehicle (200), at least one of the detecting step (220), the determining step (230) and the providing step (240) is interrupted. \n | 10. The vehicle according to claim 9, wherein the providing (240) of the driving assistance corresponds to an automatic or semi-automatic execution of driving control, or the providing (240) of the driving assistance corresponds to the step of providing guidance, by means of a human-machine interface, for the driver of the vehicle (205) regarding the implementation of the driving maneuver.\n | 11. The method of claim 9 or 10, wherein the providing step further comprises providing a message regarding acceptance of a lane change request between the inquiry vehicle (100) and the at least one other vehicle (200).\n | 12. A control system (10) for a vehicle (100), formed to identify a traffic gap between two vehicles based on a first detection and a second detection, to detect that the identifying did not identify a traffic gap, and, if the identifying did not identify the traffic gap, to transmit a driving intent message containing information regarding a future lane change request of the vehicle (100). 
to adjust the vehicle (100) longitudinally parallel to the identified traffic gap, and to adjust the vehicle (100) laterally by changing lanes into the identified traffic gap, wherein the first detection is based on at least one vehicle-to-vehicle status message of at least one other vehicle (200) and the second detection is based on the on-board sensor system of the vehicle (100), a rough detection of the traffic gap being performed in the first detection and a precise detection of the traffic gap resulting from the first detection being performed in the second detection.\n | 13. A control system (20) for a vehicle (205), formed to receive a driving intention message including a lane change request from an inquiring vehicle (100); to detect information on cooperation for cooperative driving control with the inquiring vehicle (100), the information on cooperation indicating, based on the driving intention message, whether the vehicle (205) is available as a cooperation partner and whether the cooperative maneuver is possible taking the traffic conditions into account; to detect information about at least one gap to at least one of a front vehicle and a rear vehicle, so that it can be calculated whether the cut-in request can be met within the range of possible cooperation; to determine information about the driving maneuver by detecting the feasibility of the driving maneuver based on the information on the driving maneuver, the information on the at least one gap, the speed of the vehicle (205) and the distance to the range of possible cooperation, and by calculating whether the driving maneuver is possible in consideration of the traffic situation; to provide driving assistance for executing the driving maneuver; and to exchange a vehicle-to-vehicle coordination message for coordinating the cooperative driving maneuver with at least one other vehicle (200), wherein, upon receipt of a message regarding acceptance of a lane change request from the at least one other vehicle (200), at least one of the detecting (220), the determining (230) and the providing (240) is interrupted.", "output": "The method involves identifying (110) the traffic gap based on a first detection and based on a second detection. The first detection is based on a vehicle-to-vehicle status message of a vehicle (200). The second detection is based on an on-board sensor system of a vehicle (100). The vehicle-to-vehicle status message comprises information about a position and/or a trajectory of the vehicle (200). The first detection is based on the information about the position and/or the trajectory of the vehicle (200). An INDEPENDENT CLAIM is included for a control system for determining traffic gap between vehicles for lane change for vehicle. Method for determining traffic gap between vehicles for lane change for vehicle e.g. car. The traffic gap between vehicles for lane change for vehicle is determined effectively. The cooperative driving functions of the vehicle are supported efficiently. The drawings show the flow diagrams illustrating the process for determining traffic gap between vehicles for lane change for vehicle, and block diagram of the control system for determining traffic gap between vehicles for lane change for vehicle. 
(Drawing includes non-English language text) 100,200Vehicles110Step for identifying traffic gap120Step for performing longitudinal regulation corresponding to regulation of speed of vehicle130Step for performing transverse regulation for threading into selected gap150Step for determining driving intention of driver of vehicle", "instruction": "Please summarize the input" }, { "input": "METHOD FOR RESOURCE ALLOCATION IN A MOBILE COMMUNICATION SYSTEM AND BASE STATION, AND PARTICIPANT COMMUNICATION MODULE FOR THE USE IN THE METHODFor the scenario of vehicles (30) equipped with wireless communication modules (31) that communicate directly with each other on public roads, whether for a cooperative or an autonomous driving scenario, very high reliability is essential. With LTE-V, the 3GPP standardization organization has specified a technique called sidelink communication with which the direct communication between cars is possible in the LTE frequency bands. The resources are scheduled in a base station (20) of a mobile communication cell. Since different mobile communication providers are available, the problem arises of how to enable participants of different providers to communicate with each other for a cooperative awareness traffic scenario with LTE-V communication. The solution proposed is that each provider will assign a dedicated spectrum (V, T, E, O) that is controlled by each provider itself for resource allocation for its own participants and the participants of other providers. The resource allocation management functionality for the direct communication among the participants from the plurality of providers is shifted from provider to provider from time slice (t_0, t_1, t_2, t_3) to time slice (t_0, t_1, t_2, t_3). This provides for a fair distribution of the resource management functionality among the different providers. 
At the same time, it avoids the provision of multiple transceiver chains in the communication modules with which the vehicles are equipped.|1. Method for resource allocation in a mobile communication system, comprising a plurality of base stations (20) from a plurality of mobile communication providers and a plurality of participants from the plurality of mobile communication providers, wherein each provider has assigned a dedicated spectrum (V, T, E, O) for resource allocation for its own participants, wherein the participants from the plurality of providers communicate directly among each other, wherein a given provider allocates a part (V2V) of its dedicated spectrum for the direct communication among the participants from the plurality of providers, \n * wherein either said given provider will schedule the resources in the part (V2V) of the dedicated spectrum (V, T, E, O) for its own participants and the participants of the other providers by means of a scheduler (225) in a provider-owned base station (20), or\n * 
wherein the part (V2V) of a dedicated spectrum (V, T, E, O) of said given provider for the direct communication among the participants from the plurality of providers is divided into sections (V2V_V, V2V_T, V2V_E, V2V_O), with each provider of the plurality of providers having been assigned at least one section (V2V_V, V2V_T, V2V_E, V2V_O) of said part (V2V) of the dedicated spectrum (V, T, E, O) of the given provider, and where a base station (20T, 20V) of each of the plurality of providers other than said given provider will schedule the resources in its assigned section (V2V_V, V2V_T, V2V_E, V2V_O) of said dedicated spectrum (V,T, E, O) for the direct communications of its own participants, wherein the resource allocation management functionality for allocating a part of its dedicated spectrum for the direct communication among the participants from the plurality of providers is shifted from provider to provider from time slice (t_0, t_1, t_2, t_3) to time slice (t_0, t_1, t_2, t_3).\n | 2. Method according to claim 1, wherein the resource allocation functionality is shifted from provider to provider from time slice (t_0, t_1, t_2, t_3) to time slice (t_0, t_1, t_2, t_3) in a round robin fashion, maximum rate queuing fashion or proportionally fair queuing fashion.\n | 3. Method according to claim 1 or 2, wherein each provider announces to all other providers which part (V2V) of its dedicated spectrum (V, T, E, O) is reserved for the direct communication among the participants from the plurality of providers.\n | 4. Method according to claim 3, wherein each provider announces to its own participants which section of the announced part (V2V) of the dedicated spectrum (V, T, E, O) is reserved for the direct communication among its own participants.\n | 5. 
Method according to claim 3 or 4, wherein each provider will schedule resources in its section (V2V_V, V2V_T, V2V_E, V2V_O) of the part (V2V) of the dedicated spectrum (V, T, E, O) for its own participants by means of a scheduler in said provider-owned base station (20).", "output": "The method involves providing base stations from multiple mobile communication providers and multiple participants from the multiple mobile communication providers in which each provider has assigned a dedicated spectrum (V,T,E,O) for resource allocation for its own participants and participants from the providers communicate directly among each other in particular with cooperative awareness messages. The resource allocation management functionality for the direct communication among the participants from the multiple providers is shifted from provider to provider from time slice (t-0-t-3) to time slice. INDEPENDENT CLAIMS are included for the following: a participant communication module; and a base station. Method for resource allocation in mobile communication system. The wireless vehicle communication network can help to reduce the weight of the vehicle by eliminating the need to install cables between the components which communicate. The drawing shows a schematic view illustrating how a portion of a dedicated spectrum in the LTE frequency bands which is allocated for communication is shifted from provider spectrum to provider spectrum per time slice. V,T,E,OSpectrumt-0-t-3Time slice", "instruction": "Please summarize the input" }, { "input": "METHOD FOR PLANNING A COOPERATIVE DRIVING MANEUVER, CORRESPONDING CONTROL UNIT AND VEHICLE EQUIPPED WITH A CONTROL UNIT AS WELL AS COMPUTER PROGRAMThe proposal concerns a method for planning a cooperative driving maneuver which may be used in the scenario of cooperative driving or autonomous driving. 
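The per-time-slice rotation of the resource-management role described in the resource-allocation claims above can be sketched in Python (a minimal illustration; the provider labels, slice length and times are invented assumptions, not taken from the patent):

```python
# Round-robin rotation of the shared V2V resource-management role among
# providers per time slice, as described in the resource-allocation claims.
# Provider labels and the slice length are illustrative assumptions.

from typing import List

def managing_provider(providers: List[str], t: float, slice_len: float) -> str:
    """Return the provider that controls the shared V2V spectrum part
    during the time slice containing time t (round-robin fashion)."""
    slice_index = int(t // slice_len)
    return providers[slice_index % len(providers)]

providers = ["V", "T", "E", "O"]  # one label per dedicated spectrum V, T, E, O
# During t_0 provider V manages the shared V2V part, during t_1 provider T, etc.,
# and after all providers have had a turn the rotation wraps around.
schedule = [managing_provider(providers, t, 1.0) for t in (0.0, 1.0, 2.0, 3.0)]
```

Claim 2 also allows maximum-rate or proportionally fair queuing instead of round robin; only the slice-to-provider mapping function would change.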
The method comprises the steps of observing the surroundings of a vehicle (10A), determining a planned trajectory (PT) the vehicle (10A) drives on for a certain amount of time, determining a desired trajectory (DT) different from the planned trajectory (PT) requiring a cooperative driving maneuver with at least one of the surrounding vehicles (10B, 10C). The solution according to the invention comprises the steps of determining a timeout value for the cooperative driving maneuver, starting a negotiation phase with the vehicles (10B, 10C) involved in the cooperative driving maneuver by sending a cooperative driving maneuver request message (MCM), waiting for the response messages from the involved vehicles (10B, 10C) and changing to the desired trajectory (DT) if the involved vehicles (10B, 10C) have accepted the desired trajectory (DT) before the negotiation phase has expired according to the timeout value.|1. Method for planning a cooperative driving maneuver, comprising the steps of observing the surroundings of a vehicle (10A), determining a planned trajectory (PT) the vehicle (10A) drives on for a certain amount of time, determining a desired trajectory (DT) different from the planned trajectory (PT) requiring a cooperative driving maneuver with at least one of the surrounding vehicles (10B, 10C), characterized by the steps of determining a timeout value (TO) for the cooperative driving maneuver, starting a negotiation phase with the vehicles (10B, 10C) involved in the cooperative driving maneuver by sending a maneuver coordination message (MCM), waiting for the response messages from the involved vehicles (10B, 10C) and changing to the desired trajectory (DT) if the involved vehicles (10B, 10C) have accepted the desired trajectory (DT) before the negotiation phase has expired according to the timeout value (TO).\n | 2. 
Method according to claim 1, further comprising a step of determining a branch point (BP) corresponding to a point lying on the planned trajectory (PT) and the desired trajectory (DT) at which the planned (PT) and the desired trajectory (DT) separate and checking if the vehicle (10A) will reach the branch point (BP) before the negotiation phase is over according to the determined timeout value (TO) and if yes, terminating the planning of the cooperative driving maneuver and not sending out said maneuver coordination message (MCM).\n | 3. Method according to claim 1 or 2, wherein for the step of determining a timeout value (TO) a step of determining the number of vehicles involved in the cooperative driving maneuver is performed and wherein the typical one-way trip time required for sending a message from one vehicle to another multiplied by the number of vehicles involved in the cooperative driving maneuver is added to the typical time for deciding on the acceptance or rejection of the cooperative driving maneuver to calculate the negotiation time (NT) for the cooperative driving maneuver.\n | 4. Method according to claim 3, wherein the typical round trip time for the internal network transfer in the vehicle (10A) having sent out the maneuver coordination message (MCM) is added to the negotiation time (NT) in order to determine the total negotiation time.\n | 5. Method according to claim 3 or 4, wherein in the vehicle (10A) having sent out the maneuver coordination message (MCM) the typical one-way trip time required for sending a message from one vehicle to another is adapted to the current estimation of the quality of service of the vehicle-to-vehicle radio communication system.\n | 6. 
Method according to one of claims 3 to 5, wherein in the vehicle (10A) having sent out the cooperative driving maneuver request message the timeout value (TO) is set to the negotiation time (NT) when it is found that the requesting vehicle (10A) will reach the branch point (BP) before the negotiation time (NT) is over.\n | 7. Method according to one of the previous claims, wherein the timeout value (TO) is entered into the payload field of the maneuver coordination message (MCM) to inform the involved vehicles (10B, 10C) about the timeout value for the negotiation phase of the cooperative driving maneuver.\n | 8. Method according to one of the previous claims, wherein the planned trajectory (PT) and the desired trajectory (DT) is entered into the payload field of the maneuver coordination message (MCM) to inform the involved vehicles (10B, 10C) about the planned cooperative driving maneuver.\n | 9. Method according to one of claims 6 to 8, wherein an involved vehicle (10B, 10C) performs a step of checking the timeout value (TO) in the received maneuver coordination message (MCM) and when it finds that the typical time for deciding on the acceptance or rejection of the cooperative driving maneuver is longer than the reported timeout value (TO), the involved vehicle (10B, 10C) will stop negotiating about the cooperative driving maneuver and transmit back to the requesting vehicle (10A) a message in which the cooperative driving maneuver is rejected.\n | 10. Computing unit, characterized in that, the computing unit (180) is adapted to perform the steps of one of the previous claims.\n | 11. Vehicle, characterized in that, the vehicle (30) is equipped with a computing unit (180) according to claim 10.\n | 12. 
Computer program, characterized in that, the computer program comprises program steps, which when the program is processed by a computing unit (180), cause it to carry out the method according to one of claims 1 to 9.", "output": "The method involves observing the surroundings of a vehicle (10A). A planned trajectory (PT) is determined that the vehicle drives on for a certain amount of time. A desired trajectory (DT) is determined that is different from the planned trajectory, requiring a cooperative driving maneuver with one of the surrounding vehicles (10B, 10C, 10D). A negotiation phase is started with the vehicles involved in the cooperative driving maneuver in response to the steps of determining a timeout value for the cooperative driving maneuver by sending a maneuver coordination message. Waiting is done for the response messages from the involved vehicles and changes are made to the desired trajectory if the involved vehicles have accepted the desired trajectory before the negotiation phase has expired according to the timeout value. INDEPENDENT CLAIMS are included for the following: a computing unit; a vehicle; and a computer program for planning a cooperative driving maneuver. Method for planning a cooperative driving maneuver in a vehicle e.g. mobile robot and driverless transport system, that is utilized in a motorway. Improves efficiency and comfort of automated driving. Ensures simple, reliable and efficient solution for cooperative driving maneuvers supported by vehicle-to-vehicle communication. The drawing shows a schematic view of the cooperative driving scenario. 10AVehicle10B, 10C, 10DSurrounding vehiclesDTDesired trajectoryPTPlanned trajectory", "instruction": "Please summarize the input" }, { "input": "System and method for using global electorate using regional certificate trust listThe invention claims a system, method and component for managing trust of a plurality of root certificate authorities (CA) using both electors and a regional certificate trust list (CTL). 
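The negotiation-time computation in claims 3 and 4 of the maneuver-coordination method above reduces to simple arithmetic: the per-hop one-way trip time multiplied by the number of involved vehicles, plus the decision time, plus (for the total) the in-vehicle round-trip time. The sketch below illustrates it; all timing values are invented example numbers, not from the source:

```python
# Negotiation-time arithmetic following claims 3 and 4 above:
#   NT    = n_vehicles * one_way_trip_time + decision_time
#   total = NT + internal_round_trip_time
# The numeric values used below are illustrative assumptions.

def negotiation_time(n_vehicles: int, one_way_trip_s: float, decision_s: float) -> float:
    """Negotiation time NT for the cooperative maneuver (claim 3)."""
    return n_vehicles * one_way_trip_s + decision_s

def total_negotiation_time(nt_s: float, internal_rtt_s: float) -> float:
    """Total negotiation time: NT plus the vehicle-internal round trip (claim 4)."""
    return nt_s + internal_rtt_s

def maneuver_feasible(time_to_branch_point_s: float, total_s: float) -> bool:
    """Claims 2 and 6: abort if the branch point is reached before negotiation ends."""
    return time_to_branch_point_s > total_s

nt = negotiation_time(n_vehicles=2, one_way_trip_s=0.05, decision_s=0.2)  # ~0.3 s
total = total_negotiation_time(nt, internal_rtt_s=0.02)                   # ~0.32 s
```

Per claim 5, `one_way_trip_s` would in practice be adapted to the current quality-of-service estimate of the V2V radio link rather than being a constant.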
Accordingly, a system and method are provided for managing trust of multiple root CAs in a more effective manner than is traditionally known or used for such management. More specifically, the invention claims a system and a method for realizing V2I and/or V2X PKI technology.|1. A system for managing trust of a plurality of root certificate issuing mechanisms, for use in vehicle-to-vehicle (V2V) communication between at least two of a plurality of transport vehicles in the form of continuous broadcast of basic safety messages (BSM), the system comprising: a transport vehicle device located on a transport vehicle of the plurality of transport vehicles, the transport vehicle device comprising a transceiver and at least one processor controlling the transceiver, wherein the at least one processor is configured to control the transceiver so as to provide V2V communication through at least one communication link between the transport vehicle device of the transport vehicle and the transport vehicle devices of other transport vehicles in the plurality of transport vehicles, wherein the communication link is provided by a direct radio link or through communication of a mobile radio network, wherein each BSM is digitally signed using the root certificate associated with the transport vehicle sending the BSM, the root certificate being used for protecting the transmission of the BSM on the communication link; and at least one area root certificate issuing mechanism of a plurality of area root certificate issuing mechanisms, the area root certificate issuing mechanism determining whether the identity of the root certificate associated with the corresponding transport vehicle in the plurality of transport vehicles is legitimate for the jurisdiction of the at least one area issuing mechanism.\n | 2. 
The system according to claim 1, wherein at least two of the plurality of region awarding mechanisms use a certificate trust list that includes at least one common root certificate identified as a legitimate root certificate in each of at least two of a plurality of respective jurisdictions.\n | 3. The system according to claim 2, wherein each region awarding mechanism is configured for modifying a certificate trust list listing legal certificates in its jurisdiction using an elector-based root management.\n | 4. The system according to claim 3, wherein the elector-based root management is performed using a ballot with endorsement, wherein an area awarding mechanism seeking to add or delete a root certificate in the certificate trust list asks a majority of the electors identified by the area awarding mechanism to sign the ballot, so as to endorse or revoke the root certificate.\n | 5. The system according to claim 1, wherein the plurality of region awarding mechanisms are associated with respective jurisdictions that share a common border.\n | 6. The system according to claim 1, wherein each of the root certificate issuing mechanisms issues a digital certificate including a root certificate, wherein the digital certificate certifies the ownership of the public key through the named subject of the digital certificate.\n | 7. The system according to claim 1, wherein each of the BSMs comprises data specific to a transport vehicle, the data including a time, a position, a speed, and a forward direction of the transport vehicle transmitting the BSM.\n | 8. The system according to claim 1, wherein the digital signature of the BSM serves as an attestation of the correctness and reliability of the data contained in the BSM.\n | 9. 
The system according to claim 1, wherein a digital signature of the BSM is analyzed by a transport vehicle safety application before the data of the BSM is accessed by one or more transport vehicle safety applications on a transport vehicle, among the plurality of transport vehicles, that receives the BSM through V2V communication.\n | 10. The system according to claim 9, wherein the one or more transport vehicle safety applications are autonomous driving or assisted driving applications.\n | 11. The system according to claim 1, wherein a digital signature of the BSM is analyzed by a transport vehicle safety application before the data of the BSM is received by one or more transport vehicle safety applications on a transport vehicle, among the plurality of transport vehicles, that receives the BSM through V2V communication.\n | 12. The system according to claim 11, wherein the one or more transport vehicle safety applications are autonomous driving or assisted driving applications.\n | 13. A method for managing trust of a plurality of root certificate issuing mechanisms, for use in vehicle-to-vehicle (V2V) communication between at least two of a plurality of transport vehicles in the form of continuous broadcast of basic safety messages (BSM), the method comprising: controlling the transmission of the V2V communication from a transport vehicle device located on a transport vehicle of the plurality of transport vehicles, the transport vehicle device comprising a transceiver and at least one processor controlling the transceiver, wherein the at least one processor controls the transceiver so as to provide V2V communication through at least one communication link between the transport vehicle device of the transport vehicle and the transport vehicle devices of other transport vehicles in the plurality of transport vehicles, wherein the communication link is provided by a direct radio link or through communication of a mobile radio network; wherein the method uses the 
root certificate associated with the transport vehicle sending the BSM to digitally sign each BSM, the root certificate being used for protecting the transmission of the BSM on the communication link; wherein at least one of a plurality of area root certificate issuing mechanisms determines whether the identity of the root certificate associated with the corresponding transport vehicle in the plurality of transport vehicles is legitimate for the jurisdiction of the at least one area issuing authority.\n | 14. The method according to claim 13, wherein at least two of the plurality of region awarding mechanisms use a certificate trust list that includes at least one common root certificate identified as a legitimate root certificate in each of at least two of a plurality of respective jurisdictions.\n | 15. The method according to claim 14, wherein each region awarding mechanism uses elector-based root management to modify the certificate trust list listing the legal certificates in its jurisdiction.\n | 16. The method according to claim 15, wherein the elector-based root management is performed using a ballot with endorsement, wherein an area awarding mechanism seeking to add or delete a root certificate in the certificate trust list asks a majority of the electors identified by the area awarding mechanism to sign the ballot, so as to endorse or revoke the root certificate.\n | 17. The method according to claim 13, wherein the plurality of region awarding mechanisms are associated with respective jurisdictions that share a common border.\n | 18. The method according to claim 13, wherein each of the root certificate issuing mechanisms issues a digital certificate including a root certificate, wherein the digital certificate certifies the ownership of the public key through the named subject of the digital certificate.\n | 19. 
The method according to claim 13, wherein each of the BSMs comprises data specific to a transport vehicle, the data including a time, a position, a speed, and a forward direction of the transport vehicle transmitting the BSM.\n | 20. The method according to claim 13, wherein the digital signature of the BSM serves as an attestation of the correctness and reliability of the data contained in the BSM.\n | 21. The method according to claim 13, wherein a digital signature of the BSM is analyzed by a transport vehicle safety application before the data of the BSM is accessed by one or more transport vehicle safety applications on a transport vehicle, among the plurality of transport vehicles, that receives the BSM through V2V communication.\n | 22. The method according to claim 21, wherein the one or more transport vehicle safety applications are autonomous driving or assisted driving applications.\n | 23. The method according to claim 13, wherein a digital signature of the BSM is analyzed by a transport vehicle safety application before the data of the BSM is received by one or more transport vehicle safety applications on a transport vehicle, among the plurality of transport vehicles, that receives the BSM through V2V communication.\n | 24. The method according to claim 23, wherein the one or more transport vehicle safety applications are autonomous driving or assisted driving applications.\n | 25. 
A non-transitory computer-readable medium comprising a computer program with computer software code instructions which, when executed by at least one computer processor, carry out a method for managing trust of a plurality of root certificate issuing mechanisms, for use in vehicle-to-vehicle (V2V) communication between at least two of a plurality of transport vehicles in the form of continuous broadcast of basic safety messages (BSM), the method comprising: controlling the transmission of the V2V communication from a transport vehicle device located on a transport vehicle of the plurality of transport vehicles, the transport vehicle device comprising a transceiver and at least one processor controlling the transceiver, wherein the at least one processor controls the transceiver so as to provide V2V communication through at least one communication link between the transport vehicle device of the transport vehicle and the transport vehicle devices of other transport vehicles in the plurality of transport vehicles, wherein the communication link is provided by a direct radio link or through communication of a mobile radio network; wherein the method uses the root certificate associated with the transport vehicle sending the BSM to digitally sign each BSM, the root certificate being used for protecting the transmission of the BSM on the communication link; wherein at least one of a plurality of area root certificate issuing mechanisms determines whether the identity of the root certificate associated with the corresponding transport vehicle in the plurality of transport vehicles is legitimate for the jurisdiction of the at least one area issuing authority.\n | 26. 
The non-transitory computer-readable medium according to claim 25, wherein at least two of the plurality of area issuing mechanisms use a certificate trust list, the certificate trust list including at least one common root certificate that is identified as a legitimate root certificate in each of at least two of a plurality of corresponding jurisdictions.\n | 27. The non-transitory computer-readable medium according to claim 26, wherein each region awarding mechanism modifies a certificate trust list listing legal certificates in its jurisdiction using elector-based root management.\n | 28. The non-transitory computer-readable medium according to claim 27, wherein the elector-based root management is performed using a ballot with endorsement, wherein an area issuing mechanism seeking to add or delete a root certificate in the certificate trust list asks a majority of the electors identified by the area issuing mechanism to sign the ballot, so as to endorse or revoke the root certificate.\n | 29. The non-transitory computer-readable medium according to claim 25, wherein the plurality of region awarding mechanisms are associated with respective jurisdictions that share a common border.\n | 30. The non-transitory computer-readable medium according to claim 25, wherein the root certificate authority issues a digital certificate including a root certificate, wherein the digital certificate certifies the ownership of the public key through the named subject of the digital certificate.\n | 31. The non-transitory computer-readable medium according to claim 25, wherein the BSM includes data specific to a transport vehicle, the data including a time, a position, a speed, and a forward direction of the transport vehicle transmitting the BSM.\n | 32. 
The non-transitory computer-readable medium according to claim 25, wherein the digital signature of the BSM serves as an attestation of the correctness and reliability of the data contained in the BSM.\n | 33. The non-transitory computer-readable medium according to claim 25, wherein a digital signature of the BSM is analyzed by a transport vehicle safety application before the data of the BSM is accessed by one or more transport vehicle safety applications on a transport vehicle, among the plurality of transport vehicles, that receives the BSM through V2V communication.\n | 34. The non-transitory computer-readable medium according to claim 25, wherein the one or more transport vehicle safety applications are autonomous driving or assisted driving applications.\n | 35. The non-transitory computer-readable medium according to claim 25, wherein a digital signature of the BSM is analyzed by a transport vehicle safety application before the data of the BSM is received by one or more transport vehicle safety applications on a transport vehicle, among the plurality of transport vehicles, that receives the BSM through V2V communication.\n | 36. The non-transitory computer-readable medium according to claim 25, wherein the one or more transport vehicle safety applications are autonomous driving or assisted driving applications.", "output": "The system has a transportation vehicle equipment that is located on a transportation vehicle of multiple transportation vehicles. The transportation vehicle equipment includes a transceiver and a processor controlling the transceiver. The processor is configured to control the transceiver to provide V2V communication through a communication link between the transportation vehicle equipment of the transportation vehicle and the transportation vehicle equipment of other transportation vehicles of the multiple transportation vehicles. 
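The elector-based certificate-trust-list management described in the claims above (a root CA is endorsed or revoked only when a majority of the electors recognized by a regional authority sign the ballot) can be sketched as follows; the elector and CA names are invented for illustration:

```python
# Minimal sketch of elector-based CTL management per the claims above:
# a root certificate is added to or removed from a regional certificate
# trust list only if a majority of recognized electors sign the ballot.
# Elector identifiers and CA names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Set

@dataclass
class RegionalCTL:
    electors: Set[str]                          # electors recognized by this regional authority
    trusted_roots: Set[str] = field(default_factory=set)

    def _majority(self, signatures: Set[str]) -> bool:
        valid = signatures & self.electors      # count only recognized electors
        return len(valid) > len(self.electors) / 2

    def endorse(self, root_ca: str, signatures: Set[str]) -> bool:
        """Add root_ca to the CTL if the ballot carries a majority of elector signatures."""
        if self._majority(signatures):
            self.trusted_roots.add(root_ca)
            return True
        return False

    def revoke(self, root_ca: str, signatures: Set[str]) -> bool:
        """Remove root_ca from the CTL under the same majority rule."""
        if self._majority(signatures):
            self.trusted_roots.discard(root_ca)
            return True
        return False

ctl = RegionalCTL(electors={"E1", "E2", "E3"})
ctl.endorse("RootCA-A", {"E1", "E2"})   # 2 of 3 electors sign: accepted
ctl.endorse("RootCA-B", {"E1"})         # 1 of 3: rejected
```

Per claim 2, two regional authorities sharing a border could each list such a common root certificate in their own CTLs, so vehicles crossing the border keep a valid trust anchor.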
The communication link is provided through either a direct radio link or communication over a mobile radio network. A regional root CA of multiple regional root CAs dictates whether identities that associate root certificates with respective transportation vehicles of multiple transportation vehicles are legitimate for the jurisdiction of the regional authority. INDEPENDENT CLAIMS are included for the following: a method for managing trust across multiple root CA in V2V communication; and a non-transitory computer readable medium storing a program for managing trust across multiple root CA in V2V communication. System for managing trust across multiple root certificate authorities (CA) in vehicle-to-vehicle (V2V) communication between transportation vehicles in the form of continuous broadcast of basic safety messages (BSM). The system manages trust across multiple root CA using both electors and regional certificate trust lists (CTL) in an inventive way. The drawing shows the schematic diagram of the system for managing trust across multiple root CA in V2V communication. 110,120Jurisdictions115Credential management system117,127CTL125Security credential management system manager", "instruction": "Please summarize the input" }, { "input": "Method for planning the track of the vehicleA method for operating a navigation system that guides the driver of an autonomous vehicle to a destination along a selected route path, comprising: obtaining information about vehicles in an area around the autonomous vehicle, determining trajectories of the vehicles based on the obtained information, and comparing the trajectories of the vehicles with the selected route path. 
If it is determined that one of the surrounding vehicles has been traveling, is currently traveling, or will be traveling along the selected route path, an instruction to follow that vehicle is generated and output to the driver of the autonomous vehicle.|1. A method for operating a navigation system that guides a driver of an autonomous vehicle to a desired destination along a selected route path, the method comprising: obtaining information about vehicles in a region around the autonomous vehicle; determining trajectories of the vehicles around the autonomous vehicle based on the obtained information; comparing the trajectories of the vehicles around the autonomous vehicle with the selected route path; and, if it is determined that one of the vehicles around the autonomous vehicle: - has been traveling along the selected route path, - is currently traveling along the selected route path, or - will be traveling along the selected route path, generating and outputting to the driver of the autonomous vehicle an instruction to follow the one vehicle.\n | 2. The method according to claim 1, wherein the step of obtaining information about the other vehicles comprises: detecting at least one of a position, a speed, a heading, a turn signal and/or a lane assignment of at least one of the vehicles around the autonomous vehicle using sensor data generated by at least one sensor, and determining the trajectory of the vehicles around the autonomous vehicle based on the at least one detected position, speed, heading, turn signal and/or lane assignment.\n | 3. 
The method according to claim 1 or 2, wherein the step of obtaining information about the vehicles around the autonomous vehicle comprises detecting at least one of a color, a brand, a manufacturer, a turn signal and/or a type of at least one of the vehicles around the autonomous vehicle, and generating the instruction such that the instruction includes at least one of the detected color, brand, manufacturer, turn signal and type of the vehicle to be followed.\n | 4. The method according to any one of the preceding claims, wherein the step of obtaining information about the vehicles around the autonomous vehicle includes receiving data from the vehicles at the autonomous vehicle using a vehicle-to-vehicle (V2V) interface.\n | 5. The method according to any one of the preceding claims, wherein the instruction is output to the driver using an audible signal and/or an optical signal.\n | 6. The method according to any one of the preceding claims, further comprising transmitting the obtained information about the vehicles around the autonomous vehicle to a server, wherein the server performs the step of determining the trajectories of the vehicles around the autonomous vehicle.\n | 7. The method according to claim 6, further comprising transmitting the selected route path to the server, wherein the server performs the step of comparing the trajectories of the vehicles around the autonomous vehicle with the selected route path.\n | 8. A program implementing the method according to at least one of claims 1 to 7 when the program is executed on a computer.\n | 9. The program according to claim 8, wherein the program is stored on a non-transitory computer-readable medium accessible by the server.\n | 10. 
A navigation system (1) for an autonomous vehicle, the system comprising: a route selection unit (11) for selecting a route path for the autonomous vehicle to a destination; a sensor unit (12) for obtaining sensor data about vehicles in a region around the autonomous vehicle from a plurality of sensors; a processing unit (13) for determining trajectories of the vehicles around the autonomous vehicle and for comparing the determined trajectories with the selected route path; an instruction unit (14) for generating instructions for following the route path, wherein, if the trajectory of one vehicle around the autonomous vehicle matches the selected route path, the instruction unit is configured to generate an instruction to follow the one vehicle; and an output unit (15) for outputting the instruction to the driver of the autonomous vehicle to follow the one vehicle.\n | 11. The system (1) according to claim 10, wherein the plurality of sensors comprises at least one of a camera, a radar, a lidar, an inertial measurement unit (IMU) and a GNSS receiver for receiving position coordinates of the autonomous vehicle from a global navigation satellite system (GNSS).\n | 12. The system (1) according to claim 10 or 11, wherein the output unit (15) includes a head-up display.\n | 13. The system (1) according to at least one of claims 10 to 12, further comprising a receiving unit (16) for receiving information about the vehicles in the region around the autonomous vehicle from the other vehicles, wherein the processing unit is configured to determine the trajectories based on the received information and/or the instruction unit is configured to generate the instruction based on the received information.\n | 14. 
The system (1) according to at least one of claims 10 to 13, further comprising a server configured to communicate and exchange data with at least one of the route selection unit (11), the sensor unit (12), the processing unit (13), the instruction unit (14) and the output unit (15).\n | 15. The system (1) according to at least one of claims 10 to 14, wherein the processing unit (13) is located at the server.", "output": "The method involves obtaining information about vehicles in a region around the ego vehicle. Determine trajectories of the vehicles around the ego vehicle based on the obtained information. Compare the trajectories of the vehicles around the ego vehicle with the selected route path of the ego vehicle. If it is determined that one of the vehicles around the ego vehicle was driving, is currently driving, or will be driving along the selected route path, then generate and output an instruction to the driver of the ego vehicle to follow the one vehicle. Detect one of a position, a velocity, a heading, a turn signal and a lane assignment of one of the vehicles around the ego vehicle using sensor data generated by one sensor. INDEPENDENT CLAIMS are included for the following: a program implementing a method; and a navigation system for an ego vehicle. Method for operating a navigation system for guiding a driver of an ego vehicle to a desired destination along a selected route path. The server does not need to be a single centrally managed piece of hardware but may be implemented as a cloud computing network with the advantage of redundant components and simplified maintenance. The drawing shows a block representation of a navigation system. 
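The trajectory-to-route comparison recited in the navigation claims above can be sketched as follows; the 2-D point representation and the distance threshold are illustrative assumptions, not part of the claims:

```python
from math import hypot

# Illustrative sketch of the claimed comparison: a surrounding vehicle's
# observed trajectory is matched against the ego vehicle's selected
# route path. Points are (x, y) in meters; threshold is an assumption.
def matches_route(trajectory, route, threshold=5.0):
    """True if every trajectory point lies within `threshold` meters of
    some point on the selected route path."""
    return all(
        min(hypot(tx - rx, ty - ry) for rx, ry in route) <= threshold
        for tx, ty in trajectory
    )

def vehicle_to_follow(trajectories, route):
    """Return the id of the first surrounding vehicle whose trajectory
    matches the selected route path, or None if there is no match."""
    for vehicle_id, trajectory in trajectories.items():
        if matches_route(trajectory, route):
            return vehicle_id
    return None
```

A matching vehicle id would then drive the claimed "follow that vehicle" instruction output to the driver.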
11 Routing unit; 12 Sensor unit; 13 Processing unit; 14 Instruction unit; 16 Reception unit", "instruction": "Please summarize the input" }, { "input": "Dynamically placing an internet protocol anchor point based on a user device and/or an application. A device determines whether an application, utilized by a user device and associated with a network, is a low latency application or a best effort application. The device designates a first network device or a second network device as a designated network device to be an IP anchor point for the application based on a set of rules. The first network device is designated as the designated network device when the application is the low latency application, or the second network device is designated as the designated network device when the application is the best effort application. The device provides, to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application, and provides, to the network, information instructing the network to utilize the designated network device as the IP anchor point for the application. What is claimed is: \n | 1. 
A device, comprising: \none or more memories; and \none or more processors, communicatively coupled to the one or more memories, to: \nreceive information indicating that a user device is utilizing an application associated with a network; \ndetermine whether the application is a low latency application or a best effort application; \ndesignate a first network device of the network or a second network device of the network as a designated network device to be an Internet protocol (IP) anchor point for the application based on a set of rules and based on determining whether the application is the low latency application or the best effort application, \nwherein the first network device is designated as the designated network device to be the IP anchor point for the application when the application is the low latency application, or \nwherein the second network device is designated as the designated network device to be the IP anchor point for the application when the application is the best effort application, \nwherein the set of rules includes two or more of: \na rule indicating that the IP anchor point is to be as close to the user device as possible, \na rule indicating that the IP anchor point is to include a threshold amount of processing resources and memory resources, \na rule indicating that the IP anchor point is to be associated with a serving base station, \na rule indicating a timing advance distance between the IP anchor point and the user device, or \na rule indicating an operational pathloss between the IP anchor point and the user device, and \nthe one or more processors, when designating the first network device as the designated network device, are to: \napply a weight to each rule, of the set of rules, to generate a weighted set of rules, \ndetermine scores for a plurality of first network devices based on the weighted set of rules, and \nselect the first network device, from the plurality of first network devices, based on a score for the 
first network device being greater than scores associated with one or more other network devices from the scores for the plurality of first network devices; \nprovide, to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application; and \nprovide, to the network, information instructing the network to utilize the designated network device as the IP anchor point for the application to permit the user device to utilize the designated network device as the IP anchor point for the application. \n | 2. The device of claim 1, wherein the one or more processors are further to: \nprovide, to the network, information instructing the network to utilize a third network device of the network as a control plane anchor point for the application of the user device. \n | 3. The device of claim 1, wherein: \nthe first network device is a user plane function (UPF) device provided at an edge of the network, and \nthe second network device is a UPF device provided at a central location of the network. \n | 4. The device of claim 1, wherein: \nthe low latency application includes one or more of: \nan autonomous driving application, \na real-time vehicle-to-vehicle (V2V) communication application, or \nan application that delivers video; and \nthe best effort application includes one or more of: \nan application to enable a web download, or \nan application to access the Internet. \n | 5. The device of claim 1, wherein the one or more processors are further to: \nreceive, from the user device, information indicating that a user stopped utilizing the application; and \nprovide, to the network, information instructing the network to stop utilizing the designated network device as the IP anchor point for the application. \n | 6. 
A non-transitory computer-readable medium storing instructions, the instructions comprising: \none or more instructions that, when executed by one or more processors of a device, cause the one or more processors to: \nreceive information indicating that a user device is utilizing an application associated with a network; \ndetermine whether the application is a low latency application or a best effort application; \ndesignate a first network device of the network or a second network device of the network as a designated network device to be an Internet protocol (IP) anchor point for the application based on a set of rules and based on determining whether the application is the low latency application or the best effort application, \nthe first network device being designated as the designated network device to be the IP anchor point for the application when the application is the low latency application, or \nthe second network device being designated as the designated network device to be the IP anchor point for the application when the application is the best effort application, \nwherein the set of rules includes two or more of: \na rule indicating that the IP anchor point is to be as close to the user device as possible, \na rule indicating that the IP anchor point is to include a threshold amount of processing resources and memory resources, \na rule indicating that the IP anchor point is to be associated with a serving base station, \na rule indicating a timing advance distance between the IP anchor point and the user device, or \na rule indicating an operational pathloss between the IP anchor point and the user device, and \nthe one or more instructions, that cause the one or more processors to designate the first network device as the designated network device, cause the one or more processors to: \napply a weight to each rule, of the set of rules, to generate a weighted set of rules, \ndetermine scores for a plurality of first network devices based on 
the weighted set of rules, and \nselect the first network device, from the plurality of first network devices, based on a score for the first network device being greater than scores associated with one or more other network devices from the scores for the plurality of first network devices; \nprovide, to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application; and \nprovide, to the network, information instructing the network to utilize the designated network device as the IP anchor point for the application to permit the user device to utilize the designated network device as the IP anchor point for the application. \n | 7. The non-transitory computer-readable medium of claim 6, wherein the instructions further comprise: \none or more instructions that, when executed by the one or more processors, cause the one or more processors to: \nprovide, to the network, information instructing the network to utilize a third network device of the network as a control plane anchor point for the application of the user device. \n | 8. The non-transitory computer-readable medium of claim 6, wherein: \nthe first network device is a user plane function (UPF) device provided at an edge of the network, and \nthe second network device is a UPF device provided at a central location of the network. \n | 9. The non-transitory computer-readable medium of claim 6, wherein: \nthe low latency application includes one or more of: \nan autonomous driving application, \na real-time vehicle-to-vehicle (V2V) communication application, or \nan application that delivers video; and \nthe best effort application includes one or more of: \nan application to enable a web download, or \nan application to access the Internet. \n | 10. 
The non-transitory computer-readable medium of claim 6, wherein the instructions further comprise: \none or more instructions that, when executed by the one or more processors, cause the one or more processors to: \nreceive, from the user device, information indicating that a user stopped utilizing the application; and \nprovide, to the network, information instructing the network to stop utilizing the designated network device as the IP anchor point for the application. \n | 11. A method, comprising: \nreceiving, by a device, information indicating that a user device is utilizing an application associated with a network; \ndetermining, by the device, whether the application is a low latency application or a best effort application; \ndesignating, by the device, a first network device of the network or a second network device of the network as a designated network device to be an Internet protocol (IP) anchor point for the application based on a set of rules and based on determining whether the application is the low latency application or the best effort application, \nthe first network device being designated as the designated network device to be the IP anchor point for the application when the application is the low latency application, or \nthe second network device being designated as the designated network device to be the IP anchor point for the application when the application is the best effort application, \nwherein the set of rules includes two or more of: \na rule indicating that the IP anchor point is to be as close to the user device as possible, \na rule indicating that the IP anchor point is to include a threshold amount of processing resources and memory resources, \na rule indicating that the IP anchor point is to be associated with a serving base station, \na rule indicating a timing advance distance between the IP anchor point and the user device, or \na rule indicating an operational pathloss between the IP anchor point and the user 
device, and \ndesignating the first network device as the designated network device comprising: \napplying a weight to each rule, of the set of rules, to generate a weighted set of rules, \ndetermining scores for a plurality of first network devices based on the weighted set of rules, and \nselecting the first network device, from the plurality of first network devices, based on a score for the first network device being greater than scores associated with one or more other network devices from the scores for the plurality of first network devices; \nproviding, by the device and to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application; and \nproviding, by the device and to the network, information instructing the network to utilize the designated network device as the IP anchor point for the application to permit the user device to utilize the designated network device as the IP anchor point for the application. \n | 12. The method of claim 11, further comprising: \nproviding, to the network, information instructing the network to utilize a third network device of the network as a control plane anchor point for the application of the user device. \n | 13. The method of claim 11, wherein: \nthe first network device is a user plane function (UPF) device provided at an edge of the network, and \nthe second network device is a UPF device provided at a central location of the network. \n | 14. The method of claim 11, wherein: \nthe low latency application includes one or more of: \nan autonomous driving application, \na real-time vehicle-to-vehicle (V2V) communication application, or \nan application that delivers video; and \nthe best effort application includes one or more of: \nan application to enable a web download, or \nan application to access the Internet. \n | 15. 
The method of claim 11, further comprising: \nreceiving, from the user device, information indicating that the user stopped utilizing the application; and \nproviding, to the network, information instructing the network to stop utilizing the designated network device as the IP anchor point for the application. \n | 16. The device of claim 1, wherein the one or more processors, when determining whether the application is the low latency application or the best effort application, are to: \ndetermine that the application is the low latency application based on a maximum allowed latency for the application. \n | 17. The device of claim 1, wherein the one or more processors, when applying the weight to each rule of the set of rules, are to: \napply different weights to different rules based on one or more of: \ninformation associated with the user device, \ninformation associated with the application, or \ninformation associated with the network. \n | 18. The non-transitory computer-readable medium of claim 6, wherein the one or more instructions, that cause the one or more processors to determine whether the application is the low latency application or the best effort application, cause the one or more processors to: \ndetermine that the application is the low latency application based on a maximum allowed latency for the application. \n | 19. The non-transitory computer-readable medium of claim 6, wherein the one or more instructions, that cause the one or more processors to apply the weight to each rule of the set of rules, cause the one or more processors to: \napply different weights to different rules based on one or more of: \ninformation associated with the user device, \ninformation associated with the application, or \ninformation associated with the network. \n | 20. 
The method of claim 11, wherein determining whether the application is the low latency application or the best effort application comprises: \ndetermining that the application is the low latency application based on a maximum allowed latency for the application.", "output": "The device has one or more memories and one or more processors coupled to the memories to receive information (510) indicating that a user device is utilizing an application. Whether the application is a low latency application or a best effort application is determined (520). A first network device of the network or a second network device of the network is designated (530) as a designated network device to be an internet protocol (IP) anchor point for the application, based on determining whether the application is the low latency application or the best effort application. The information informing the user device that the designated network device is to be the IP anchor point for the application is provided (540) to the user device. The information instructing the network to utilize the designated network device as the IP anchor point for the application is provided (550) to the network, which permits the user device to utilize the designated network device as the IP anchor point for the application. INDEPENDENT CLAIMS are included for the following: a non-transitory computer-readable medium storing instructions for dynamically placing an IP anchor point based on a user device and an application; and a method for dynamically placing an IP anchor point based on a user device and an application. Device such as user equipment, mobile phone, e.g. smart phone and radiotelephone, laptop computer, tablet computer, desktop computer, handheld computer, gaming device, wearable communication device, e.g. smart watch and pair of smart glasses, mobile hotspot device, fixed wireless access device, or customer premises equipment. 
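The weight-and-score selection recited in the anchor-point claims above (apply a weight to each rule, score each candidate first network device, select the highest-scoring one) can be sketched as follows; the rule names and numeric values are hypothetical, not taken from the patent:

```python
# Hypothetical sketch of the claimed selection among candidate first
# network devices (edge UPFs): weight each rule, score each candidate,
# pick the highest. Rule names (R1-R3), weights and per-device raw
# scores below are illustrative values.
def select_anchor(candidates, weights):
    """candidates: {device_id: {rule_name: raw_score}};
    weights: {rule_name: weight}. Returns the best device_id."""
    def weighted_score(device_id):
        scores = candidates[device_id]
        return sum(w * scores.get(rule, 0.0) for rule, w in weights.items())
    return max(candidates, key=weighted_score)
```

Per claim 17, different weights could be derived from information about the user device, the application, or the network before calling `select_anchor`.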
The anchor point platform can apply a greater weight to rule R2 than rules R3-R5 since rule R2 can ensure that the IP anchor point is an edge user plane function (UPF) with sufficient resources to handle the low latency application. The different stages of the process for dynamically placing an IP anchor point based on a user device and an application are automated, which can remove human subjectivity and waste from the process, and which can improve the speed and efficiency of the process and conserve computing resources. The drawing shows a flow chart illustrating a process for dynamically placing an IP anchor point based on a user device and an application. 510 Step for receiving information; 520 Step for determining application; 530 Step for designating a first or second network device; 540 Step for providing information informing the user device; 550 Step for providing information instructing the network", "instruction": "Please summarize the input" }, { "input": "Systems and methods for transforming high-definition geographical map data into messages for vehicle communications. A device may receive three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle and may process the three-dimensional geographical map data, with a data model, to transform the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard. The device may generate a message based on the transformed geographical map data and may cause the message to be provided to the vehicle device. The device may perform one or more actions based on the message. What is claimed is: \n | 1. 
A method, comprising: \nreceiving, by a device, three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle; \nprocessing, by the device, the three-dimensional geographical map data, with a data model, to: \nanalyze different layers of the three-dimensional geographical map data, \nidentify a portion of the three-dimensional geographical map data based on analyzing the different layers of the three-dimensional geographical map data, and \ntransform the portion of the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard; \ngenerating, by the device, a message based on the transformed geographical map data; \ncausing, by the device, the message to be provided to the vehicle device; and \nperforming, by the device, one or more actions based on the message. \n | 2. The method of claim 1, wherein the particular standard includes a Society of Automotive Engineers J2735 standard.\n | 3. The method of claim 1, wherein causing the message to be provided to the vehicle device comprises one or more of: \ncausing the message to be provided to the vehicle device via a multi-access edge computing device associated with the vehicle device; \ncausing the message to be provided to the vehicle device via a registration representational state transfer application program interface; or \ncausing the message to be provided to the vehicle device via a cellular vehicle-to-everything message broker. \n | 4. The method of claim 1, wherein performing the one or more actions comprises: \nreceiving new three-dimensional geographical map data associated with the geographical region; \nupdating the message based on the new three-dimensional geographical map data and to generate an updated message; and \ncausing the updated message to be provided to the vehicle device. \n | 5. 
The method of claim 1, wherein performing the one or more actions comprises: \ngenerating an alert based on the message; and \nproviding the alert to the vehicle device. \n | 6. The method of claim 1, wherein performing the one or more actions comprises: \ndetermining a location of the vehicle based on the message; and \ncausing an emergency service to be dispatched to the location of the vehicle. \n | 7. The method of claim 1, wherein the transformed geographical map data includes map data identifying one or more of: \none or more traffic lanes associated with the geographical region; \none or more intersections associated with the geographical region; \none or more traffic signals associated with the geographical region; \none or more sidewalks associated with the geographical region; or \none or more pedestrian lanes associated with the geographical region. \n | 8. A device, comprising: \none or more processors configured to: \nreceive three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle; \nprocess the three-dimensional geographical map data, with a data model, to: \nanalyze different layers of the three-dimensional geographical map data, \nidentify a portion of the three-dimensional geographical map data based on analyzing the different layers of the three-dimensional geographical map data, and \ntransform the portion of the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard; \ngenerate a message based on the transformed geographical map data; \ncause the message to be provided to the vehicle device via a multi-access edge computing device associated with the vehicle device, via a registration representational state transfer application program interface, or via a cellular vehicle-to-everything message broker; and \nperform one or more actions based on the message. \n | 9. 
The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to: \nreceive feedback from the vehicle device based on the message; and \nretrain the data model based on the feedback. \n | 10. The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to one or more of: \ncause the vehicle device to determine an actual location of the vehicle based on the message; \ncause the vehicle device to provide an actual location of the vehicle to one or more other vehicles based on the message; \ncause the vehicle device to position the vehicle in a traffic lane based on the message; or \ncause traffic analytics for the geographical region to be generated based on the message. \n | 11. The device of claim 8, wherein the transformed geographical map data includes map data identifying one or more of: \none or more traffic lanes associated with the geographical region; \none or more intersections associated with the geographical region; \none or more traffic signals associated with the geographical region; \none or more sidewalks associated with the geographical region; or \none or more pedestrian lanes associated with the geographical region. \n | 12. The device of claim 8, wherein the vehicle includes one or more of: \nan autonomous robot, \na semi-autonomous vehicle, \nan autonomous vehicle, or \na non-autonomous vehicle. \n | 13. The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to: \ncalculate traffic analytics for the geographical region based on the message; and \nprovide the traffic analytics to an entity associated with managing traffic for the geographical region. \n | 14. 
The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to: \nreceive new three-dimensional geographical map data associated with the geographical region; and \nupdate the message based on the new three-dimensional geographical map data. \n | 15. A non-transitory computer-readable medium storing a set of instructions, the set of instructions comprising: \none or more instructions that, when executed by one or more processors of a device, cause the device to: \nreceive three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle; \nprocess the three-dimensional geographical map data, with a data model, to: \nanalyze different layers of the three-dimensional geographical map data, \nidentify a portion of the three-dimensional geographical map data based on analyzing the different layers of the three-dimensional geographical map data, and \ntransform the portion of the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard; \ngenerate a message based on the transformed geographical map data; \ncause the message to be provided to the vehicle device; and \nperform one or more actions based on the message. \n | 16. The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \nreceive new three-dimensional geographical map data associated with the geographical region; \nupdate the message based on the new three-dimensional geographical map data and to generate an updated message; \ncause the updated message to be provided to the vehicle device; and \nperform one or more additional actions based on the updated message. \n | 17. 
The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \ngenerate an alert based on the message; and \nprovide the alert to a vehicle located in the geographical region. \n | 18. The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \ndetermine a location of a vehicle in the geographical region based on the message; and \ncause an emergency service to be dispatched to the location of the vehicle. \n | 19. The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \ncalculate traffic analytics for the geographical region based on the message; and \nprovide the traffic analytics to an entity associated with managing traffic for the geographical region. \n | 20. The non-transitory computer-readable medium of claim 15, wherein the transformed geographical map data includes map data identifying one or more of: \none or more traffic lanes associated with the geographical region; \none or more intersections associated with the geographical region; \none or more traffic signals associated with the geographical region; \none or more sidewalks associated with the geographical region; or \none or more pedestrian lanes associated with the geographical region.", "output": "The method involves receiving three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle (410). The 3D map data is processed with a data model to transform the map data (420) into transformed map data with a format that corresponds to a particular standard e.g. Society of Automotive Engineers J2735 standard. A message is generated based on the transformed data (430). 
The message is caused to be provided to the vehicle device (440). A set of actions is performed by the device based on the message (450). INDEPENDENT CLAIMS are included for: (1) a device, comprising: one or more processors configured to: receive three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle; (2) a non-transitory computer-readable medium storing a set of instructions. Method for generating a message based on the transformed geographical map data. The method utilizes a vehicle-to-vehicle (V2V) communication system to allow a user to communicate with the vehicle efficiently, and enables the user to receive information from the vehicle and the vehicle to communicate information to the user in a reliable manner. The drawing shows a flow diagram of the method for generating a message based on the transformed geographical map data.410Receiving Three Dimensional Geographical Map Data for a Geographical Region 420Processing the Three Dimensional Geographical Map Data with a Data Model 430Generating a Message based on the transformed geographical map data 440Causing the message to be provided to the vehicle device 450Performing one or more actions based on the message 
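The claimed flow (receive layered 3D map data, extract a portion, transform it to a standard-format payload, wrap it in a message) can be sketched in a few lines. This is an illustrative sketch only: the layer names, field names, and the J2735-like payload shape below are assumptions, not taken from the patent or from the actual SAE J2735 encoding.

```python
# Hypothetical sketch of the claimed pipeline. The claims describe analyzing
# different layers of 3D map data, identifying a portion, and transforming it
# into a standard-format message; every name here is illustrative.

def transform_map_portion(map_layers, region):
    """Analyze the layers, keep only lane features for the given region, and
    flatten them into a J2735-like payload (assumed, simplified format)."""
    lanes = [f for f in map_layers.get("lanes", []) if f.get("region") == region]
    return {"msgType": "MAP", "region": region, "laneSet": lanes}

def generate_message(transformed):
    # Wrap the transformed map data in a simple message envelope.
    return {"header": {"format": "J2735-like"}, "payload": transformed}

layers = {
    "lanes": [{"region": "R1", "laneId": 1}, {"region": "R2", "laneId": 2}],
    "signals": [{"region": "R1", "signalId": 9}],
}
message = generate_message(transform_map_portion(layers, "R1"))
```

In this sketch only the lane layer for region "R1" survives the transform; a real implementation would also carry intersections, signals, sidewalks, and pedestrian lanes, as claim 11 lists.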
The vehicle may be a part of a fleet of vehicles which can communicate with each other (V2V) and a central hub. The hub may be cloud based. Viewed in plan, the battery of the vehicle may occupy at least 60% of a rectangle having apices at the hubs of the wheels. |\nCLAIMS \n\n1. A seatless road vehicle having an autonomous mode of operation, the vehicle including a wireless receiver for receiving data, a battery arranged to power the vehicle and a charger arranged to charge a battery of another, electrically powered, vehicle. \n | \n\n2. A vehicle according to claim 1, including an electronic illuminated display for providing instructions or information to drivers of other vehicles. \n | \n\n3. A vehicle according to claim 2, wherein the illuminated display is arranged on a rear surface and/or a front surface of the vehicle. \n | \n\n4. A vehicle according to claim 1, 2 or 3, including monitoring means for monitoring traffic conditions, road conditions and/or environmental conditions, and a transmitter for transmitting information gathered by the monitoring means. \n | \n\n5. A vehicle according to any preceding claim, including four wheels, wherein, viewed in plan, the battery occupies at least 60% of a rectangle having apices at hubs of the wheels. \n | \n\n6. A system comprising a fleet of vehicles, each according to any preceding claim, wherein each vehicle is arranged to communicate data including at least a location of the vehicle to at least one other vehicle in the fleet. \n | \n\n7. A system according to claim 6, including a control station arranged to coordinate control of the vehicles. \n | \n\n8. A system according to claim 7, wherein the control station is cloud based.", "output": "The seatless road vehicle comprises a wireless receiver for receiving data, where a battery (6) is arranged to power the vehicle, and a charger is arranged to charge a battery of another electrically powered vehicle. 
An electronic illuminated display is used for providing instructions or information to drivers of other vehicles. The illuminated display is arranged on a rear surface and/or a front surface of the vehicle. A monitoring unit is provided for monitoring traffic conditions, road conditions or environmental conditions. A transmitter is provided for transmitting information gathered by the monitoring unit. An INDEPENDENT CLAIM is included for a system comprising a fleet of vehicles. Seatless autonomous road vehicle for use with autonomous vehicle system (claimed). The seatless road vehicle has an autonomous mode of operation, can rescue an electric vehicle with a flat battery, provides instructions or information to drivers of other vehicles via its electronic illuminated display, and reduces traffic congestion on highways. The drawing shows a schematic view of a battery and a motor of the vehicle.4Wheels 6Battery 8Motor 
Each of the plurality of radio managers is configured to manage a respective one of the plurality of radios for communication with another electronic device, using radio configuration information received from the Wi-Fi radio manager. The context information includes both mobility information about the mobile vehicle associated with the context broker and needs and/or requirements associated with an application or service; managing the plurality of radio managers includes configuring at least one radio based on both the mobility information and the needs and/or requirements associated with the application or service.\n | 2. The system of claim 1, wherein the context broker, the Wi-Fi radio manager, the plurality of radio managers, and the plurality of radios are located in the mobile vehicle.\n | 3. The system of claim 2, wherein the context information includes at least one of: a position of the mobile vehicle; a speed of the mobile vehicle; a direction of travel of the mobile vehicle; processing capabilities of the mobile vehicle; and resources for at least one vehicle external to the mobile vehicle.\n | 4. The system of claim 2, wherein the context information includes infrastructure information about one or more infrastructures.\n | 5. The system of claim 4, wherein the infrastructure information includes information about one or more of a neighboring access point (AP), a current path of the mobile vehicle, and neighboring vehicles.\n | 6. The system of claim 2, wherein at least one of the plurality of radios is configured to connect an electronic device in the mobile vehicle to a network outside of the mobile vehicle.\n | 7. 
The system of claim 2, wherein the Wi-Fi radio manager is configured to power on one of the plurality of radios based on service and application needs of the mobile vehicle.\n | 8. The system of claim 2, wherein the Wi-Fi radio manager is configured to determine whether to power on a particular radio that has been powered off, based on a context trigger that requires the use of one or more Wi-Fi radios among the plurality of radios, wherein the context trigger is based on context information from an interior of the mobile vehicle, a vicinity of the mobile vehicle, one or more APs, or a cloud server.\n | 9. The system of claim 8, wherein, when the particular radio is powered on, the particular radio is set, based on the context information about the particular radio, to a vehicle-to-infrastructure (V2I) connection mode, a vehicle-to-vehicle (V2V) connection mode, a V2I scanning mode, a V2V scanning mode, or an access point (AP) mode.\n | 10. The system of claim 9, wherein the system is configured to use at least one threshold value to determine when the Wi-Fi radio manager changes the configuration of the particular radio.\n | 11. The system of claim 8, wherein respective weights are applied to the context information from each of the mobile vehicle, the vicinity of the mobile vehicle, the one or more APs, and the cloud server.\n | 12. The system of claim 1, wherein at least a portion of the context information is received from a cloud server.\n | 13. The system of claim 1, wherein the Wi-Fi radio manager is configured to power on or power off a particular one of the plurality of radios.\n | 14. 
A communication method, comprising: collecting, by a context broker, context information to be used in managing a plurality of radios; determining, by a Wi-Fi radio manager configured to manage a plurality of radio managers and using the context information from the context broker, how a particular one of the plurality of radios should be configured; providing, by the Wi-Fi radio manager, the configuration of the particular radio to a radio manager based on the context information from the context broker; and configuring, using radio configuration information received from the Wi-Fi radio manager, the particular radio for communication with another electronic device, wherein the context information includes both mobility information about a mobile vehicle associated with the context broker and needs and/or requirements associated with an application or service, and wherein managing the plurality of radio managers includes configuring at least one radio based on both the mobility information and the needs and/or requirements associated with the application or service.\n | 15. The method of claim 14, wherein the configuring takes into account one or more of a signal power, a received signal strength indication (RSSI), an interference, a channel, and a frequency.\n | 16. The method of claim 14, wherein the context broker, the Wi-Fi radio manager, the plurality of radio managers, and the plurality of radios are located in the mobile vehicle.\n | 17. The method of claim 16, wherein the context information includes one or more of: a position of the mobile vehicle; a speed of the mobile vehicle; a direction of travel of the mobile vehicle; processing capabilities of the mobile vehicle; and resources for at least one vehicle external to the mobile vehicle.\n | 18. 
The method of claim 16, wherein the context information includes infrastructure information about one or more infrastructures.\n | 19. The method of claim 16, wherein the Wi-Fi radio manager is configured to determine whether to power on a radio based on a context trigger that requires the use of one or more Wi-Fi radios among the plurality of radios, wherein the context trigger is based on context information from an interior of the mobile vehicle, a vicinity of the mobile vehicle, one or more APs, or a cloud server.\n | 20. The method of claim 16, wherein respective weights are applied to the context information from each of the mobile vehicle, the vicinity of the mobile vehicle, the one or more APs, and the cloud server.
Uses include but are not limited to bus, truck, forklift, human-operated vehicle, autonomous and/or remote controlled vehicles, boat, ship, speedboat, tugboat, barge, submarine, drone, airplane, and satellite, used in port, harbor, airport, factory, plantation, and mine. The communication network allows a port operator to improve the coordination of the ship loading processes and increase the throughput of the harbor by gathering real-time information on the position, speed, fuel consumption and carbon dioxide emissions of the vehicles. The communication network can operate in multiple modalities comprising various fixed nodes and mobile nodes, provide connectivity in dead zones or zones with difficult access, and reduce the costs for maintenance and accessing the equipment for updating/upgrading. The overall cost per distance, time and vehicle/fleet is reduced. Data from the vehicle is offloaded in a faster and/or cheaper manner and the overall quality experienced per application, service, or user is increased. The vehicle increases the data offloaded and reduces the costs or time of sending data over expensive or slow technologies. The time to first byte (TTFB) until the next available WiFi network is detected is reduced. The drawing shows a block diagram of the communication devices in a vehicle. 700Mobile vehicle702,704,710,712,714Communication devices
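The decision the claimed Wi-Fi radio manager makes — pick a mode (V2I, V2V, scan, or power off) for a radio from context information such as mobility data and application needs (claims 7–10) — can be sketched as a small rule function. The field names, the threshold value, and the specific rules are assumptions for illustration; the claims only say that context information and at least one threshold drive the configuration.

```python
# Hypothetical sketch of the claimed context-driven radio configuration.
# SPEED_THRESHOLD_KMH stands in for the "at least one threshold value" of
# claim 10; its value and all context field names are assumed, not claimed.

SPEED_THRESHOLD_KMH = 60

def configure_radio(context):
    """Return a mode for one radio given a context dict (illustrative)."""
    if not context.get("app_requires_wifi", False):
        return "power_off"     # no application/service need -> save power
    if context.get("ap_in_range", False):
        return "V2I"           # infrastructure available -> connect to AP
    if context.get("speed_kmh", 0) < SPEED_THRESHOLD_KMH and context.get("neighbor_vehicles", 0) > 0:
        return "V2V"           # slow and surrounded by peers -> vehicle-to-vehicle
    return "V2I_scan"          # otherwise keep scanning for an AP

mode = configure_radio({"app_requires_wifi": True, "speed_kmh": 30, "neighbor_vehicles": 2})
```

With the sample context above the sketch selects V2V mode; a vehicle moving faster than the threshold with no AP in range would fall through to V2I scanning.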
As a non-limiting example, various aspects of this disclosure provide communication network architectures, systems, and methods for supporting a dynamically configurable communication network comprising a complex array of both static and moving communication nodes (e.g., the Internet of moving things). For example, systems and method for vehicular positioning based on wireless fingerprinting data in a network of moving things including, for example, autonomous vehicles.What is claimed is: \n | 1. A method of vehicular positioning of nodes of a radio frequency (RF) wireless network comprising a plurality of nodes disposed at respective fixed locations and a plurality of mobile nodes that reside in respective vehicles that move within a service area of the wireless network, and wherein each node of the plurality of nodes comprises one or more communication interfaces configured for scanning an RF wireless environment of the respective node, the method comprising: \nperiodically receiving respective wireless fingerprint sample data generated by each mobile node of the plurality of mobile nodes, the wireless fingerprint sample data comprising data elements characterizing RF signals received by the mobile node from RF signal sources during scanning of the RF wireless environment of the mobile node and a corresponding geographic location within the service area at which the RF signals were received; \nforming a collection of the wireless fingerprint sample data received from the plurality of mobile nodes; \nreceiving a request for an estimated geographic location of a particular mobile node of the plurality of mobile nodes; \nsearching the collection using a wireless snapshot comprising data elements characterizing RF signals received in a current RF wireless environment of the particular mobile node, to identify wireless fingerprint samples of the collection that match the data elements of the wireless snapshot; and \ncalculating an estimated location of the particular 
mobile node using the identified wireless fingerprint sample data. \n | 2. The method according to claim 1, wherein each mobile node of the plurality of mobile nodes comprises a wireless access point configured to provide wireless Internet access to end-user devices.\n | 3. The method according to claim 1, wherein each node of the plurality of nodes periodically wirelessly broadcasts its current geographic location to other nodes of the network.\n | 4. The method according to claim 1, wherein the scanning of RF signals within the service area of the wireless network is without regard to a route of travel of a vehicle in which the mobile node resides.\n | 5. The method according to claim 1, the method further comprising: \nadding the wireless snapshot and a respective estimated location of the particular mobile node to the collection as a wireless fingerprint sample, if the search fails to identify at least one wireless fingerprint sample that matches the wireless snapshot. \n | 6. The method according to claim 1, wherein the collection is indexed according to one or more of the data elements of each wireless fingerprint sample that characterize a signal source.\n | 7. The method according to claim 1, wherein the one or more communication interfaces are configured to scan and characterize RF signal sources comprising an RF signal of an IEEE 802.11p compliant vehicle to vehicle wireless communication standard and an RF signal compliant with a commercial cellular communication standard.\n | 8. 
A non-transitory computer-readable medium on which is stored instructions executable by one or more processors, the executable instructions causing the one or more processors to perform a method of vehicular positioning of nodes of a radio frequency (RF) wireless network comprising a plurality of nodes disposed at respective fixed locations and a plurality of mobile nodes that reside in respective vehicles that move within a service area of the wireless network, and wherein each node of the plurality of nodes comprises one or more communication interfaces configured for scanning an RF wireless environment of the respective node, the method comprising: \nperiodically receiving respective wireless fingerprint sample data generated by each mobile node of the plurality of mobile nodes, the wireless fingerprint sample data comprising data elements characterizing RF signals received by the mobile node from RF signal sources during scanning of the RF wireless environment of the mobile node and a corresponding geographic location within the service area at which the RF signals were received; \nforming a collection of the wireless fingerprint sample data received from the plurality of mobile nodes; \nreceiving a request for an estimated geographic location of a particular mobile node of the plurality of mobile nodes; \nsearching the collection using a wireless snapshot comprising data elements characterizing RF signals received in a current RF wireless environment of the particular mobile node, to identify wireless fingerprint samples of the collection that match the data elements of the wireless snapshot; and \ncalculating an estimated location of the particular mobile node using the identified wireless fingerprint sample data. \n | 9. The non-transitory computer-readable medium according to claim 8, wherein each mobile node of the plurality of mobile nodes comprises a wireless access point configured to provide wireless Internet access to end-user devices.\n | 10. 
The non-transitory computer-readable medium according to claim 8, wherein each node of the plurality of nodes periodically wirelessly broadcasts its current geographic location to other nodes of the network.\n | 11. The non-transitory computer-readable medium according to claim 8, wherein the scanning of RF signals within the service area of the wireless network is without regard to a route of travel of a vehicle in which the mobile node resides.\n | 12. The non-transitory computer-readable medium according to claim 8, the method further comprising: \nadding the wireless snapshot and a respective estimated location of the particular mobile node to the collection as a wireless fingerprint sample, if the search fails to identify at least one wireless fingerprint sample that matches the wireless snapshot. \n | 13. The non-transitory computer-readable medium according to claim 8, wherein the collection is indexed according to one or more of the data elements of each wireless fingerprint sample that characterize a signal source.\n | 14. The non-transitory computer-readable medium according to claim 8, wherein the one or more communication interfaces are configured to scan and characterize RF signal sources comprising an RF signal of an IEEE 802.11p compliant vehicle to vehicle wireless communication standard and an RF signal compliant with a commercial cellular communication standard.\n | 15. 
A system for vehicular positioning of nodes of a radio frequency (RF) wireless network comprising a plurality of nodes disposed at respective fixed locations and a plurality of mobile nodes that reside in respective vehicles that move within a service area of the wireless network, and wherein each node of the plurality of nodes comprises one or more communication interfaces configured for scanning an RF wireless environment of the respective node, the system comprising: \none or more processors operably coupled to storage and communicatively coupled to the plurality of nodes, the one or more processors operable to, at least: \nperiodically receive respective wireless fingerprint sample data generated by each mobile node of the plurality of mobile nodes, the wireless fingerprint sample data comprising data elements characterizing RF signals received by the mobile node from RF signal sources during scanning of the RF wireless environment of the mobile node and a corresponding geographic location within the service area at which the RF signals were received; \nform a collection of the wireless fingerprint sample data received from the plurality of mobile nodes; \nreceive a request for an estimated geographic location of a particular mobile node of the plurality of mobile nodes; \nsearch the collection using a wireless snapshot comprising data elements characterizing RF signals received in a current RF wireless environment of the particular mobile node, to identify wireless fingerprint samples of the collection that match the data elements of the wireless snapshot; and \ncalculate an estimated location of the particular mobile node using the identified wireless fingerprint sample data. \n | 16. The system according to claim 15, wherein each mobile node of the plurality of mobile nodes comprises a wireless access point configured to provide wireless Internet access to end-user devices.\n | 17. 
The system according to claim 15, wherein each node of the plurality of nodes periodically wirelessly broadcasts its current geographic location to other nodes of the network.\n | 18. The system according to claim 15, wherein the scanning of RF signals within the service area of the wireless network is without regard to a route of travel of a vehicle in which the mobile node resides.\n | 19. The system according to claim 15, wherein the one or more processors are further operable to: \nadd the wireless snapshot and a respective estimated location of the particular mobile node to the collection as a wireless fingerprint sample, if the search fails to identify at least one wireless fingerprint sample that matches the wireless snapshot. \n | 20. The system according to claim 15, wherein the collection is indexed according to one or more of the data elements of each wireless fingerprint sample that characterize a signal source.\n | 21. The system according to claim 15, wherein the one or more communication interfaces are configured to scan and characterize RF signal sources comprising an RF signal of an IEEE 802.11p compliant vehicle to vehicle wireless communication standard and an RF signal compliant with a commercial cellular communication standard.", "output": "The method involves receiving a request for an estimated geographic location of a particular mobile node of a set of mobile nodes. Collection of wireless fingerprint sample data is searched using a wireless snapshot that comprises data elements characterizing radio frequency (RF) signals received in a current RF wireless environment of the mobile node, to identify wireless fingerprint samples of the collection that match data elements of the wireless snapshot. An estimated location of the particular mobile node is calculated using identified wireless fingerprint sample data. 
INDEPENDENT CLAIMS are also included for the following:a non-transitory computer-readable medium comprising a set of instructions for vehicular positioning of nodes of an RF wireless networka system for vehicular positioning of nodes of an RF wireless network. Method for vehicular positioning of nodes i.e. internet of things nodes, of an RF wireless network e.g. city-wide vehicular network, shipping port-sized vehicular network and campus-wide vehicular network, associated with vehicles. Uses include but are not limited to a smartphone, tablet, smart watch, laptop computer, webcam, personal gaming device, personal navigation device, personal media device, personal camera and a health-monitoring device associated with automobiles, buses, lorries, boats, forklifts, human-operated vehicles and autonomous and/or remote controlled vehicles. The method enables the platform to be flexibly optimized at design/installation time and/or in real-time for different purposes so as to reduce latency, increase throughput, reduce power consumption and increase reliability with regard to failures based on the content, service or data. The method enables utilizing multiple connections or pathways that exist between distinct sub-systems or elements within the same sub-system to increase robustness and/or load-balancing of the network. The method enables gathering real-time information on position, speed, fuel consumption and carbon dioxide emissions of the vehicles, so that the communication network allows a port operator to improve the coordination of ship loading processes, increase throughput of the harbor and enhance performance of the positioning systems. The communication interfaces scan and characterize the RF signal sources with IEEE 802.11p compliant RF signals. The drawing shows a schematic block diagram of a communication network. 
400Communication network", "instruction": "Please summarize the input" }, { "input": "System and method for telematics for tracking equipment usageSystems and methods are described for tracking information of an equipment including a telematics device configured to receive data from the equipment to determine a telematics information. The telematics information includes at least two of an equipment type, a location, a duration in the location, and miles travelled. A transmission device is configured to transmit the vehicle telematics information to at least one of a third party entity device, a government device and a mobile device.We claim: \n | 1. A system for tracking local information of an equipment on a vehicle, comprising: \na telematics device in the vehicle configured to receive at variable data sampling rate, raw data of vehicle telematics information comprising two or more of: energy usage, rate of energy consumption, equipment type, vehicle owner's information, a vehicle location, a duration of vehicle in the location, parking and moving violation, vehicle fines, distance travelled on the vehicle, and weight and size of equipment; and \na transmission device configured to compress the raw data of the vehicle telematics information and directly transmit through a network, the compressed raw data of the vehicle telematics information to at least one of a third party entity device, a government device and a mobile device to determine a usage charge based on the vehicle telematics information, and wherein the telematics device is configured to receive one or a combination of: public emergency alert announcement, captured images and associated data for matching to an object of interest in the public emergency alert announcement, optical sensors data, on-board laser and sonar pulsed sensor and imaging camera data to render the captured images and associated data for remote analysis by the at least one of the third party entity device, the government device 
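The core of the fingerprinting claims — search the collection for samples whose RF-source data match a snapshot, then calculate an estimated location from the matches — can be sketched as follows. The data layout (per-source RSSI dicts plus a location per sample), the per-source RSSI tolerance used as the matching rule, and the averaging of matched locations are all assumptions; the claims do not specify how matching or the location calculation is done.

```python
# Minimal sketch of the claimed fingerprint positioning, under assumed data
# layouts and an assumed matching rule (each snapshot source seen in the
# sample with RSSI within a tolerance).

def matches(sample, snapshot, max_rssi_delta=6):
    """True if every RF source in the snapshot also appears in the sample
    with a similar RSSI (illustrative rule, tolerance in dB)."""
    return all(
        src in sample["sources"]
        and abs(sample["sources"][src] - rssi) <= max_rssi_delta
        for src, rssi in snapshot.items()
    )

def estimate_location(collection, snapshot):
    hits = [s["location"] for s in collection if matches(s, snapshot)]
    if not hits:
        return None  # per claims 5/12/19, the snapshot could then be added as a new sample
    lat = sum(p[0] for p in hits) / len(hits)
    lon = sum(p[1] for p in hits) / len(hits)
    return (lat, lon)

collection = [
    {"sources": {"ap1": -40, "bs7": -70}, "location": (41.15, -8.61)},
    {"sources": {"ap1": -42, "bs7": -68}, "location": (41.16, -8.62)},
    {"sources": {"ap9": -50}, "location": (41.20, -8.70)},
]
loc = estimate_location(collection, {"ap1": -41, "bs7": -69})
```

Here the snapshot matches the first two samples, so the estimate is the mean of their locations; a production system would index the collection by signal source (claims 6/13/20) instead of scanning it linearly.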
and the mobile device. \n | 2. The system of claim 1, wherein the energy usage comprises total energy consumed by one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline, and the rate of energy consumption comprises per unit time measurement of one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline.\n | 3. The system of claim 1, wherein the vehicle comprises one of: transportation vehicles, recreation vehicles, industrial or home equipment, autonomous vehicles, flying vehicles.\n | 4. The system of claim 3, wherein the transportation vehicles comprise anyone of: a hybrid vehicle, an electric powered vehicle, a rental or a leased vehicle, a fleet managed vehicle, a car, a bus, a truck, wherein the recreation vehicles comprise anyone of: an all-terrain vehicle (ATV), an off-road vehicle, a drone, a boat, and the industrial/home equipment comprise anyone of: a power generator, a mining equipment, an agriculture equipment, a construction equipment.\n | 5. The system of claim 3, wherein for autonomous self-driving vehicles, vehicle telematics information may be communicated to an infrastructure network communication on distance driven in autonomous mode to levy a usage tax on vehicle to infrastructure; and for flying cars, a flight tax may be levied per trip and based on amount of fuel consumed and distance flown.\n | 6. The system of claim 1, wherein the telematics device, the government device and the mobile device associates a credit card, a debit card bank account, or through connection with a mobile device.\n | 7. The system of claim 1, wherein the government device charges vehicle owner based on received vehicle telematics information, comprising: usage charges, parking metering, moving violations, vehicle fines, state lines, specified highways, crossing determined bridges and car sharing charges.\n | 8. 
The system of claim 1, where the vehicle telematics information further includes information of a duration the equipment spends in determined geo-fenced locations.\n | 9. The system of claim 8, further including an electronic control unit configured to restrict a use of a fuel source or switch to an alternate fuel source based on a determined geo-fenced area.\n | 10. A method for tracking local information of an equipment in a vehicle, comprising: \nreceiving by a server, compressed raw data of vehicle telematics information which are compressed before being transmitted from a transmission device of a vehicle, the raw data of vehicle telematics information indicates energy and equipment use in the vehicle over a period of time, wherein the raw data of vehicle telematics information are received at variable data sampling rate by a telematics device, and the raw data of vehicle telematics information includes two or more of: energy usage, rate of energy consumption, equipment type, vehicle owner's information, a vehicle location, a duration of vehicle in the location, parking and moving violation, vehicle fines, distance travelled on the vehicle, and weight and size of equipment; and \nprocessing the raw data of the vehicle telematics information to determine a usage charge or a tax; and \ndirectly transmitting through a network, the usage charge or the tax to at least one of a third party entity device, a government device and a mobile device in order to determine a usage charge based on the vehicle telematics information, and wherein the telematics device is configured to receive one or a combination of: public emergency alert announcement, captured images and associated data for matching to an object of interest in the public emergency alert announcement, optical sensors data, on-board laser and sonar pulsed sensor and imaging camera data to render the captured images and associated data for remote analysis by the at least one of the third party entity device, the 
government device and the mobile device. \n | 11. The method of claim 10, wherein the energy usage comprises total energy consumed by one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline, and the rate of energy consumption comprises per unit time measurement of one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline.\n | 12. The method of claim 10, wherein the vehicle comprises one of: transportation vehicles, recreation vehicles, industrial or home equipment, autonomous vehicles, flying vehicles.\n | 13. The method of claim 12, wherein the transportation vehicles comprise any one of: a hybrid vehicle, an electric powered vehicle, a rental or a leased vehicle, a fleet managed vehicle, a car, a bus, a truck, wherein the recreation vehicles comprise any one of: an all-terrain vehicle (ATV), an off-road vehicle, a drone, a boat, and the industrial/home equipment comprise any one of: a power generator, a mining equipment, an agriculture equipment, a construction equipment.\n | 14. The method of claim 12, wherein for autonomous self-driving vehicles, vehicle telematics information may be communicated over an infrastructure network, reporting distance driven in autonomous mode, to levy a vehicle-to-infrastructure usage tax; and for flying cars, a flight tax may be levied per trip and based on amount of fuel consumed and distance flown.\n | 15. The method of claim 10, wherein the telematics device, the government device and the mobile device associate a credit card, a debit card, or a bank account, or connect through a mobile device.\n | 16. 
The method of claim 10, wherein the government device charges the vehicle owner based on received vehicle telematics information, comprising: usage charges, parking metering, moving violations, vehicle fines, state or federal taxes, state lines, specified highways, crossing determined bridges and car sharing charges.\n | 17. The method of claim 10, where the telematics information further includes information of a duration the equipment spends in determined geo-fenced locations.\n | 18. The method of claim 10, further comprising restricting by an electronic control unit, a use of the fuel source or switching to an alternate fuel source based on a determined geo-fenced area.", "output": "The system comprises a telematics device (114) in the vehicle configured to receive raw data of vehicle telematics information at a variable data sampling rate. A transmission device (115) is configured to compress the raw data of the vehicle telematics information and directly transmit it through a network. The compressed raw data of the vehicle telematics information is sent to at least one of a third party entity device (104), a government device (150) and a mobile device (160) to determine a usage charge based on the vehicle telematics information. An INDEPENDENT CLAIM is included for a method for tracking local information of an equipment in a vehicle. System for tracking local information of an equipment on a vehicle. Minimizes cost of data transmission to the entity devices and/or other remote data locations. The drawing shows a block representation of an environment for tracking information. 
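Claims 9 and 18 above describe an electronic control unit that restricts a fuel source, or switches to an alternate one, when the vehicle is inside a determined geo-fenced area. A minimal sketch of that logic is below; the circular-fence model, the function names, and the "electric"/"primary" source labels are illustrative assumptions, not the patent's implementation.

```python
import math

def inside_geofence(lat, lon, fence_lat, fence_lon, radius_m):
    """Approximate point-in-circle test using an equirectangular projection."""
    earth_r = 6371000.0  # mean Earth radius in metres
    dlat = math.radians(lat - fence_lat)
    dlon = math.radians(lon - fence_lon) * math.cos(math.radians(fence_lat))
    return math.hypot(dlat, dlon) * earth_r <= radius_m

def select_fuel_source(lat, lon, fences):
    """Switch to the alternate source inside any restricted zone, else keep the primary."""
    for f in fences:
        if inside_geofence(lat, lon, f["lat"], f["lon"], f["radius_m"]):
            return "electric"   # hypothetical alternate fuel source
    return "primary"

# Hypothetical low-emission zone around central London
zones = [{"lat": 51.5074, "lon": -0.1278, "radius_m": 5000}]
print(select_fuel_source(51.51, -0.13, zones))  # inside the zone -> electric
print(select_fuel_source(48.85, 2.35, zones))   # far outside -> primary
```

In practice the telematics device would also log the duration spent inside each zone (claim 17) for the usage-charge computation.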
104Third party entity device114Telematics device115Transmission device150Government device160Mobile device", "instruction": "Please summarize the input" }, { "input": "VEHICLE SYSTEM OF A VEHICLE FOR DETECTING AND VALIDATING AN EVENT USING A DEEP LEARNING MODELThe invention relates to a vehicle system (1) of a vehicle (2) configured to detect an event (E) and to broadcast said event (E) using a decentralized environmental notification message (DENM), wherein said vehicle system (1) comprises: \n - at least one camera sensor (10) configured to capture images (I1) of an environment of said vehicle (2), \n - an electronic control unit (11) configured to : \n - detect an event (E) using a primary deep learning model (M1) based on said images (I1), \n - apply a predictability level (A) to said event (E), said predictability level (A) being generated by said primary deep learning model (M1), \n - transmit said event (E) to a telematic control unit (12) if its predictability level (A) is above a defined level (L1), \n\n - said telematic control unit (12) configured to : \n - receive said event (E) from said electronic control unit (11) and broadcast a related decentralized environmental notification message (DENM) via a vehicle to vehicle communication (V2V) and/or a vehicle to infrastructure communication (V2I), \n - transmit at least one image (I1) and data details (D) of said event (E) to a server (3), \n - receive a primary validation information (30) of said event (E) from said server (3), said primary validation information (30) being generated by a secondary deep learning model (M2), and cancel the broadcasting of said decentralized environmental notification message (DENM) if said event (E) is not validated, \n - if said event (E) is validated, receive an updated instance (M3) of said primary deep learning model (M1) from said server (3) and transmit it to said primary electronic control unit (11) for updating said primary deep learning model (M1). \n\n|1. 
A vehicle system (1) of a vehicle (2) configured to detect an external event (E) and to broadcast said event (E) using a decentralized environmental notification message (DENM), wherein said vehicle system (1) comprises: \n * - at least one camera sensor (10) configured to capture images (I1) of an environment of said vehicle (2),\n * - an electronic control unit (11) configured to : \n * - detect an event (E) using a primary deep learning model (M1) based on said images (I1),\n * - determine a predictability level (A) of said event (E), said predictability level (A) being generated by said primary deep learning model (M1) for categorizing the different events (E), said events comprising accidents, road-block, animals on the road or on the pavement, pedestrians on the road or on the pavement, obstacles on the road or on the pavement and ambulance vehicles on the road,\n * - transmit said event (E) to a telematic control unit (12) of said vehicle system (1) if its predictability level (A) is above a defined level (L1),\n * - said telematic control unit (12) configured to : \n * - receive said event (E) from said electronic control unit (11) and broadcast a related decentralized environmental notification message (DENM) via a vehicle to vehicle communication (V2V) to other vehicles (6) in the environment of the vehicle (2) and/or via a vehicle to infrastructure communication (V2I), to infrastructures (7) in the environment of the vehicle (2),\n * - transmit at least one image (I1) and data details (D) of said event (E) to a server (3), said data details (D) of said event (E) comprising a label (LB) of said event (E), a location (LO) of said event (E), a timestamp (TI) of said event (E), and the predictability level (A) of said event (E),\n * - receive a primary validation information (30) of said event (E) from said server (3), said primary validation information (30) being generated by a secondary deep learning model (M2), and cancel the broadcasting of said 
decentralized environmental notification message (DENM) if said event (E) is not validated,\n * - if said event (E) is validated, receive an updated instance (M3) of said primary deep learning model (M1) from said server (3) and transmit it to said electronic control unit (11) for updating said primary deep learning model (M1).\n | 2. A vehicle system (1) according to claim 1, wherein said electronic control unit (11) is configured to update said primary deep learning model (M1) with said updated instance (M3).\n | 3. A vehicle system (1) according to claim 1 or claim 2, wherein said telematic control unit (12) is further configured to: \n * - broadcast a periodic cooperative awareness message (CAM) based on said images (I1) for stating the road conditions (R1) where said vehicle (2) is,\n * - receive a secondary validation information (31) of said road conditions (R1) from said server (3), said secondary validation information (31) being generated by said secondary deep learning model (M2),\n * - if said road conditions (R1) are validated, receive an updated instance (M3) of said primary deep learning model (M1) from said server (3) and transmit it to said electronic control unit (11) for update.\n | 4. A vehicle system (1) according to any of the preceding claims, wherein if the predictability level (A) of said event (E) is between a threshold (Th1) and the defined level (L1), the threshold (Th1) being below the defined level (L1), the electronic control unit (11) is further configured to transmit a control signal (11a) to a human machine interface (20) of said vehicle (2) in order to have a confirmation of the predictability level (A) of said event (E).\n | 5. A vehicle system (1) according to any one of the preceding claims, wherein said event (E) is an accident, a road-block, an animal, a pedestrian, an obstacle, or an ambulance vehicle.\n | 6. A vehicle system (1) according to any of the preceding claims, wherein said primary deep learning model (M1) is associated to a geographical location (L3).\n | 7. 
A vehicle system (1) according to the preceding claim, wherein said vehicle system (1) comprises a plurality of primary deep learning models (M1) associated to different geographical locations (L3).\n | 8. A vehicle system (1) according to any of the preceding claims, wherein said at least one camera sensor (10) is a front camera.\n | 9. A vehicle system (1) according to any of the preceding claims, wherein said primary deep learning model (M1) and said secondary deep learning model (M2) are convolutional neural network (CNN) based.\n | 10. A vehicle system (1) according to any one of the preceding claims, wherein said vehicle (2) is an autonomous vehicle.\n | 11. A vehicle system (1) according to any one of the preceding claims, wherein if said electronic control unit (11) fails to detect an event (E), said telematic control unit (12) is further configured to send the images (I1) captured by said at least one camera sensor (10) to said server (3).\n | 12. A method (4) comprising: \n * - a capture (E1) by at least one camera sensor (10) of a vehicle system (1) of a vehicle (2), of images (I1) of the environment of said vehicle (2),\n * - a detection (E2) by an electronic control unit (11) of said vehicle system (1) of an external event (E) using a primary deep learning model (M1) based on said images (I1),\n * - a determining (E3) by said electronic control unit (11) of a predictability level (A) of said event (E), said predictability level (A) being generated by said primary deep learning model (M1) for categorizing the different events (E), said events comprising accidents, road-block, animals on the road or on the pavement, pedestrians on the road or on the pavement, obstacles on the road or on the pavement and ambulance vehicles on the road,\n * - a transmission (E4) by said electronic control unit (11) of said event (E) to a telematic control unit (12) of said vehicle system (1) if its predictability level (A) is above a defined level (L1),\n * - the 
reception (E5) by said telematic control unit (12) of said event (E),\n * - the broadcasting (E6) by said telematic control unit (12) of a decentralized environmental notification message (DENM) related to said event (E) via a vehicle to vehicle communication (V2V) to other vehicles (6) in the environment of the vehicle (2) and/or via a vehicle to infrastructure communication (V2I), to infrastructures (7) in the environment of the vehicle (2)\n * - the transmission (E7) by said telematic control unit (12) of at least one image (I1) and of data details (D) of said event (E) to a server (3), said data details (D) of said event (E) comprising a label (LB) of said event (E), a location (LO) of said event (E), a timestamp (TI) of said event (E), and the predictability level (A) of said event (E),\n * - the reception (E8) by said telematic control unit (12) of a primary validation information (30) from said server (3), said primary validation information (30) being generated by a secondary deep learning model (M2), and the cancellation (E9) by said telematic control unit (12) of said broadcasting if said event (E) is not validated,\n * - if said event (E) is validated, the reception (E10) by said telematic control unit (12) of an updated instance (M3) of said primary deep learning model (M1) from said server (3) and the transmission (E11) of said updated instance (M3) to an electronic control unit (11) of said vehicle system (1) for updating said primary deep learning model (M1).", "output": "The vehicle system (1) comprises one camera sensor (10) used to capture images of an environment of the vehicle (2, 6). The electronic control unit is used to detect an event using a primary deep learning model based on the images. A predictability level is applied to the event. The predictability level is generated by the primary deep learning model. The event is transmitted to a telematic control unit (12) if its predictability level is above a defined level. 
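The gating described above — forward the event for DENM broadcasting only when the predictability level A produced by the deep learning model exceeds the defined level L1 — can be sketched in a few lines. The function names, the value of L1, and the callback-based broadcast are illustrative assumptions, not the patent's implementation.

```python
L1 = 0.8  # defined level for the predictability threshold (assumed value)

def handle_event(event, predictability, broadcast_denm):
    """Forward the event for DENM broadcasting only if A is above the defined level L1."""
    if predictability > L1:
        broadcast_denm(event)  # e.g. the telematic control unit broadcasts via V2V/V2I
        return True
    return False

sent = []
handle_event({"label": "obstacle"}, 0.93, sent.append)  # above L1 -> broadcast
handle_event({"label": "animal"}, 0.42, sent.append)    # below L1 -> dropped
print(sent)  # [{'label': 'obstacle'}]
```

Per claim 4, a mid-range level (between Th1 and L1) could instead route the event to the human machine interface for driver confirmation rather than dropping it outright.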
The telematic control unit is used to receive the event from the electronic control unit (11) and broadcast a related decentralized environmental notification message through the vehicle to vehicle communication and vehicle to infrastructure communication. The image and data details of the event are transmitted to a server. INDEPENDENT CLAIMS are included for the following: a server comprising a secondary deep learning model; a first method; and a second method. Vehicle system of a vehicle used to detect an event and to broadcast the event using a decentralized environmental notification message. The system obtains a better accuracy of the primary deep learning model, uses less memory in the vehicle, avoids going to a service center to update a vehicle's deep learning model and enhances the training of the secondary deep learning model. The drawing shows a schematic block diagram of a vehicle system. 1Vehicle system2, 6Vehicle10Camera sensor11Electronic control unit12Telematic control unit", "instruction": "Please summarize the input" }, { "input": "Defining and delivering parking zones to vehiclesTechniques are described for defining and delivering parking area information to a vehicle. The parking area information can be sent by a parking assistant device associated with a parking area and in response to receiving one or more messages from a vehicle system of the vehicle. Messages from the vehicle system indicate a location of the vehicle and are used by the parking assistant device to track a movement of the vehicle. The parking area information is sent in one or more response messages from the parking assistant device and can include a rule for determining whether the vehicle is permitted to park in an unoccupied parking zone within the parking area or indicate a result of applying the rule.What is claimed is: \n | 1. 
A system in a vehicle, the system comprising: \na communications interface; and \na vehicle control system including one or more processors configured to: \ntransmit, to a computer device associated with a parking area and through the communications interface, one or more messages indicating a location of the vehicle; \nreceive, through the communications interface, one or more response messages from the computer device, wherein the one or more response messages include information about the parking area, and wherein the information includes a rule comprising one or more conditions that must be satisfied in order for the vehicle to be permitted to park in an unoccupied parking zone within the parking area; \ndecode the one or more response messages to extract the information, including the one or more conditions; and \nprocess the information in connection with a parking operation, wherein to process the information, the vehicle control system is configured to: \npresent the information on an audio or visual output device of the vehicle, the information being presented prior to performance of the parking operation, during performance of the parking operation, or both; or \ndetermine, using the information, an autonomous driving maneuver performed as part of the parking operation. \n | 2. The system of claim 1, wherein the one or more messages indicating the location of the vehicle comprise a vehicle-to-everything (V2X) message broadcasted by the system, and wherein the one or more response messages comprise a V2X message broadcasted by the computer device.\n | 3. The system of claim 1, wherein the information included in the one or more response messages indicates whether the vehicle is permitted, based on a result of applying the rule, to park in the unoccupied parking zone within the parking area.\n | 4. 
The system of claim 1, wherein the parking area includes multiple unoccupied parking zones, and wherein the information indicates a particular unoccupied parking zone as being preferred.\n | 5. The system of claim 1, wherein the one or more conditions include a time-based restriction on parking.\n | 6. The system of claim 1, wherein the one or more conditions include a parking restriction relating to an attribute of the vehicle or relating to an identity of an owner or driver of the vehicle.\n | 7. The system of claim 1, wherein: \nto process the information, the vehicle control system is configured to determine, using the information, the autonomous driving maneuver performed as part of the parking operation; and \nthe vehicle control system is configured to perform the parking operation autonomously as a self-parking operation that parks the vehicle into the unoccupied parking zone. \n | 8. The system of claim 1, wherein: \nto process the information, the vehicle control system is configured to determine, using the information, the autonomous driving maneuver performed as part of the parking operation; and \nthe vehicle control system is configured to perform the parking operation autonomously as a self-parking operation that parks the vehicle into a different parking zone than the unoccupied parking zone. \n | 9. 
A method comprising: \ntransmitting, from a vehicle system of a vehicle to a computer device associated with a parking area, one or more messages indicating a location of the vehicle; \nreceiving, by the vehicle system, one or more response messages from the computer device, wherein the one or more response messages include information about the parking area, and wherein the information includes a rule comprising one or more conditions that must be satisfied in order for the vehicle to be permitted to park in an unoccupied parking zone within the parking area; \ndecoding, by the vehicle system, the one or more response messages to extract the information, including the one or more conditions; and \nprocessing, by the vehicle system, the information in connection with a parking operation, wherein the processing comprises: \npresenting the information on an audio or visual output device of the vehicle, the information being presented prior to performance of the parking operation, during performance of the parking operation, or both; or \ndetermining, using the information, an autonomous driving maneuver performed as part of the parking operation. \n | 10. The method of claim 9, wherein the one or more messages from the vehicle system comprise a vehicle-to-everything (V2X) message broadcasted by the vehicle system, and wherein the one or more response messages comprise a V2X message broadcasted by the computer device.\n | 11. The method of claim 9, wherein the information included in the one or more response messages indicates whether the vehicle is permitted, based on a result of applying the rule, to park in the unoccupied parking zone within the parking area.\n | 12. The method of claim 9, wherein the parking area includes multiple unoccupied parking zones, and wherein the information indicates a particular unoccupied parking zone as being preferred.\n | 13. 
The method of claim 9, wherein the one or more conditions include a time-based restriction on parking.\n | 14. The method of claim 9, wherein the one or more conditions include a parking restriction relating to an attribute of the vehicle or relating to an identity of an owner or driver of the vehicle.\n | 15. The method of claim 9, wherein the parking operation is an autonomously performed self-parking operation that parks the vehicle into the unoccupied parking zone, and wherein the processing of the information in connection with the parking operation comprises determining, using the information, the autonomous driving maneuver performed as part of the parking operation.\n | 16. The method of claim 9, wherein the parking operation is an autonomously performed self-parking operation that parks the vehicle into a different parking zone than the unoccupied parking zone, and wherein the processing of the information in connection with the parking operation comprises determining, using the information, the autonomous driving maneuver performed as part of the parking operation.\n | 17. The method of claim 9, wherein the information about the parking area indicates a boundary of the unoccupied parking zone.\n | 18. The method of claim 9, wherein the one or more conditions are determined based on identification, by the computer device, of a pattern in usage of the parking area.\n | 19. 
A non-transitory computer-readable storage medium containing instructions that, when executed by one or more processors in a vehicle system of a vehicle, configure the vehicle system to: \ntransmit, to a computer device associated with a parking area, one or more messages indicating a location of the vehicle; \nreceive one or more response messages from the computer device, wherein the one or more response messages include information about the parking area, and wherein the information includes a rule comprising one or more conditions that must be satisfied in order for the vehicle to be permitted to park in an unoccupied parking zone within the parking area; \ndecode the one or more response messages to extract the information, including the one or more conditions; and \nprocess the information in connection with a parking operation, wherein the processing comprises: \npresenting the information on an audio or visual output device of the vehicle, the information being presented prior to performance of the parking operation, during performance of the parking operation, or both; or \ndetermining, using the information, an autonomous driving maneuver performed as part of the parking operation.", "output": "The method involves sending one or more messages that indicate the vehicle's location from the vehicle system 110 to a computer device associated with the parking area. One or more response messages are received from the computer device by the vehicle system. The one or more response messages have information about the parking area. The information has a rule to determine whether the vehicle is permitted to park in an unoccupied parking zone within the parking area. Also, the information indicates a result of applying the rule. The one or more response messages are decoded to extract the information. The information is processed by the vehicle system in connection with a parking operation. 
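The rule-application step above — every condition of the rule must be satisfied for the vehicle to be permitted to park — can be sketched as follows. The message schema and the two condition types (a time window, per claims 5 and 13, and a vehicle attribute, per claims 6 and 14) are illustrative assumptions; the patent does not specify a message format.

```python
def parking_permitted(response, vehicle):
    """Apply each condition of the rule; all conditions must hold to permit parking."""
    for cond in response["rule"]["conditions"]:
        if cond["type"] == "time":
            # time-based restriction: parking allowed only within the stated hours
            if not (cond["start_hour"] <= vehicle["hour"] < cond["end_hour"]):
                return False
        elif cond["type"] == "vehicle_attribute":
            # restriction on an attribute of the vehicle (or its owner/driver)
            if vehicle.get(cond["attribute"]) != cond["value"]:
                return False
    return True

# Hypothetical decoded response message from the parking assistant device
response = {"rule": {"conditions": [
    {"type": "time", "start_hour": 8, "end_hour": 18},
    {"type": "vehicle_attribute", "attribute": "class", "value": "passenger"},
]}}
print(parking_permitted(response, {"hour": 10, "class": "passenger"}))  # True
print(parking_permitted(response, {"hour": 22, "class": "passenger"}))  # False
```

The result would then either be presented on the vehicle's output device or fed into the planner that selects the autonomous parking maneuver.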
INDEPENDENT CLAIMS are included for: a computer-readable storage medium containing instructions. Method for defining parking zones and conveying information about the parking zones to a vehicle in order to assist in parking of the vehicle. The parking zones are predefined and generally comprise uniformly shaped spaces that are well-marked so as to make the parking zones easily identifiable to the driver even without the aid of the map data. The drawing shows a block diagram of a parking system. 100Parking system 110Vehicle system 130Communication network 140Computer system 142Datastore 144Parking area information ", "instruction": "Please summarize the input" }, { "input": "A dynamic behavior decision method for an automatic driving vehicle in a networked environmentThe invention claims a dynamic behavior decision method for an automatic driving vehicle in a networked environment. The method comprises the following steps: step S1, in a V2X network environment, obtaining surrounding environment information about surrounding road users and, taking the vehicle's mass centre as the centre, performing region division with different radii to predict the risk area; step S2, based on the surrounding environment information of the surrounding road users and the predicted risk area, performing the first stage of the behavior decision, determining the set of feasible actions that ensure vehicle driving safety; step S3, the second stage of the behavior decision: considering the non-safety constraint conditions, selecting from the feasible action set the optimal action for final execution, completing the driving behaviour decision.|1. A dynamic behavior decision method for an automatic driving vehicle in a networked environment, the vehicle being an automatic driving vehicle, wherein the method comprises the following steps: step S1, in a V2X network environment. 
obtaining surrounding environment information about surrounding road users and, taking the vehicle's mass centre as the centre, performing region division with different radii to predict the risk area; step S2, based on the surrounding environment information of the surrounding road users and the predicted risk area, performing the first stage of the behavior decision: determining the set of feasible actions that ensure vehicle driving safety; step S3, the second stage of the behavior decision: considering the non-safety constraint conditions, selecting from the feasible action set the optimal action for final execution, completing the driving behaviour decision.\n | 2. The automatic driving vehicle dynamic behavior decision method according to claim 1, wherein, in step S1, the predicted risk regions are defined in the following way: taking the vehicle's mass centre as the centre, the risk area is the circular area whose radius is the safe braking distance Lrisk, where vi is the own vehicle's current speed, amax is the maximum value of the vehicle acceleration, and L is the vehicle length; if a surrounding road user is located in the risk area, that road user is defined as a risk road user; taking the vehicle's centroid as the centre, the annular region that remains after removing the risk area from the circular area whose radius is the safety early-warning distance Lp-risk is the potential risk region, where adec is the maximum value of the own vehicle's deceleration; if a surrounding road user is located in the potential risk region, that road user is defined as a potential-risk road user; taking the vehicle's centroid as the centre, the area outside the circular area whose radius is the safety early-warning distance Lp-risk is the safe area; if a surrounding road user is located outside the potential risk region, or is out of the vehicle's communication range, that road user is defined as a safe road user.\n | 3. 
The automatic driving vehicle dynamic behavior decision method according to claim 2, wherein step S2 comprises the following steps: step S21, for the surrounding road users in the risk area and in the potential risk region, calculating the risk degree C, first for the surrounding road users in the risk area and then for the surrounding road users in the potential risk region, the risk degree C representing the probability of conflict between the automatic driving vehicle's current state and the surrounding road users' states; t is the estimated predicted collision time; if there are two or more surrounding road users in the risk area and the potential risk region, t is the minimum of the predicted collision times estimated for the collisions with each surrounding road user; if there is no surrounding road user in the risk area or the potential risk region, t is greater than the set value tc, where tc is the critical time for collision avoidance, a set constant; when C=0, it is determined that there is no traffic conflict in this state, the risk metric value frisk is set to zero, and the method turns to step S23; when C=1, it is determined that a potential traffic conflict exists in this state, and the method turns to step S22; step S22, calculating the risk metric value frisk; step S23, using the risk metric value frisk for action selection, determining the feasible set of actions.\n | 4. 
The automatic driving vehicle dynamic behavior decision method according to claim 3, wherein the estimated predicted collision time t is calculated by the formula t = min(TTC, PET, TTB), wherein, for the TTC term, Xi is the own vehicle position, Xj is the position of the surrounding road user being followed, vi is the own vehicle's current speed, vj is the other vehicle's current speed, and Li is the own vehicle length; PET is the difference between the time ti at which the own vehicle enters the conflict point and the time tj at which the other road user reaches the conflict point, PET = |ti - tj|; TTB is used for evaluating the forward area and is suitable for scenes with the other vehicle in front of the own vehicle, where Xi is the own vehicle position, Xj is the position of the other vehicle being followed, vi is the own vehicle's current speed, and Li is the own vehicle length.\n | 5. The automatic driving vehicle dynamic behavior decision method according to claim 3, wherein the estimated predicted collision time t is calculated in the following way: when the scene can be distinguished, the collision time t is estimated only with the metric matching the scene, wherein for the straight vehicle-following scene, t = TTC; for the intersection scene, t = PET; for the forward collision scene with the other vehicle in front, t = TTB; when the scene is complex, t = min(TTC, PET, TTB), wherein Xi is the own vehicle position, Xj is the position of the surrounding road user being followed, vi is the own vehicle's current speed, vj is the other vehicle's current speed, and Li is the own vehicle length; PET is the difference between the time ti at which the own vehicle enters the conflict point and the time tj at which the other road user reaches the conflict point, PET = |ti - tj|; TTB is used for evaluating the forward area and is suitable for scenes with the other vehicle in front of the own vehicle, where Xi is the own vehicle position, Xj is the position of the other vehicle being followed, vi is the own vehicle's current speed, and Li is the own vehicle length.\n | 6. 
The automatic driving vehicle dynamic behavior decision method according to claim 3, wherein, in the step S2, the risk metric value frisk is calculated by the following formula, or \n | 7. The automatic driving vehicle dynamic behavior decision method according to any one of claims 1 to 6, wherein, in the second-stage action decision of step S3, no attribute influencing the safety of the decision is included; instead, the efficiency soft constraint function fe, the comfort soft constraint function fc and the traffic flow soft constraint function ft are considered for the optimal decision.\n | 8. The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the efficiency soft constraint function is defined as: wherein t0 is the vehicle's initial departure time, tf is the destination arrival time, and v (t) is the own-vehicle speed.\n | 9. The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the comfort soft constraint function is defined as: wherein a is the vehicle acceleration, alat is the transverse (lateral) acceleration, and alon is the longitudinal acceleration.\n | 10. The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the traffic flow soft constraint function ft is defined as: min ft = α(vave - vder)^2 + β(dave - dder)^2, wherein vave is the average speed level of the peripheral traffic flow before the decision, vder is the desired average speed level of the peripheral traffic flow after the decision, dave is the average vehicle distance of the peripheral traffic flow before the decision, dder is the desired average vehicle distance of the peripheral traffic flow after the decision, and α, β are weight coefficients, each greater than 0 and less than 1.\n | 11. 
The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the cost function J of the second-stage behavior decision step S3 is defined as follows: w1, w2, w3 are weighting coefficients, each greater than 0 and less than 1, with w1 + w2 + w3 = 1, and fe0, fc0, ft0 respectively represent the efficiency, comfort and traffic flow functions under the hypothesis that the pre-decision state continues to be executed subject to the safety decision.", "output": "The method involves obtaining surrounding environment information of the surrounding road users and performing region division with different radii from the vehicle mass center to predict a risk area (S1). A first stage of behavior decision is performed (S2) to determine vehicle driving safety and ensure a feasible action set based on the surrounding environment information of the surrounding road users and the estimated risk area. Non-safety constraint conditions are considered (S3) to select, from the feasible set of actions, the optimized action for final execution of the driving behavior decision. Automatic driving vehicle network connection environment dynamic behavior decision method. The drawing shows a flow diagram illustrating an automatic driving vehicle network connection environment dynamic behavior decision method. 
'(Drawing includes non-English language text)' S1: step of obtaining surrounding environment information of the surrounding road users and performing region division with different radii from the vehicle mass center to predict a risk area. S2: step of performing the first stage of behavior decision to determine vehicle driving safety and ensure a feasible action set based on the surrounding environment information of the surrounding road users and the estimated risk area. S3: step of considering non-safety constraint conditions to select, from the feasible set of actions, the optimized action for final execution of the driving behavior decision
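The behavior-decision record above specifies a two-stage scheme: a safety gate on the predicted collision time t = min(TTC, PET, TTB) (claims 3 to 5) and soft-constraint costs such as min ft = α(vave - vder)^2 + β(dave - dder)^2 (claim 10). A minimal Python sketch follows; the TTC/PET expressions are the standard definitions (the patent's own formula images are not reproduced in the text), and all function names and the 1/t risk metric are illustrative assumptions.

```python
def ttc(x_i, x_j, v_i, v_j, l_i):
    """Standard time-to-collision for a straight car-following scene.

    x_i/x_j: own/other vehicle positions, v_i/v_j: speeds, l_i: own-vehicle
    length (symbols per claims 4-5; the exact formula is assumed, not quoted).
    """
    gap = x_j - x_i - l_i            # remaining gap to the vehicle ahead
    closing = v_i - v_j              # positive when the own vehicle closes in
    return gap / closing if closing > 0 else float("inf")

def pet(t_i, t_j):
    """Post-encroachment time, PET = |t_i - t_j| (claim 4)."""
    return abs(t_i - t_j)

def predicted_collision_time(times, t_c):
    """t = min over surrounding road users; with no user in the risk
    areas, t is simply larger than the critical time t_c (claim 3)."""
    return min(times) if times else t_c + 1.0

def first_stage_decision(t, t_c):
    """C = 0: no conflict, f_risk = 0 (to step S23); C = 1: compute f_risk
    (step S22). 1/t is a placeholder risk metric, not the patent's."""
    return 0.0 if t > t_c else 1.0 / t

def traffic_flow_soft_constraint(v_ave, v_der, d_ave, d_der, alpha, beta):
    """min f_t = alpha*(v_ave - v_der)**2 + beta*(d_ave - d_der)**2 (claim 10)."""
    return alpha * (v_ave - v_der) ** 2 + beta * (d_ave - d_der) ** 2

t = predicted_collision_time([ttc(0.0, 30.0, 20.0, 10.0, 5.0), pet(4.0, 6.0)], t_c=3.0)
print(t, first_stage_decision(t, t_c=3.0))   # 2.0 0.5
```

With two road users the gate takes the smaller predicted collision time (2.0 s from PET), which is below the critical time, so a risk metric is computed rather than zeroed.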
The invention uses virtual traffic control and real vehicle driving control combination, virtual combination realizes intelligent and network control of vehicle and autonomous driving, optimizing traffic flow and reducing traffic accident risk.|1. An intelligent network-linked vehicle parallel driving control method based on ACP theory, the ACP theory is composed of artificial society, calculating experiment and parallel executing three parts; wherein the control method comprises the following steps: S10, establishing a parallel system based on ACP theory; collecting each intelligent traffic information in the actual space intelligent traffic system; the intelligent traffic information comprises vehicle speed, vehicle position, front and back vehicle distance, road information and road surface interaction information; the collected intelligent traffic information is uploaded to the parallel system; S20, the parallel system uses the intelligent traffic information collected in step S10 to optimize and calculate the traffic flow and establish a virtual space tending to the equivalent actual space; the virtual space comprises virtual intelligent traffic, road, information coupling the vehicle and the road and the vehicle; S30, in the virtual space established in step S20, optimizing and analyzing the intelligent traffic information, and using the early warning distance, braking distance and traffic accident rate to optimize the intelligent traffic information: if there is better result verification passes, then turning to step S40; if there is no better result, then keeping the current driving instruction is not changed, and continuously collecting and analyzing and adjusting the intelligent traffic information in the virtual space until the optimization result verification is passed, and then turning to step S40; S40, according to the intelligent traffic information verified in step S30, controlling each vehicle executing control command in the actual space, the 
control command comprises speed control, steering control and brake control; the intelligent traffic optimization in step S20 is calculated according to the following method: S21, calculating the speed difference of two adjacent vehicles, the front-and-back vehicle distance and the vehicle longitudinal distance according to the following formula: in the formula, SpdDif,i is the speed difference of the two adjacent vehicles, DistDif,i is the front-and-back distance, SLong,i is the longitudinal distance of the ith vehicle, ui represents the speed of the ith vehicle, ui-1 represents the speed of the (i-1)th vehicle, Sldi-1 is the position of the (i-1)th vehicle, Sldi is the position of the ith vehicle, and SLong,0 is the initial longitudinal distance of the vehicle; S22, the average traffic flow speed and the average traffic flow density are solved as follows: in the formula, SpdTrc,flow is the average traffic flow speed and DensTrc,flow is the average traffic flow density; S23, the average traffic flow volume is solved from the average traffic flow speed and the average traffic flow density as follows: VolTrc,flow = SpdTrc,flow * DensTrc,flow, in the formula, VolTrc,flow is the average traffic flow volume; calculated by the vehicle dynamics principle: in the formula, SLong,i is the longitudinal distance of the ith vehicle, ui represents the speed of the ith vehicle, FLj,i is the four-wheel longitudinal force, and mi is the vehicle mass; in the step S20, the coupling between the vehicle and the virtual road is modelled by the magic formula: μL,ICV = DL,ICV·sin[CL,ICV·arctan(BL,ICV·λ - EL,ICV·(BL,ICV·λ - arctan(BL,ICV·λ)))] in the formula, μL,ICV is the longitudinal friction coefficient, BL,ICV is the stiffness coefficient, CL,ICV is the shape parameter, DL,ICV is the peak value parameter, EL,ICV is the curvature parameter, and λ is the slip rate; the vehicle acceleration is solved using the above formula, as follows: in the formula, αi is the acceleration, g is the gravitational acceleration, and μL,i,min,ICV and μL,i,max,ICV are determined by the vehicle-road coupling; μL,i,min,ICV represents
the lower limit of the longitudinal friction coefficient of the ith vehicle; μL,i,max,ICV represents the upper limit of the longitudinal friction coefficient of the ith vehicle; in step S30, the pre-warning distance and the braking distance are calculated according to the following method: S31, the performance evaluation index of the multi-vehicle interactive traffic flow system uses the collision time quantitative index TTCi, which is expressed as: in the formula, SLong,Dif,i represents the front-and-back vehicle distance of the ith vehicle and uDif,i represents the front-and-back speed difference of the ith vehicle; S32, using the early warning index to judge whether a traffic accident may occur; the pre-warning index WIi is expressed as: S33, calculating the early warning distance SLong,Wr,i and the braking distance SLong,Bk,i according to the following formula: in the formula, SLong,Bk,i is the braking distance, SLong,Wr,i is the early warning distance, uLong,0 is the initial speed, uLong,i is the speed at time i; TBk,Delay is the hardware delay time; TBk,Cmd is the brake execution time; TResp,Delay is the response time of the driver; taking the set threshold of TTC^-1, there is: in the formula, IdxNorm,TTC represents the normalized collision time index; taking WIThrd as the threshold of WI, there is: in the formula, IdxNorm,WI represents the normalized pre-warning index; in step S30, the traffic accident rate is calculated by the following formula: in the formula, the first term represents the average speed and E is the system parameter; an evaluation value is adopted for evaluating the traffic flow; the evaluation value is affected by BL,ICV, CL,ICV, DL,ICV and EL,ICV. 
x = [BL, ICV, CL, ICV, DL, ICV, EL, ICV] Ts.t.xmin?x?xmax in the formula, ω i is a weighting factor; represents the ith vehicle ground friction estimation value, FZ μ represents ground friction force; represents the ith vehicle ground slip rate estimation value, xmin, xmax respectively BL, minimum boundary value of ICV, CL, ICV, DL, ICV, EL, ICV, the maximum boundary value; In the parallel system, the virtual vehicle satisfies the following dynamic balance: in the formula, Δ t is sampling time; and respectively is the target error of SLong, i, Disti and Spdi of the ith vehicle; Sdes, Long, i, Distdes, i and Spddes, i is the target value, Fi (k) is the control node force of the ith vehicle; ψ i, 1, ψ i, 2 and ψ i, 3 respectively weighting factor, Sldi, 0 represents the position of the 0 vehicle; in the parallel system, introducing multi-target cost function, for optimizing the traffic flow speed, vehicle distance, pre-warning distance and collision quantitative index, the multi-target cost function is expressed as: uCon = [F1, F2, ..., FN] Ts.t.uCon, Lim, min ?uCon, i?uCon, Lim, max formula, Q> 0 is a weighting factor; uCon, Lim, min and uCon, Lim, max is input limit value; and QPI is a control energy penalty function; ω TTC, i, ω WI, i and δ WI, i is the weight coefficient; ω WI, i and δ WI, i is the weight factor of IdxNorm, WI; \n | 2. 
The intelligent network-connected vehicle parallel driving control method based on ACP theory according to claim 1, wherein in the step S10, the parallel system comprises a parallel intelligent traffic system communicating through V2X, a parallel driving management system and a parallel driving control system; the parallel driving management system collects and analyzes the intelligent traffic information of the parallel intelligent traffic system, performs evaluation and simulation verification, and sends control commands to the driving control system; the driving control system controls the vehicle speed, steering or braking in the actual space.", "output": "The method involves establishing a parallel system based on ACP (artificial societies, computational experiments, parallel execution) theory. Intelligent traffic information is collected in an actual space intelligent traffic system, where the intelligent traffic information comprises vehicle speed, vehicle position, front and back vehicle distance, road information, and road surface interaction information. The collected intelligent traffic information is uploaded to the parallel system. A virtual space tending to be equivalent to the actual space is established. Each vehicle in the actual space is controlled to execute a control command, where the control command comprises speed control, steering control, and brake control. ACP theory-based intelligent network connection vehicle parallel driving control method. The method enables optimizing the traffic flow and reducing traffic accident risk. The drawing shows a schematic representation of the ACP theory-based intelligent network connection vehicle parallel driving control method. (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "A networked autonomous fleet scheduling and cooperative control method based on event-triggeredThe invention claims an event-triggered networked autonomous fleet scheduling and cooperative control method. 
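Two concrete pieces of the ACP-based record above lend themselves to a short sketch: the magic-formula road coupling of step S20, μL,ICV = DL·sin[CL·arctan(BL·λ - EL·(BL·λ - arctan(BL·λ)))], and the collision-time index TTCi of step S31 (front-and-back distance divided by speed difference). The parameter values below are illustrative assumptions, not the patent's.

```python
import math

def magic_formula_mu(lam, B, C, D, E):
    """Longitudinal friction coefficient mu_L as a function of slip rate lam,
    per the magic (Pacejka-style) formula quoted in step S20."""
    return D * math.sin(C * math.atan(B * lam - E * (B * lam - math.atan(B * lam))))

def ttc_index(gap, speed_diff):
    """TTC_i = front-and-back distance / front-and-back speed difference
    (step S31); infinite when the gap is not closing."""
    return gap / speed_diff if speed_diff > 0 else float("inf")

# Dry-asphalt-like magic-formula parameters (hypothetical example values).
mu = magic_formula_mu(lam=0.1, B=10.0, C=1.9, D=1.0, E=0.97)
print(round(mu, 3))                          # peak-region friction, below D = 1.0
print(ttc_index(gap=25.0, speed_diff=5.0))   # 5.0
```

A larger friction coefficient widens the feasible acceleration band bounded by μL,i,min,ICV and μL,i,max,ICV, while a small TTC index flags an imminent conflict in the warning logic of steps S32 and S33.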
the method comprises: a vehicle controller in the autonomous fleet receiving status information relating to the lead vehicle and the preceding vehicle transmitted over a wireless network to generate a control signal; performing longitudinal dynamic mechanical analysis on the vehicle to establish a mathematical model; considering the lead-vehicle acceleration disturbance and establishing a preliminary fleet longitudinal structure model based on the lead-vehicle/preceding-vehicle following policy; considering the vehicle engine parameter uncertainty and performing discretization to establish a final fleet longitudinal structure model; introducing an event-triggering mechanism and establishing a controller structure model; solving the vehicle controller gain, and from the controller gain and the received status information solving the acceleration of the vehicle at any time so as to control the whole longitudinal fleet. The invention improves the robustness of networked autonomous vehicles and effectively inhibits frequent acceleration and deceleration of the vehicle, increasing passenger comfort and reducing fuel consumption.|1. An event-triggered networked autonomous fleet scheduling and cooperative control method, wherein it comprises the following steps: S1. 
autonomous controller of vehicle in the fleet related to receive state information of the pacesetting of the wireless network transmission vehicle and to generate a control signal, S2, a vehicle-to-vehicle for mechanical analysis to establish the linear longitudinal dynamics model, considering the acceleration of the lead vehicle disturbance and based on the lead vehicle-following policy to establish the primary train longitudinal structure model; S3, considering the vehicle engine parameter uncertainty and performing discretization, and establishing a final motorcade longitudinal structural model, S4, on considering the fleet of the uncertainty model based on longitudinal structure, into the event triggering mechanism, establishing a controller structure model; time delay system of S5, introducing autonomous fleet model, solving the vehicle controller gain. The controller gain of the vehicle and status information received by solving the acceleration of the vehicle at any time, obtained according to any time of acceleration control the entire longitudinal fleet.\n | 2. The method according to claim 1, the mentioned a kind of networked autonomous fleet scheduling and cooperative control method based on event-triggered, wherein, in the step S1, pacesetting related state information of the vehicle and vehicle comprising a front position, the relative speed pacesetting and the acceleration of the vehicle and the front vehicle.\n | 3. 
The method according to claim 1, the mentioned a kind of networked autonomous fleet scheduling and cooperative control method based on event-triggered, wherein in step S2, comprising the following steps: S2.1, for non-linear vehicle dynamics model, described as a first-order differential equation: wherein q0 is the position of the pacesetting vehicle, qi is the ith vehicle relative to the position of the reference point, vi is the speed of the ith vehicle acceleration ai is the ith car, respectively is a derivative of qi, vi, ai, engine input mi is the mass of the i-th vehicle, ci is the ith vehicle, σ is the air mass density and drag coefficient Ai is the ith vehicle cross-sectional area, cdi is the ith vehicle, dmi is the mechanical drag of the ith vehicle, engine power Fi is the ith car, is the air resistance of the ith vehicle, ξ i is the ith (i=1, 2. the number of the engine time constant .., n) vehicle, n is the vehicle in the fleet of S2.2, additional control input ui is the ith vehicle, then using feedback linearization method for nonlinear vehicle model, the the vehicle dynamics model of the nonlinear vehicle to obtain the ith vehicle linearization of longitudinal dynamics model: S2.3, autonomous expectation of fleet vehicle distance and the actual distance error can be described as: in the formula, Li is the length of the ith car, is the desired vehicle distance, δ i is the desired vehicle distance and the actual distance of the error, S2.4, x (t) = (δ, vi-1-vi, ai-1-ai] T, yi (t) = (δ, vi-1-vi, ai-1-ai, v0-vi, a0-ai] T, wherein v0 and a0 are the speed and the acceleration of the lead vehicle, the ui (t) is the ith vehicle additional control input at t time, defining state variables, measuring output and control amount are as follows: Assuming the engine constant ξ = ξ (i=1, 2,. 
.., n), then it can be known by the formula (3): wherein is the third derivative δ i; is the derivative of x (t), can be obtained: instruction: wherein, can be known by the analysis, if i=1, then: make can be obtained: in the formula, g=[001]T. is the derivative of the pacesetting vehicle acceleration a0; therefore, the primary longitudinal structure model of the fleet engine constant uncertainty is not introduced autonomous fleet longitudinal structural model is available state space expression primarily expressed as: wherein, G = (g 0 ... 0] T, \n | 4. The method according to claim 1, the mentioned a kind of networked autonomous fleet scheduling and cooperative control method based on event-triggered, wherein in step S3, considering the vehicle engine parameter uncertainty and performing discretization. The vehicle longitudinal dynamics model and the primary vehicle longitudinal structure model, and establishing a final motorcade longitudinal structure model, comprising: the uncertainty factor if considering the uncertainty of the engine parameter, introducing time-variant Δξ, the dynamic model of the i-th vehicle can be described as: in the formula | =fi (t) | Δξ, and fi (t) is continuously capable of measuring function, and satisfies fi2 (t) ? 
Di Lebesgue, Di>, 0, Di is a known matrix, and the absolute value | Δξ | of the lower boundary, the time-varying factor Δξ can influence the system, at this time, on the basis of state space expression (9), considering the engine constant uncertainty, then the autonomous vehicle longitudinal structure model usable state space expression further expressed as: is the uncertainty factor in the formula, the constant considering engine representing uncertainty autonomous fleet longitudinal structural model of the state space expression (11) performing discretization to obtain the final autonomous fleet longitudinal structure model as follows: in the formula, k is a positive integer, is an uncertainty factor absolute value | Δξ | of the deterministic boundary, represents the state space expression (11) performing discretization coefficient matrix corresponding.\n | 5. The method according to claim 1, the mentioned a kind of networked autonomous fleet scheduling and cooperative control method based on event-triggered, wherein in step S4, comprising: recording a vehicle state of the current time is x (k), the latest transmission state is x (sj), wherein sj represents the time of the current event-triggered, event triggering mechanism controller next sj (j=0, 1, 2, ...) time update control command, when x (k) and x (sj) satisfies :[x(k)-x(sj)]T Ω [x (k) - x (sj)] > μ electrotransfer (k) Ω x (k); (13), in the formula, Ω is the positive definite weighting matrix, k, sj is the positive integer, Osr (0, 1), constructing the output feedback controller to the vehicle: in the formula, is the controller gain after calculating, and respectively is a controller for the ith vehicle and vehicle distance of car, gain of the speed difference and the acceleration difference. respectively is the controller gain of the speed difference and the acceleration difference of the ith vehicle and pacesetting vehicle. 
information of sj time transmission time delay in a wireless network; the autonomous controller structure model of the vehicle as follows: in the formula, \n | 6. The method according to claim 1, the mentioned a kind of networked autonomous fleet scheduling and cooperative control method based on event-triggered, wherein in step S5, the self time delay system expression of the fleet model as follows: in the formula, the formula (13) is satisfied, β k=k-sj, ej (k) =0, when formula (13) is not met, the M=covering, 1, τm, +, τm is information through the delay bound of wireless network transmission, then wherein l is a non-zero positive integer, n is a non-zero positive integer not less than. \n | 7. The method according to claim 1, the mentioned a kind of networked autonomous fleet scheduling and cooperative control method based on event-triggered, wherein step S5 comprises the following steps: S5.1, established according to the step S3 of the final longitudinal structure of fleet model selects the Lyapunov-Krasovskii function: in the formula, δ (l) = x (l + 1) - x (l), P, Q, R are positive definite symmetric matrix to be solved, S5.2, calculating the forward differential selection of the Lyapunov-Krasovskii function. conditions the Δ V < 0, and introducing heat performance index and the H infinite property index, then the system has asymptotic stability as follows: given the parameters μ > 0 and the known time M, and positive definite weighting matrix W 0, Vc 0, there are >, 0, γ, >, 0, and suitable dimension matrix make the LMI is satisfied, in the formula marked then then satisfy the cost function J with bound J* and H-infinity performance | | y | | 2? 
| |ω | γ2 | 2; S5.3, by formula (3) to obtain: the formula (4) the third derivative, to obtain the following formula: The autonomous fleet controller structure model (15) to obtain: equations (18), (19) and (20) to obtain: putting the equation discretization of the sampling period is T, as follows: supposing the initial state condition of the autonomous vehicle control system is δ i (0) = 0, combining formula (21) and formula (22) and Z-transformation, the ith vehicle with the vehicle of the vehicle distance error transfer function is expressed as: wherein, S5.4, defined by the Z-transform of the known z=ej ω, and j=1 for all ω, 2, 3. .., n, so that the autonomous vehicle system satisfies the condition of the system with queue stability as follows: S5.5, by the (18) algebraic operation and Schur theorem, and (24) the queue stability in asymptotic stability condition of the in the step (18) to obtain the control gain of the controller satisfies bound via status information relating to vehicle and vehicle-pacesetting wireless communication network transmission, solving the acceleration of the vehicle at any time so as to control the whole longitudinal vehicles.", "output": "The method involves receiving relevant state information of a leading vehicle and a preceding vehicle by a vehicle controller through a wireless network to generate control signal. Mechanical analysis on the vehicle is carried out to establish a linearized longitudinal dynamics model of the vehicle. A preliminary longitudinal structure model of vehicle fleet is established based on lead vehicle-front vehicle following strategy. Uncertainty of vehicle engine parameters is considered. A final fleet longitudinal structure model is established. An event triggering mechanism is introduced based on the final fleet longitudinal structure model. Longitudinal fleet is controlled according to determined acceleration. 
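The triggering inequality in claim 5 of the fleet record, [x(k) - x(sj)]^T Ω [x(k) - x(sj)] > μ·x(k)^T Ω x(k) with μ in (0, 1), can be checked directly. A sketch with a hypothetical identity Ω and plain-list linear algebra (the state vector and values are illustrative, not from the patent):

```python
def quad_form(v, m):
    """v^T M v for a list vector v and a nested-list matrix M."""
    n = len(v)
    return sum(v[i] * m[i][j] * v[j] for i in range(n) for j in range(n))

def should_trigger(x_k, x_sj, omega, mu):
    """True when [x(k)-x(s_j)]^T Omega [x(k)-x(s_j)] > mu * x(k)^T Omega x(k),
    i.e. the deviation from the last transmitted state x(s_j) is large
    enough that the controller transmits and updates the command."""
    e = [a - b for a, b in zip(x_k, x_sj)]
    return quad_form(e, omega) > mu * quad_form(x_k, omega)

omega = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # simple positive-definite choice
x_sj = [1.0, 0.0, 0.0]                                        # last transmitted state
print(should_trigger([1.05, 0.0, 0.0], x_sj, omega, mu=0.1))  # small drift: False
print(should_trigger([2.0, 0.5, 0.0], x_sj, omega, mu=0.1))   # large drift: True
```

Between triggers the controller holds the last command, which is what suppresses the frequent acceleration and deceleration the record cites as a benefit.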
Event-triggering-based networked autonomous fleet scheduling and co-controlling method. The method enables improving the robustness of a networked autonomous vehicle and effectively inhibiting frequent acceleration and deceleration of the vehicle to increase passenger comfort and reduce fuel consumption. The drawing shows a block diagram illustrating an event-triggering-based networked autonomous fleet scheduling and co-controlling method. '(Drawing includes non-English language text)'", "instruction": "Please summarize the input" }, { "input": "Intersection autonomous vehicle dispatching and control method based on dynamic priorityThe invention claims an intersection autonomous vehicle scheduling and control method based on dynamic priority, comprising the following steps: obtaining the location information of the autonomous vehicles in the current intersection range; calculating the dynamic priority of each autonomous vehicle according to the positioning information; setting the planned motion state of each autonomous vehicle according to the dynamic priority; using the planned motion state as the reference to guide the corresponding autonomous vehicle to control its actual motion state. In the intersection autonomous vehicle scheduling and control method based on dynamic priority, the dynamic priority of each vehicle is calculated from its positioning information and the planned motion state of each autonomous vehicle is set according to the dynamic priority, avoiding the need for a high-priority vehicle to decelerate for a low-priority vehicle, so as to improve the passing speed of the vehicles and reduce the passing time, with good real-time performance and high passing efficiency.|1. 
An intersection autonomous vehicle scheduling and control method based on dynamic priority, wherein it comprises the following steps: obtaining the location information of the autonomous vehicles in the current intersection range; calculating the dynamic priority of each autonomous vehicle according to the positioning information; setting the planned motion state of the autonomous vehicle according to the dynamic priority; guiding the corresponding autonomous vehicle to control its actual motion state by taking the planned motion state as the reference; wherein the step of obtaining the location information of the autonomous vehicles in the current intersection range further comprises the following step: according to the direction of each lane of the current intersection, establishing a two-dimensional coordinate-system intersection model of the current intersection covering the positions of entering and leaving the intersection; the step of calculating the dynamic priority of the autonomous vehicle according to the positioning information comprises the following step: calculating the value of the dynamic priority according to the speed of the autonomous vehicle, the maximum speed limit of the current intersection, the residence time at the current intersection and the distance to the nearest conflict point, in the formula, PRi is the dynamic priority of the i-th autonomous vehicle Vi, vi is the speed of the i-th autonomous vehicle Vi, is the maximum speed limit of the current intersection, t is the current time, is the time at which the autonomous vehicle Vi entered the current intersection, is the distance from the autonomous vehicle Vi to the nearest conflict point; the step of planning the motion state of the autonomous vehicle according to the dynamic priority comprises the following steps: constructing a target function for the motion state plan; the target function is wherein si is the one-dimensional position of the autonomous vehicle Vi on the
track, ai is the acceleration of the autonomous vehicle Vi, T is the period of the discrete system, Np is the planning time domain, si (Np-1) - si (0) is the advancing distance of the planning time domain Np, is the acceleration square in the planning time domain Np, k is the time of the autonomous vehicle Vi to the rush point; solving the target function by using the constraint condition as the optimization object; the constraint condition comprises one-dimensional motion equation of autonomous vehicle along the track, initial position and speed, speed and acceleration range constraint, and lane avoiding rear-end collision and different lanes avoid collision; planning the motion state of the autonomous vehicle according to the solving result.\n | 2. The intersection autonomous vehicle scheduling and control method based on dynamic priority according to claim 1, wherein the intersection model comprises a lane boundary, a parking line, a driving track line, and a punching point formed by intersecting each driving track line. the length between the collision point coordinate of each conflicting point in the two-dimensional coordinate system and the adjacent collision point on the same travelling track line and the length between the parking line and the conflict point.\n | 3. The intersection autonomous vehicle dispatching and control method based on dynamic priority according to claim 1, wherein it comprises the following steps before the step of obtaining the location information of the autonomous vehicle in the current intersection range. setting intersection scheduling center for processing intersection autonomous vehicle scheduling task, the intersection scheduling center uses V2X to communicate.\n | 4. 
The intersection autonomous vehicle scheduling and control method based on dynamic priority according to claim 1, wherein the step of obtaining the location information of the autonomous vehicle in the current intersection range comprises the following steps: collecting the autonomous vehicle positioning information by the sensor, the sensor comprises one or more of vehicle GPS, UWB, IMU and road side vision, road survey radar, the positioning information comprises autonomous vehicle real-time position, real-time orientation angle, real-time speed, real-time acceleration, real-time front wheel deflection angle and the time information of the vehicle.\n | 5. The intersection autonomous vehicle scheduling and control method based on dynamic priority according to claim 1, wherein the step of guiding the corresponding autonomous vehicle to control the actual motion state by taking the planned motion state as the reference comprises the following steps: mapping the planning motion state of the autonomous vehicle to the intersection model, the reference motion state and the reference control obtain in the intersection model; according to the reference motion state and the reference control input, combining the real-time motion state of the autonomous vehicle obtain the optimal desired motion input, controlling the actual motion state of the autonomous vehicle according to the expected motion input.\n | 6. The intersection autonomous vehicle dispatching and control method based on dynamic priority according to claim 5, wherein the reference movement state comprises a reference coordinate, a reference angle and a reference speed, and the reference control input comprises a reference acceleration and a reference front wheel deflection angle.\n | 7. 
The intersection autonomous vehicle scheduling and control method based on dynamic priority according to claim 6, wherein the step of controlling the actual motion state of the autonomous vehicle according to the expected motion input comprises the following steps: according to the real-time coordinate and the real-time speed of the autonomous vehicle, and the reference coordinate and the reference speed in the reference motion state, performing state prediction and rolling optimization over the control horizon, and executing the first-frame result of the optimization.", "output": "The method involves obtaining location information of an autonomous vehicle in a current intersection range. A dynamic priority of the autonomous vehicle is calculated according to the location information. A planned movement state of the autonomous vehicle is set according to the dynamic priority. The corresponding autonomous vehicle is guided to control an actual motion state by taking the planned motion state as a reference. A two-dimensional coordinate-system intersection model of the current intersection is established at the position of entering the intersection. An intersection scheduling center for processing intersection autonomous vehicle scheduling tasks is set up. Method for scheduling and controlling intersection autonomous vehicles based on dynamic priority. The method enables calculating the dynamic priority of each vehicle from its positioning information and setting the planned motion state of each autonomous vehicle according to its priority, so that low-priority vehicles decelerate to yield to high-priority vehicles, improving passing speed, reducing passing time, and ensuring good real-time performance and high passing efficiency. The drawing shows a flow diagram illustrating a method for scheduling and controlling an intersection autonomous vehicle based on dynamic priority.
(Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Method and apparatus of networked scene rendering and augmentation in vehicular environments in autonomous driving systemsA system and method are taught for vehicles controlled by automated driving systems, particularly those configured to automatically control vehicle steering, acceleration, and braking during a drive cycle without human intervention. In particular, the present disclosure teaches a system and method for generating situational awareness and path planning data and transmitting this information via vehicle-to-vehicle communications where one vehicle has an obstructed view of objects that are within an unobstructed view of a second vehicle.What is claimed is: \n | 1. A method comprising: \nreceiving data related to a field of view of a sensor on a first vehicle; \ngenerating a first point cloud having a first granularity in response to the data; \ndetermining a bandwidth and a latency of a transmission channel; \ngenerating a second point cloud having a second granularity in response to the first point cloud and the bandwidth and latency of the transmission channel; and \ntransmitting the second point cloud to a second vehicle for use by an autonomous control system. \n | 2. The method of claim 1 further comprising determining a location of a first object within the field of view.\n | 3. The method of claim 2 further comprising determining a velocity of the first object.\n | 4. The method of claim 1 wherein the first granularity is higher than the second granularity.\n | 5.
An apparatus comprising: \na sensor for receiving data related to a field of view of the sensor on a first vehicle; \na processor for generating a first point cloud having a first granularity in response to the data, determining a bandwidth and a latency of a transmission channel, for generating a second point cloud having a second granularity in response to the first point cloud and the bandwidth and latency of the transmission channel; and \na transmitter for transmitting the second point cloud to a second vehicle for use by an autonomous control system. \n | 6. The apparatus of claim 5 wherein the processor is further operative to determine a location of a first object within the field of view.\n | 7. The apparatus of claim 6 wherein the processor is further operative to determine a velocity of the first object.\n | 8. The apparatus of claim 5 wherein the first granularity is higher than the second granularity.", "output": "The method involves receiving data related to a field of view of a sensor on a first vehicle (305). A first point cloud with a first granularity is generated in response to the data. Bandwidth and latency of a transmission channel are determined. A second point cloud with a second granularity is generated in response to the first point cloud and the bandwidth and latency of the transmission channel. The second point cloud is transmitted to a second vehicle (315) for use by an autonomous control system. A location of a first object within the field of view is determined. Velocity of the first object is determined, where the first granularity is higher than the second granularity. An INDEPENDENT CLAIM is also included for an apparatus for facilitating networked scene rendering and augmentation in vehicular environments in autonomous driving systems. Method for facilitating networked scene rendering and augmentation in autonomous driving systems of a passenger car. 
Can also be used for a motorcycle, lorry, sport utility vehicle (SUV), recreational vehicle (RV), marine vessel and aircraft. The method enables extending and augmenting three-dimensional (3D) vision beyond line-of-sight perception range through vehicle-to-everything (V2X) communication and providing an efficient object rendering solution using suites of adaptive information retrieval protocols based on available V2X bandwidth and latency requirements. The method enables augmenting existing 3D sensing ability by sharing different perspectives to avoid line-of-sight obstruction so as to provide extended vision for better surrounding awareness and path planning of future vehicles. The method enables re-sampling particles to select those with significant weights, thus increasing the accuracy of the predicted and sensor-corrected object position. The drawing shows a schematic view of a system for facilitating networked scene rendering and augmentation in vehicular environments in autonomous driving systems. 300System for facilitating networked scene rendering and augmentation in vehicular environments in autonomous driving systems305, 310Vehicles315, 320Regions of coverage335Pedestrian", "instruction": "Please summarize the input" }, { "input": "Parking service privacy protection system and method based on v2pThe invention claims a v2p-based parking service privacy protection system and method, comprising: a blind signature certificate generating part (PKI), a parking lot terminal (PLT), a parking lot service provider (PSP), an automatic driving vehicle (AV) and an intelligent mobile phone (SM).
the user registers with the PKI; the PKI returns a blind certificate; the PLT registers with the PSP; the PSP returns a signature key to the PLT; the user submits an application service request to the PSP through the SM; the PSP returns a request token SESS; the user uses the SESS to query parking lot information from the PSP; the PSP returns parking lots according to the pseudo-position of the user; the user selects a parking lot and sends a reservation request and signature to the PSP; the PSP forwards the request and the signature to the corresponding PLT; after the PLT verifies them, a parking permission code is generated and sent to the PSP; the PSP signs the permission code and returns it to the user; the user sends the returned information to the AV through the SM; the AV reaches the appointed PLT and, after the information is checked, the parking is finished. The advantages of the invention are as follows: better user experience, more safety, higher efficiency.|1. A v2p-based parking service privacy protection system, wherein it comprises: a blind signature certificate generating part (PKI), a parking lot terminal (PLT), a parking lot service provider (PSP), an automatic driving vehicle (AV) and an intelligent mobile phone (SM); the blind signature certificate generating part (PKI): the PKI is responsible for auditing the specific information of the user, specifically verifying the validity of the registration documents submitted by the user; the PKI can access the port of the public security department; the identity information of the user is submitted to the public security department for authentication by means of homomorphic encryption; if the authentication passes, the PKI generates a suitable blind signature certificate for the user; automatic driving automobile (AV): the AV has autonomous capability, and also has cellular-network-based communication capability, so that it can be directly connected with other entities in the network; the AV can accept the orders of the user; the intelligent mobile
phone (SM) is bound with the AV; the SM is owned by the user and under the control of the user; the user can install the parking application program and use the application program to finish the subscription process; parking service provider (PSP): the PSP is an online server providing on-demand parking services for the user, comprising: searching nearby parking spaces, booking parking spaces and subscription services; the subscription service is for registered users who pay a membership fee to enjoy these convenient services; the service can be issued to the user as a smartphone application program; parking lot terminal (PLT): the PLT is a terminal deployed by the parking lot owner, responsible for monitoring and managing the parking lot through IoT devices; in addition, the PLT uploads the real-time state of the parking lot to the PSP so as to attract more vehicles; the real-time state comprises: parking fee, vacant parking spaces and a high-altitude map.\n | 2. The working method of the v2p-based parking service privacy protection system according to claim 1, wherein it comprises the following steps: 1: the user registers with the PKI; after the registration, the PKI returns the blind certificate; 2: the PLT registers with the PSP; after the registration succeeds, the PSP returns the signature key Rab; 3: the user submits an application service request to the PSP through the SM; after the PSP verification succeeds, the PSP returns the request token SESS to the user; 4: the user uses the SESS to query parking lot information from the PSP; the PSP returns the parking lots within a certain range according to the pseudo-position of the user; 5: the user selects one parking lot and sends the reservation request Req and signature σ to the PSP; 6: the PSP forwards Req||σ to the corresponding PLT; after PLT verification, the parking permission code c is generated and sent to the PSP; 7: the PSP computes Sigc and returns c||Sigc to the user; 8.
The user sends c||Timestamp||SESS||Sigc to the AV through the SM; the AV reaches the appointed PLT and, after the information is checked, the parking is finished.\n | 3. The method according to claim 2, further comprising: system installation, registration, service, parking, and malicious user exposure and key revocation; the symbols used are defined as follows: λ is a security parameter; G, GT are bilinear cyclic mapping groups; p is a prime number whose length is λ; g1 is a generating element; H(·), H′(·), H″(·) are three cryptographic hash functions; a is the private key of the PSP, with a corresponding public key; X, Y, Z are the public values corresponding to x, y, z ∈ Zp; e(·,·) is the bilinear pairing function; μ is a daily authentication key; S, Ω, Ψ are three data storage sets; b is the private key of the PLT, with corresponding public key B; Rab is the signature key of the PLT; certificate is the user blind signature certificate; Timestamp is the current time slot; SESS is the voucher for each parking; the system installation includes: PKI initialization: the PKI selects one parameter g as its own identification code, and uses the RSA algorithm to generate its own public and private key pair; according to the RSA algorithm, the PKI selects random primes b, c with b not less than 2^512 and c not less than 2^512, sets n = b * c, selects a random number e as the PKI public key, and then obtains d by solving the congruence e·d ≡ 1 (mod φ(n)), taking d as its own private key; the PKI publishes the system parameters (e, n, g) and secretly stores (b, c, d); the PSP runs the registration algorithm; a bilinear mapping group (G, GT) of large prime order p > 2^λ is created, wherein λ is a security parameter and e(·,·)
represents a bilinear mapping e: G × G → GT; g1 is the generating element of G and e(g1, g1) is defined as gT; H: {0,1}* → Zp, H′: {0,1}* → G and H″: Zp → Zp are three cryptographic hash functions; the public key of the PSP is set accordingly, with a ∈ Zp randomly selected as the private key; the PSP randomly selects primes p, q wherein q | p−1, p is not less than 2^512, q is not less than 2^160, and p is not less than g; the PSP selects x, y, z ∈ Zp and calculates the corresponding public values, and μ is a key selected by the PSP and changed daily; then the tuple of public parameters is published in the system; finally, the PSP uses Bloom filters to initialize three empty sets S, Ω, Ψ; note that Ψ is reset daily by the PSP, so as to ensure that the subscription credential of the user is only effective on that day; the registration includes: 1. user registration: (1.1) before using the parking service provided by the PSP, the user registers with the designated PKI through an identity card; (1.2) the PKI verifies the user information and returns to the user a blind certificate issued by the PKI, certificate = ((M′, j), (Y′, U′, z′, j′, S′1, S′2), B); 2. PLT registration: (2.1) the PLT creates a user name and password, and registers at the terminal; (2.2) the PLT uploads its identity information (e.g., the electronic business license of the parking lot) to the PSP, and the PSP verifies the qualification of the parking lot; (2.3) after the verification passes, the PLT creates a key pair wherein b is randomly selected in Zp, calculates the signature key, and sends the public key B to the PSP; (2.4) the PSP stores B and the parking information, finishing the registration; the service includes: 1.
verification of the user certificate: (1.1) the vehicle user Vi submits the application and certificate to the PSP, wherein 1 ≤ i ≤ s and s is the total number of vehicle users, Vi representing the i-th user; firstly, the PSP verifies the legality of the blind certificate; in the proof process, the user acts as the prover and the PSP acts as the verifier: the user sends to the PSP T6, HMACk2(certificate||T6), yi, H(xi), wherein xi is a private key selected by the user and stored locally; (1.2) if the certificate is legal and within its validity period, the verification succeeds; the PSP searches for H(xi) in Ω; if it does not exist in Ω, the PSP receives yi (1 ≤ i ≤ s) sent by the SM, allows the vehicle user Vi to join the group, generates a temporary session token SESS, sends it back to the user, and stores the blind certificate in the database; if H(xi) exists in Ω, the user re-selects xi through the SM until H(xi) does not exist in the library; for the vehicle user Vi, the PSP securely sends (yi, tn) to the PKI, and the PKI stores (yi, tn) in its local database; otherwise, the PSP returns failure; (1.3) the user stores the session token SESS; (1.4) if a new user is added, the group public key updating algorithm is used to update the group public key of the PSP; 2. parking lot query: (2.1) the geo-indistinguishability mechanism is used to perturb the user's current real position (latitude, longitude, radius): (lat′, lon′, rng′) = DP(lat, lon, rng, ε); (2.2) the user sets the parking requirement and requests nearby parking information by sending (lat′, lon′, rng′) and SESS to the PSP; (2.3) the PSP screens out parking lots not meeting the condition, and returns the parking lot list within the query range; 3.
parking lot reservation: (3.1) the user selects one parking lot from the returned list, and sends the reservation request Req and the signature σ to the PSP, wherein Req = Info||SESS||Timestamp, Info denotes the specific reservation information, and Timestamp represents the current timestamp; (3.2) the user computes the subscription token U, sends U to the PSP, and performs a non-interactive zero-knowledge proof with the PSP, wherein the user plays the prover and the PSP plays the verifier; (3.3) after receiving the request, if the proof succeeds and the token U is not present in the set S, the PSP accepts the request and adds U to S; otherwise, the PSP rejects the request; (3.4) the PSP sends Req||σ to the corresponding PLT; (3.5) the PLT verifies the validity of the signature through the signature σ and the public parameters (g, m, u, c, h); after successful verification, the PLT generates a unique random character string as the temporary parking permission code c, stores it in its local database, and sends it back to the PSP; (3.6) the PSP signs c as Sigc = H′(c||Timestamp||SESS)^a, stores the SESS in the token pool, and returns c||Sigc to the user; the parking includes: 1. parking request: (1.1) the user sends c||Timestamp||SESS||Sigc and the parking information to the AV through the SM; (1.2) the AV switches to the automatic driving mode and drives to the selected parking lot according to the received information; 2.
inspection: (2.1) when connected to the PLT, the AV sends c||Timestamp||SESS||Sigc to the PLT; (2.2) the PLT verifies the signature Sigc; if it is correct, the PLT searches for c in the database to confirm whether the AV has reserved the parking space; if c is found in its local database, the PLT deletes c and allows the AV to park; otherwise, the PLT returns failure and refuses to provide service; (2.3) the PLT selects a random θ ∈ Zp, re-signs Sigc to obtain Sig′c, and sends Sig′c to the AV as the acknowledgment receipt; 3. resetting of the subscription information: (3.1) the AV forwards the receipt Sig′c to the user's SM, and the parking confirmation message is shown on the user's SM; (3.2) after waiting a random delay, the user applies to the PSP for resetting the subscription information by sending c||Timestamp||SESS||Sig′c||U, so as to enable a second reservation; (3.3) the PSP receives the reset request and checks the validity of the credential reset request through the following two conditions: condition 1: the PSP verifies the signature; if the verification equation is satisfied, the condition is met; condition 2: the PSP searches for U in S and Ψ; if U is present in S and does not exist in Ψ, the PSP adds U to Ψ and deletes U from S, and the condition is met; if either condition fails, the PSP rejects the request and returns failure; otherwise, the PSP returns success, and the user can perform parking space reservations with U; the malicious user exposure and key revocation are as follows: if a user with an anonymous identity initiates an attack on the PSP, the PSP together with the PLT applies to the PKI to open the identity of the malicious user; the PSP collects the related subscription requests (π, ζ, p, Req) sent by the malicious user, and uses the congruence c ≡ yk (mod pk) to calculate the public key yk of the malicious user; it searches its own database to find the blind
signature certificate of the malicious user and submits it to the PKI; the PKI, according to the blind signature certificate submitted by the PSP, searches for the real identity of the malicious user in its library and applies penalties, such as refusing to generate a new blind signature.\n | 4. The method according to claim 3, wherein the user registration is implemented by the following algorithm: the PKI generates the user certificate by blind signature; assuming the user registers at the PKI using the smartphone SM, the PKI randomly selects 3 random generating elements R, R1, R2 ∈ G1; 1) the SM selects a random number ξSM, and calculates M = ASM = ξSM·R1 + R2, ρ = e(R, QPKI), ρ1 = e(R1, QPKI), ρ2 = e(R2, QPKI), y = e(Ppub, QPKI); then the SM sends IDSM, M, T1 to the PKI; 2) the PKI selects a random number Q ∈ G1, and calculates z = e(M, ΓPKI), a = e(R, Q), δ = e(M, Q), U = rR, Y = rQPKI; then the PKI sends z, a, δ, U, Y, T2, HMACk1(z||a||δ||U||Y||T2) to the registered user; 3) the SM selects random numbers α, β, γ, λ, μ, σ, u, and calculates M′ = αM, A = e(M′, QPKI), δ′ = δ^(uα)·A^v, z′ = z^α, a′ = a^u·ρ^v, Y′ = λY + λμQPKI − γH1(j), U′ = λU + γPpub, l = λ^(−1)·H2(M′, Y′, U′, A, B, z′, a′, δ′) + μ, j′ = lu, k1 = e(ΓSM, QPKI); then the SM sends l, T3, HMACk1(l||T3) to the PKI; 4) the PKI calculates S1 = Q + lΓPKI, S2 = (r + l)ΓPKI + rH1(j) and sends S1, S2, T4, HMACk1(S1||S2||T4) to the SM; if the formulas e(R, S1) = a·y^l and e(M, S1) = δ·z^l hold, the SM calculates S′1 = uS1 + vQPKI, S′2 = αS2; then the restrictive partially blind signature of (M′, j) is (Y′, U′, z′, j′, S′1, S′2), and the blind certificate generated for the vehicle user SM is certificate = ((M′, j), (Y′, U′, z′, j′, S′1, S′2), B); j is the expiration time of the blind certificate, and Ti is a timestamp for preventing replay attacks.\n | 5. The method according to claim 3, wherein
the user certificate verification is realized by the following algorithm: the PSP verifies the user certificate issued by the PKI and establishes a group: the PSP establishes the group formed by the users of the service and acts as the group administrator; according to the Chinese remainder theorem, based on the public keys of the group members, the PSP can calculate a group public key; the PSP can use the group public key to verify the validity of a signature when a parking service request is made; when a member joins or exits the group, the PSP updates the group public key according to the Chinese remainder theorem, using the Schnorr signature algorithm; 1) the PSP calculates A = e(M′, QPKI); if A is not equal to 0, it calculates i = H4(A, B, QPSP, time), wherein time is the binary representation of the current time; the PSP sends i to the SM; 2) the SM calculates r1 = i(ξ·x·α) + β, r2 = iα + σ, and sends r1, r2 to the PSP; 3) the PSP checks the formulas a′ = e(P, S′1)·y^(−j′) and δ′ = e(M′, S′1)·z′^(−j′); if the formula e(S′2, R) = e(Y′ + H3(M′, Y′, U′, A, z′, a′, δ′)QPKI, Ppub) × e(H1(j), U′) holds, the signature is legitimate; the certificate is legal if and only if these checks pass when the PSP receives the certificate.\n | 6. The method according to claim 3, wherein the PSP group public key generation algorithm is as follows: the PSP uses the received public keys of the s users, and calculates the group public key from the system of congruences c ≡ yi (mod pi), i = 1, 2, ..., s, wherein P = p1p2...ps = p1P1 = p2P2 = ... = psPs, and P′i is the positive integer solution of the congruence P′iPi ≡ 1 (mod pi), i = 1, 2, ...; c is the group public key; the RSU selects a secure hash function h and publishes the parameters (g, m, u, c, h); Table 1, the existing group member public keys \ny1 y2…yi…ysAs shown in Table 1, these are the group member public keys used by the PSP.\n | 7.
The method according to claim 3, wherein the SM signature algorithm in parking lot reservation step (3.1) is as follows: the Schnorr signature algorithm is used to sign the message; if the user SM wants to sign the message Req, firstly the SM selects a random number ω and calculates f = g^ω (mod p), π = h(f||Req), ζ = ω − xk·π (mod q), wherein g is the identity identification code of the PKI, xk is the private key of the vehicle user SM, and p, q are the primes selected by the PSP; then σ = (π, ζ, pk) is the signature of the vehicle user on the message Req; the knowledge proof algorithm in parking lot reservation step (3.2) is as follows: prover: 1) the formula is rewritten; 2) ρ, ρv ∈ Zp are selected, and Δ = U^ρ, η = H(X, Y, Z) are calculated; 3) μ, η, Δ are sent to the PSP; verifier: 1) the PSP receives μ, Δ, calculates η = H(X, Y, Z), and checks the verification equation; if it holds, knowledge of the secret is proven; the algorithm by which the PLT verifies the SM's signed message in parking lot reservation step (3.5) is as follows: the PLT verifies the validity of the message through the signature σ = (π, ζ, pk) and the public parameters (g, m, u, c, h): 1) calculating c ≡ yk (mod pk) to obtain the public key yk of the vehicle user Vk; 2) checking whether the public key yk is in the group; if so, executing step 3); 3) calculating f′ = g^ζ·yk^π (mod p); 4) if the formula π = h(f′||Req) holds, the signed message is from the vehicle user Vk, and the message is opened; 5) ending.\n | 8.
The method according to claim 3, wherein the group public key updating algorithm in user verification service step (1.4) is as follows: 1) for a new user Vs+1 that has passed the user certificate verification, the PSP stores the vehicle user Vs+1 and the corresponding blind certificate in the database, and updates Table 1 to Table 2: Table 2, the group member public keys after updating \ny1 y2…yi…ysys+1; 2) the PSP calculates the new group public key through the system of congruences, wherein Pnew = p1p2...ps·ps+1 = P·ps+1; Pinew and P′inew are calculated as follows: input: Pi, P′i, pi (1 ≤ i ≤ s+1); 1) if 1 ≤ i ≤ s, then Pinew = Pi·ps+1 is calculated, because P′inewPinew ≡ 1 (mod pi) and PiP′i ≡ 1 (mod pi); 2) if i = s+1, Ps+1new = P and the corresponding P′s+1new are calculated; 3) output: Pinew and P′inew (1 ≤ i ≤ s+1); under this scheme, new members can be added efficiently without affecting the keys of existing members; only the group public key needs to be updated; after updating, the PSP publishes the new parameter tuple (g, m, u, c, h).\n | 9.
The method according to claim 3, wherein the specific member revocation algorithm in malicious user exposure and key revocation is as follows: assume the current group has s vehicle users, and Vk denotes any group member; if the vehicle user Vk (1 ≤ k ≤ s) wants to exit the group, Vk only needs to send an exit application to the PSP; the PSP updates the public key yk of Vk in the database to y′k, so that the congruence y′k ≡ yk (mod pk) no longer holds, and calculates a new group public key from the system of congruences; the updated current member public key table is shown in Table 3: Table 3, the group member public keys after revocation \ny1 y2…yk-1yk+1…ys+1; after the member revocation is completed, the congruences c′ ≡ yk (mod pk) and π = h(f||M) no longer hold, so the subscription requests of the revoked user can no longer be verified, while the keys of the remaining vehicle users are not changed.\n | 10.
The method according to claim 3, wherein the geo-indistinguishability algorithm is as follows: given the security parameter ε (i.e., the default privacy level can be set as low ε = 0.01, medium ε = 0.004, or high ε = 0.001), any point generated from the actual position is processed through the probability density function of the noise mechanism (planar Laplace); the Euclidean distance between the two positions can also be expressed in a polar coordinate model, wherein rad and θ are the distance and angle between the real position and the blurred position; in order to blur the real position, θ should be randomly selected from [0, 2π), and rad is set using the inverse cumulative distribution function, wherein W−1 is the Lambert W function (−1 branch) and p should be randomly selected from [0, 1); in addition, there are two conversion functions, LatLonToCartesian and CartesianToLatLon, realizing the conversions (lat, lon) → (x, y) and (x, y) → (lat′, lon′); wherein τ is the precision parameter, with default value τ = 0.95.", "output": "The system has a smart phone (SM) that is owned by a user and under the control of the user. The user installs a parking application and uses the application to complete a booking process. A PSP is an online server that provides the users with on-demand parking services, including finding nearby parking spaces, making parking reservations and subscribing to services. A subscription service is provided for registered users who pay membership fees to enjoy convenient services and is released to the users as a SM application. A PLT is a terminal deployed by a parking lot owner and responsible for monitoring and managing a parking lot through IoT equipment. The PLT uploads the real-time status of the parking lot to the PSP to attract more vehicles. The real-time status includes parking fees, vacant parking spaces and high altitude maps. An INDEPENDENT CLAIM is included for a privacy protection method for parking services based on v2p. Privacy protection system for parking services based on v2p.
The system includes better user experience, more safety and higher efficiency. The drawing shows a block diagram of a privacy protection system for parking services based on v2p. (Drawing includes non-English language text)", "instruction": "Please summarize the input" }, { "input": "Ecological driving method of automatic driving vehicle through signal intersectionThe invention claims an automatic driving vehicle through signal intersection ecological driving method, collecting road traffic information and self information through V2V/V2I technology, obtaining the control condition, then according to the current signal lamp state judging can pass through the uniform speed, if it can, uniform speed through the signal intersection area, otherwise, performing ecological driving control to the automatic driving vehicle, in the ecological driving control stage, constructing automatic driving vehicle motion state equation and cost function, using the Ponteri gold minimum value principle to respectively optimal control track solving for the intersection upstream area and the intersection downstream area, making the automatic driving vehicle pass through the signal intersection area according to the optimal track; The advantages are as follows: it can meet the actual driving requirement of the driver, meets the actual road traffic environment condition, and the control process is simple, the calculation amount is small, the control real-time performance can be ensured, so as to effectively reduce the automatic driving vehicle energy consumption, improve the travel efficiency, reduce the pressure caused by each vehicle-road device.|1. 
An ecological driving method for an automatic driving vehicle passing through a signal intersection, wherein it comprises the following steps: step 1: dividing the signal intersection area into an intersection upstream area, an intersection central area and an intersection downstream area; the intersection upstream area represents the area where the automatic driving vehicle starts to be controlled; the road section distance of the intersection upstream area is equal to the distance from the starting position where the automatic driving vehicle obtains the intersection signal lamp timing and state information to the intersection stop line; the road section distance of the intersection upstream area is marked as l1, and the specific value of l1 is defined according to the communication range of the V2V/V2I technology; the intersection central area is the physical area of the signal intersection; the road distance of the intersection central area is equal to the distance from the intersection stop line to the ending position of the intersection central area, and is marked as l; the intersection downstream area represents the area where the automatic driving vehicle continues to be controlled; the intersection downstream area starts from the ending position of the intersection central area; the road distance of the intersection downstream area is recorded as l2, and l2 is determined according to the safe parking distance; the road section along which the automatic driving vehicle passes through the signal intersection area is marked as L, L = l1 + l + l2; step 2: when the automatic driving vehicle runs into the upstream area of the signal intersection, using V2V/V2I technology to obtain the road traffic information of the signal intersection area, the road traffic information comprising the intersection upstream area distance l1, the intersection central area distance l, the intersection downstream area distance l2, the intersection signal lamp timing and state information, and road information; the automatic
driving vehicle obtains the current vehicle position in real time by GPS technology, and uses the vehicle-mounted sensor device to obtain the current vehicle speed, current acceleration and road traffic condition information; step 3: starting from the automatic driving vehicle entering the signal intersection area, controlling the automatic driving vehicle according to the obtained intersection signal lamp timing and state information, making the automatic driving vehicle leave the intersection without stopping as much as possible; the specific control process is as follows: when the automatic driving vehicle enters the signal intersection upstream area, the current vehicle speed is v0; if the current signal state is green, and the interval TG from the current green light to the next red light is not less than the time for the automatic driving vehicle to drive to the intersection stop line at uniform speed at the current speed v0, then controlling the automatic driving vehicle to pass through the signal intersection at uniform speed at the current speed v0, ending the control; if the current signal state is green, and the interval TG from the current green light to the next red light is less than the time for the automatic driving vehicle to drive to the intersection stop line at uniform speed at the current speed v0, then entering step 4 to perform ecological driving control at the signal intersection; if the current signal state is red, and the time for the vehicle to drive to the stop line at uniform speed at the current vehicle speed v0 is not less than the current red light remaining time TR, then controlling the automatic driving vehicle to pass through the intersection at uniform speed at the current speed v0, ending the control; if the current signal state is red, and the current red light remaining time TR is greater than the time for the vehicle to drive to the intersection stop line at uniform speed at the current vehicle speed v0, then entering step 4 to perform ecological driving control at the signal intersection; step 4: performing ecological driving control at the signal intersection, specifically as follows: 4.1, timing restarts when the automatic driving vehicle enters the intersection upstream area or the intersection downstream area; the starting position of the intersection upstream area and the intersection downstream area is marked as 0, the timing starting time is marked as 0, and a certain running time of the automatic driving vehicle in the intersection upstream area or the intersection downstream area is marked as t; when driving in the intersection upstream area, the position of the automatic driving vehicle at time t is denoted as s(t), the speed is recorded as v(t), and the acceleration is marked as u(t); u(t) is used as the control output of the automatic driving vehicle at time t when driving in the intersection upstream area; when driving in the intersection downstream area, the position of the automatic driving vehicle at time t is recorded as s'(t), the speed is recorded as v'(t), and the acceleration is marked as u'(t); u'(t) is used as the control output of the automatic driving vehicle at time t when driving in the intersection downstream area; according to optimal control theory, the motion state vector of the automatic driving vehicle driving in the signal intersection upstream area is described as: the motion state equation is obtained from the motion state vector x(t), expressed as: wherein represents the derivative of the motion state vector x(t), f(x(t), u
(t)) represents the motion state equation function, represents the change rate of the driving position s(t) at time t, and represents the change rate of the driving speed v(t) at time t, namely the acceleration u(t); 4.2, constructing the cost function for ecological driving of the automatic driving vehicle at the signal intersection: wherein F represents the cost function, tf represents the terminal time of the intersection upstream area control process or the intersection downstream area control process; L(x(t), u(t)) is the cost function of the optimal control target; the first term of the cost function is the travel time cost of the optimal control, and the second term is the energy consumption cost of the automatic driving vehicle; q(t) is the instantaneous energy consumption of the automatic driving vehicle at time t, and | | is the absolute value symbol; η1 represents the weight of the time cost, and η2 represents the weight of the energy consumption cost; η1 and η2 take values in the range [0, 1] and cannot both be 0; the instantaneous energy consumption q(t) of the automatic driving vehicle at time t is represented by the following formula: in formula (3), Pm(t) is the motor power loss of the automatic driving vehicle, Pt(t) is the power loss caused by the resistance of the automatic driving vehicle, Pg(t) is the energy obtained by the automatic driving vehicle during acceleration or deceleration, m is the total mass of the automatic driving vehicle and its occupants, g is the gravity coefficient, fr1 is the rolling friction coefficient of the automatic driving vehicle, r is the motor equivalent resistance of the automatic driving vehicle, K is the product of the armature constant and the magnetic flux of the automatic driving vehicle, k is the air resistance coefficient of the automatic driving vehicle, and R is the tire radius of the automatic driving vehicle; 4.3, solving the signal
intersection area automatic driving vehicle driving track, as follows: 4.3.1, according to the Pontryagin minimum principle, determining the Hamiltonian function H[x(t), u(t), λ], as shown in formula (4): H[x(t), u(t), λ] = L(x(t), u(t)) + λf(x(t), u(t)) = η1 + η2|q(t)| + λ1v(t) + λ2u(t) (4), wherein λ is the co-state vector, λ1 and λ2 are co-state vector elements, and the relational expression is the constraint f(x(t), u(t)) ≤ 0; 4.3.2, intersection upstream region optimal control solving, specifically as follows: the initial time at which the automatic driving vehicle enters the intersection upstream area is 0, and the motion state vector of the automatic driving vehicle at the initial time in the intersection upstream area is given; the time when the automatic driving vehicle reaches the intersection stop line is marked as tf1, the motion state vector of the automatic driving vehicle at the terminal time in the intersection upstream area is given, and vf1 is the speed of the automatic driving vehicle at the time tf1; when the current signal state is green, so that the intersection traffic efficiency is not affected, the speed of the automatic driving vehicle at the time tf1 of reaching the intersection stop line is vmax, where vmax is the maximum road speed limit obtained by the automatic driving vehicle, and then vf1 = vmax; when the current signal state is red, the signal state becomes green when the automatic driving vehicle reaches the intersection stop line at the time tf1, and tf1 = TR; to solve the optimal control output of the automatic driving vehicle in the intersection upstream area, it needs to satisfy the following formula: at this time, the expressions of λ1 and λ2 are obtained; then the expressions of λ1 and λ2 are put into the Hamiltonian function H[x(t), u(t), λ], so as to obtain the general expression of u(t); then, according to the motion state vector of the automatic driving vehicle at the initial time and the motion state vector at the terminal time in the intersection upstream
region, calculating to obtain the intersection upstream region terminal time tf1, the terminal time speed vf1 and the control output u(t), and judging whether the condition umin ≤ u(t) ≤ umax is satisfied, where umin is the minimum acceleration of the automatic driving vehicle performance and umax is the maximum acceleration of the automatic driving vehicle performance; if the condition is satisfied, then the u(t) calculated at this time is the intersection upstream area optimal control output; if u(t) is less than umin, then u(t) = umin, with u(t) as the intersection upstream area optimal control output; if u(t) is greater than umax, then u(t) = umax, with u(t) as the intersection upstream area optimal control output; 4.3.3, intersection downstream area optimal control solving, specifically as follows: the intersection downstream area distance is determined by the safe parking distance, wherein fs is the sliding friction coefficient of the driving road; with the road information obtained by V2V/V2I technology, the motion state vector of the automatic driving vehicle at time t in the intersection downstream area and the motion state equation are given, wherein s'(t) is the position of the automatic driving vehicle at time t in the intersection downstream area, v'(t) is the speed of the automatic driving vehicle at time t in the intersection downstream area, and u'(t) is the acceleration of the automatic driving vehicle at time t in the intersection downstream area, namely the control output of the intersection downstream area; the initial time when the automatic driving vehicle reaches the intersection downstream area is 0, and the state vector of the automatic driving vehicle at the initial time in the intersection downstream area is given; in order to make the automatic driving vehicle finally recover the initial speed of the signal intersection area, the speed at the terminal time tf2 at which the automatic driving vehicle drives out of the intersection downstream area is v0, namely
the state vector of the automatic driving vehicle at the intersection downstream area terminal time is determined; λ' is the intersection downstream area co-state vector, λ'1 and λ'2 are the intersection downstream area co-state vector elements, and the relational expression is given; to solve the optimal control output of the automatic driving vehicle in the intersection downstream area, it needs to satisfy: at this time, the expressions of λ'1 and λ'2 are obtained; then the expressions of λ'1 and λ'2 are brought into the Hamiltonian function H[x'(t), u'(t), λ'], so as to obtain the general expression of u'(t); then, according to the motion state vector of the automatic driving vehicle at the initial time and the motion state vector at the terminal time in the intersection downstream region, calculating to obtain the intersection downstream region terminal time tf2 and the control output u'(t), and judging whether the condition umin ≤ u'(t) ≤ umax is satisfied; if the condition is satisfied, then the calculated u'(t) is the intersection downstream area optimal control output; if u'(t) is less than umin, then u'(t) = umin, with u'(t) as the intersection downstream area optimal control output; if u'(t) is greater than umax, then u'(t) = umax, with u'(t) as the intersection downstream area optimal control output; 4.4, controlling the automatic driving vehicle to pass through the intersection upstream area under acceleration control; when the automatic driving vehicle drives out of the intersection upstream area, it enters the intersection central area at the speed vf1; in the intersection central area, controlling the automatic driving vehicle to pass through at the uniform speed vf1; when the automatic driving vehicle drives out of the intersection central area, it enters the intersection downstream area, and the automatic driving vehicle is controlled to pass through the intersection downstream region under acceleration control; when the automatic driving vehicle drives out of the signal intersection area, the ecological driving control is ended.", "output": "The method involves
dividing a signal intersection area into an intersection upstream area, an intersection central area and an intersection downstream area. The intersection upstream and downstream areas are determined according to the V2V/V2I communication range and the safe parking distance, respectively. Road traffic information of the signal intersection area is obtained by using vehicle-to-vehicle/vehicle-to-infrastructure (V2V/V2I) technology when the automatic driving vehicle runs into the upstream area of the signal intersection. The current vehicle speed, current acceleration and road traffic condition information are obtained using a vehicle-mounted sensor device. The automatic driving vehicle is either controlled to pass through the intersection at uniform speed at the current speed to end the control, or ecological driving control of the signal intersection is performed. Method for ecologically driving automatic driving vehicle through signal intersection. The method satisfies the actual driving requirements of the driver and the road traffic environment conditions; the control process is simple, the calculation amount is small, and real-time control can be ensured, thus effectively reducing the energy consumption of the automatic driving vehicle, improving travel efficiency, and reducing the load on vehicle-road devices.
The drawing shows a schematic view of the method for ecologically driving an automatic driving vehicle through a signal intersection (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Inter-vehicle communication system using NFT authenticationThe present invention detects vehicle and pedestrian signals from images collected through a vision sensor mounted on the vehicle, then transmits the signal status and location information of the vehicle to surrounding vehicles in a broadcast manner; a following vehicle specifies the preceding vehicle on the same route based on the location information and confirms the signal information, and the reliability of the information is ensured through authentication of the issued NFT information. This relates to a vehicle-to-vehicle communication system using NFT authentication.|1. In an autonomous vehicle-to-vehicle communication system that communicates with a relay station and surrounding vehicles, an authentication server 110 having a first communication unit 111 that communicates with the vehicle through the relay station 150 and provides information on the vehicle number and owner through authentication of vehicle and owner information, an issuing unit 112 that issues and manages NFTs to enable tracking of information, registration, and renewal date, and an authentication unit 113 that performs authentication of the NFT requested from a vehicle that has received an NFT of another vehicle and notifies the result to the vehicle requesting authentication;\nA second communication unit 121 that is mounted on an autonomous vehicle and communicates with the relay station 150 and surrounding vehicles, and a first storage unit that receives and stores NFTs from the authentication server 110 through the second communication unit 121.
Unit 122, a signal transmission unit 123 that transmits the NFT of the first storage unit 122 to a surrounding vehicle through a beacon to request authentication, and an authentication module 120 including a second storage unit 124 in which the NFT received from the surrounding vehicle is authenticated by the authentication server 110 and which stores the NFT received upon completion of authentication for a set time; \nAn information transmission and reception module 140 including an information transmission unit 141 that is mounted on an autonomous vehicle and transmits the acquired traffic information along with the NFT stored in the first storage unit 122 to surrounding vehicles as a message, and an information processing unit 142 that processes the included traffic information in the case where the NFT of the message received from the surrounding vehicle is an NFT stored in the second storage unit 124; An information acquisition module 130 that is mounted on an autonomous vehicle and consists of a vision sensor 131 that photographs the front of the vehicle, a location confirmation unit 132 that generates current location information of the vehicle, and an information analysis unit 133 that analyzes the image obtained through the vision sensor 131 to acquire vehicle and pedestrian signal information, the information acquisition module 130 acquiring the location information and signal information as traffic information; Further comprising, the information acquisition module 130 includes a distance calculation unit 134 that calculates distance information to itself by analyzing location information of surrounding vehicles corresponding to the traffic information being processed, and analyzes the distance information.
It further includes a determination unit 135 that determines whether there is a preceding vehicle on the same route, and a traffic analysis unit 136 that analyzes processed traffic information and generates traffic flow information; the information processing unit 142 is configured such that, if the NFT of a message received from surrounding vehicles is not stored in the second storage unit 124, authentication is requested from the authentication server 110, and if the NFT is authenticated, the message is stored in the second storage unit 124 and the traffic information is then processed; and the information acquisition module 130 compares the traffic information included when receiving the message with that of a plurality of preceding vehicles on the same route within a set distance range; a vehicle-to-vehicle communication system using NFT authentication, characterized in that it further includes the information verification unit 113. \n | 2. delete\n | 3. delete\n | 4. delete\n | 5. delete", "output": "The system has a communication unit communicated with a vehicle through a relay station, and an authentication server provided with an authentication unit that performs authentication of a non-fungible token (NFT) requested from the vehicle. Another communication unit is mounted on an autonomous vehicle, and communicates with the relay station. A storage unit receives and stores the NFTs from the authentication server through the latter communication unit. An information transmission and reception module includes an information processing unit that processes traffic information. Autonomous vehicle-to-vehicle communication system. The system achieves safe and rapid communication between driving vehicles by obtaining signal information, prevents unreasonable passing and tailgating at intersections, and prevents accidents and traffic jams.
The system quickly processes traffic information messages through a pre-reception authentication process, thus reducing information processing time and providing reliable communication between certified vehicles. The signal and traffic flow can be safely identified, and various traffic situation information, such as the occurrence of an accident or congestion, as well as signal information, can be delivered. The drawing shows a schematic view of an autonomous vehicle-to-vehicle communication system (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Vehicle trajectory prediction near or at traffic signalA system and method for determining a predicted trajectory of a human-driven host vehicle as the human-driven host vehicle approaches a traffic signal. The method includes: obtaining a host vehicle-traffic light distance dx and a longitudinal host vehicle speed vx that are each taken when the human-driven host vehicle approaches the traffic signal; obtaining a traffic light signal phase Pt and a traffic light signal timing Tt; obtaining a time of day TOD; providing the host vehicle-traffic light distance dx, the longitudinal host vehicle speed vx, the traffic light signal phase Pt, the traffic light signal timing Tt, and the time of day TOD as input into an artificial intelligence (AI) vehicle trajectory prediction application, wherein the AI vehicle trajectory prediction application implements an AI vehicle trajectory prediction model; and determining the predicted trajectory of the human-driven host vehicle using the AI vehicle trajectory prediction application.The invention claimed is: \n | 1.
A method for determining a predicted trajectory of a human-driven host vehicle as the human-driven host vehicle approaches a traffic signal, wherein the method is carried out by one or more electronic controllers, and wherein the method comprises the steps of: \nobtaining a host vehicle-traffic light distance dx and a longitudinal host vehicle speed vx that are each taken when the human-driven host vehicle approaches the traffic signal;\nobtaining a traffic light signal phase Pt and a traffic light signal timing Tt, wherein the traffic light signal phase Pt represents a phase of the traffic signal taken when the human-driven host vehicle approaches the traffic signal, and wherein the traffic light signal timing Tt represents an amount of time elapsed since a last phase change of the traffic signal taken when the human-driven host vehicle approaches the traffic signal;\nobtaining a time of day TOD; \nproviding the host vehicle-traffic light distance dx, the longitudinal host vehicle speed vx, the traffic light signal phase Pt, the traffic light signal timing Tt, and the time of day TOD as input into an artificial intelligence (AI) vehicle trajectory prediction application, wherein the AI vehicle trajectory prediction application implements an AI vehicle trajectory prediction model; and\ndetermining the predicted trajectory of the human-driven host vehicle using the AI vehicle trajectory prediction application. \n | 2. The method of claim 1, wherein the method further includes obtaining a front vehicle state XFV, wherein the front vehicle state includes a front-host vehicle distance rt and a front-host vehicle speed ṙt, and wherein the providing step further includes providing the front vehicle state XFV as input into the AI vehicle trajectory prediction application.\n | 3.
The method of claim 2, wherein the front vehicle state XFV is obtained at the one or more electronic controllers based on front vehicle base information that is obtained at the front vehicle and then sent via vehicle-to-vehicle (V2V) communications to the one or more electronic controllers.\n | 4. The method of claim 1, wherein the traffic light signal phase Pt and the traffic light signal timing Tt are obtained from a traffic light control system that is present at an intersection where the traffic light is located.\n | 5. The method of claim 1, wherein the predicted trajectory is obtained at an autonomous vehicle that is approaching the traffic light and that is separate from the human-driven host vehicle.\n | 6. The method of claim 5, wherein the method is carried out at the autonomous vehicle as the human-driven host vehicle approaches the traffic light.\n | 7. The method of claim 6, wherein the autonomous vehicle obtains the traffic light signal phase Pt and the traffic light signal timing Tt from a traffic signal system located at an intersection where the traffic light is located.\n | 8. The method of claim 7, wherein the autonomous vehicle receives the traffic light signal phase Pt and the traffic light signal timing Tt via vehicle-to-infrastructure (V2I) communications from roadside equipment that is a part of the traffic signal system.\n | 9. The method of claim 6, wherein the autonomous vehicle receives the traffic light signal phase Pt and the traffic light signal timing Tt from a traffic signaling control system that is located remotely from the traffic light.\n | 10. The method of claim 6, wherein the host vehicle-traffic light distance dx, the longitudinal host vehicle speed vx are obtained at the autonomous vehicle via V2V communications with the host vehicle.\n | 11. 
The method of claim 1, wherein the vehicle-traffic light distance dx, the longitudinal host vehicle speed vx, the traffic light signal phase Pt, and the traffic light signal timing Tt are each associated with an associated time that is no more than a predetermined amount different than another one of the associated times.\n | 12. The method of claim 1, wherein the AI vehicle trajectory prediction model is or includes a neural network.\n | 13. The method of claim 12, wherein the AI vehicle trajectory prediction model is a deterministic model or a model that predicts one or more most-probable trajectories.\n | 14. The method of claim 12, wherein the AI vehicle trajectory prediction model is a probabilistic model that returns a probability distribution of predicted trajectories, and wherein the predicted trajectory is obtained by sampling a trajectory from the probability distribution of predicted trajectories.\n | 15. The method of claim 14, wherein the neural network is a mixture density network.\n | 16. The method of claim 12, wherein the neural network is a deep neural network.\n | 17. The method of claim 1, wherein the method further includes the step of causing an autonomous vehicle to obtain the predicted trajectory of the human-driven vehicle, wherein the autonomous vehicle is configured to: \nobtain the predicted trajectory of the human-driven vehicle, and \ncarry out an autonomous vehicle operation based on the predicted trajectory of the human-driven vehicle. \n | 18.
A method for determining a predicted trajectory of a human-driven host vehicle as the human-driven host vehicle approaches a traffic signal, wherein the method is carried out by one or more electronic controllers, and wherein the method comprises the steps of: \nobtaining a host vehicle-traffic light distance dx and a longitudinal host vehicle speed vx that are each taken when the human-driven host vehicle approaches the traffic signal, wherein the host vehicle-traffic light distance dx and the longitudinal host vehicle speed vx each have an associated time;\nobtaining a front vehicle state XFV, wherein the front vehicle state includes a front-host vehicle distance rt and a front-host vehicle speed ṙt;\nreceiving one or more wireless signals that indicate a traffic light signal phase Pt and a traffic light signal timing Tt, wherein the traffic light signal phase Pt represents a phase of the traffic signal taken when the human-driven host vehicle approaches the traffic signal, and wherein the traffic light signal timing Tt represents an amount of time elapsed since a last phase change of the traffic signal taken when the human-driven host vehicle approaches the traffic signal, wherein the traffic light signal phase Pt and the traffic light signal timing Tt each have an associated time, wherein the associated times of the host vehicle-traffic light distance dx, the longitudinal host vehicle speed vx, the traffic light signal phase Pt, the traffic light signal timing Tt, the front-host vehicle distance rt, and the front-host vehicle speed ṙt are within a maximum allowable time difference with respect to one another;\nobtaining a time of day TOD; \nproviding the host vehicle-traffic light distance dx, the longitudinal host vehicle speed vx, the traffic light signal phase Pt, the traffic light signal timing Tt, the front vehicle state XFV, and the time of day TOD as input into an artificial intelligence (AI)
vehicle trajectory prediction application, wherein the AI vehicle trajectory prediction application implements an AI vehicle trajectory prediction model, and wherein the AI vehicle trajectory prediction model is or includes a neural network; and\ndetermining the predicted trajectory of the human-driven host vehicle using the AI vehicle trajectory prediction application. \n | 19. The method of claim 18, wherein the host vehicle-traffic light distance dx and the longitudinal host vehicle speed vx are both obtained at an autonomous vehicle through receiving one or more wireless signals from the human-driven host vehicle via vehicle-to-vehicle (V2V) communications.\n | 20. The method of claim 18, wherein the host vehicle-traffic light distance dx and the longitudinal host vehicle speed vx are both obtained at the autonomous vehicle through receiving one or more wireless signals from a remote server.", "output": "The method (200) involves obtaining (210) a host vehicle-traffic light distance and a longitudinal host vehicle speed that are taken when a human-driven host vehicle approaches a traffic signal. A traffic light signal phase and a traffic light timing are obtained (220). A time of day is obtained (240). The host vehicle traffic light distance, longitudinal vehicle speed, signal phase, timing and the time are provided (250) as input into an artificial intelligence (AI) vehicle trajectory prediction application. A predicted trajectory of the host vehicle is determined (260) using the AI vehicle trajectory application by electronic controllers. Method for determining predicted trajectory of human-driven host vehicle as vehicle approaches and departs from traffic signal by using artificial intelligence vehicle trajectory prediction model. The method enables predicting how human drivers respond to traffic signals for purposes of successfully carrying out and/or improving autonomous driving in areas having traffic signals. 
The method allows the AI vehicle trajectory prediction model to map states of human-driven vehicles and a corresponding state of a traffic signal. The drawing shows a flowchart illustrating the method for determining a predicted trajectory of a human-driven vehicle.200Method for determining a predicted trajectory of a human-driven vehicle 210Step for obtaining a host vehicle-traffic light distance and a longitudinal host vehicle speed 220Step for obtaining traffic light signal phase and a traffic light timing 240Step for obtaining time of day 250Step for providing host vehicle traffic light distance, longitudinal vehicle speed, signal phase, timing and the time 260Step for determining predicted trajectory of the host vehicle ", "instruction": "Please summarize the input" }, { "input": "Apparatus for switching driving mode in autonomous driving vehicle, method thereof and computer recordable medium storing program to perform the methodThe present invention relates to an apparatus for switching a driving mode of an autonomous vehicle, a method therefor, and a computer-readable recording medium having recorded thereon a program for performing the method. The present invention relates to a camera unit for photographing a driver of the vehicle. A sensor unit including a plurality of sensors for dividing the outside of the vehicle into a plurality of areas and detecting objects of the plurality of divided areas, and a plurality of systems for controlling longitudinal and lateral driving of the vehicle; A control unit, a position information unit for deriving position information of the vehicle using a GPS (Global Positioning System) signal, a differential GPS (DGPS) signal, and an inertial sensor signal, the camera unit, the sensor unit, the driving unit, and the And a controller including two or more processors for controlling the vehicle including a location information unit, wherein the controller is configured to control driving of the vehicle. 
In the autonomous driving mode, in which the autonomous driving level is any one of a fully autonomous driving level controlled by the processor and a semi-autonomous driving level controlled by both the driver's operation and the processor, it is determined whether at least one function of the sensor unit, the driving unit, the location information unit and the control unit is in a fail state other than the normal state, and if so, a reboot is performed. Provided are a vehicle device for performing this, a method therefor, and a computer-readable recording medium having recorded thereon a program for performing the method.|1. A vehicle apparatus for switching a driving mode of an autonomous vehicle, the vehicle apparatus comprising: a camera unit photographing a driver of the vehicle; A sensor unit for dividing the outside of the vehicle into a plurality of areas and including a plurality of sensors for detecting objects in the plurality of divided areas; A driving unit controlling a plurality of systems for controlling longitudinal and lateral movements of the vehicle; A location information unit for deriving location information of the vehicle using a GPS (Global Positioning System) signal, a differential GPS (DGPS) signal, and an inertial sensor signal; And a control unit including at least two processors controlling the vehicle apparatus including the camera unit, the sensor unit, the driving unit, and the location information unit; wherein the vehicle drives at an autonomous driving level according to any one of a fully autonomous driving level at which the driving of the vehicle is controlled by the processor, and a semi-autonomous driving level at which the driving of the vehicle is shared and controlled by the driver and the processor.
In the autonomous driving mode, a mode switching module determines whether at least one function of the sensor unit, the driving unit, the location information unit and the processor is in a fail state rather than the normal state, performs a reboot if a fail state is found, and, after the reboot, switches to either a limited driving mode in which the driving of the vehicle is controlled by the driver and the processor together, or a manual driving mode in which the driving of the vehicle is controlled by the driver, depending on whether the autonomous driving level is the fully autonomous driving level or the semi-autonomous driving level and on the state of each of the sensor unit, the driving unit, the location information unit, and the processor; wherein, after the reboot in the autonomous driving mode of the semi-autonomous driving level, the mode switching module switches to the limited driving mode if all of the plurality of regions can be detected using some of the plurality of sensors of the sensor unit, and switches to the manual driving mode if any of the plurality of regions cannot be detected; switches to the manual driving mode if at least one of the longitudinal and lateral driving of the vehicle cannot be controlled through the driving unit, and switches to the limited driving mode if both the longitudinal and lateral driving of the vehicle can be controlled; and switches to the limited driving mode when the accuracy of the location information is degraded according to the availability of the GPS signal, the DGPS signal, and the inertial sensor signal of the location information unit, and switches to the manual driving mode if the position information cannot be calculated.
If both processors of the control unit are normal, the control unit switches to the limited driving mode, and if either of the two processors is disabled, it switches to the manual driving mode. A vehicle apparatus for switching the driving mode of an autonomous vehicle, characterized in the above manner.\n | 2. The vehicle apparatus of claim 1, wherein the mode switching module photographs the driver through the camera unit when the autonomous driving mode of the fully autonomous driving level is switched to the manual driving mode, identifies the state of the photographed driver, and switches to a risk minimization operation (MRM) mode if the driver cannot perform the manual driving mode.\n | 3. delete\n | 4. The vehicle apparatus of claim 2, wherein, after the reboot in the autonomous driving mode of the fully autonomous driving level, the mode switching module switches to the limited driving mode if all of the plurality of regions can be detected using some of the plurality of sensors of the sensor unit, or if only one area cannot be detected; switches to the manual driving mode if two or more areas of the front or the side among the plurality of areas cannot be detected; switches to the manual driving mode if at least one of the longitudinal and lateral driving cannot be controlled; switches to the manual driving mode if the accuracy of the position information is deteriorated or the position information cannot be calculated; and switches to the manual driving mode if both processors are disabled, or to the limited driving mode if only one of the two processors is disabled.\n | 5.
The vehicle apparatus of claim 2, wherein, in the risk minimization operation mode, the mode switching module decelerates the vehicle and stops it in the driving lane if the front region of the vehicle cannot be detected through the sensor unit, if two or more regions of the side area in the passenger seat direction of the vehicle cannot be detected, if the lateral driving of the vehicle cannot be controlled through the driving unit, if the position information cannot be calculated through the position information unit, or if both processors are disabled; and decelerates the vehicle and stops it in a safety zone or on the shoulder if two or more areas of the side area in the driver's seat direction cannot be detected through the sensor unit, if the longitudinal driving of the vehicle cannot be controlled through the driving unit, or if the accuracy of the position information calculated by the position information unit is deteriorated.\n | 6. The vehicle apparatus of claim 2, further comprising a communication unit configured to perform V2I communication, which communicates with a traffic server through a roadside device, and V2V communication, which directly communicates with a vehicle device of another vehicle, wherein, when the driving mode is changed to the risk minimization operation mode, a warning signal indicating that the risk minimization operation is performed is transmitted through the communication unit, and the transmitted warning signal is sequentially relayed to the devices of a plurality of other vehicles in the order of their proximity to the vehicle. A vehicle apparatus for switching the driving mode of the autonomous vehicle.\n | 7.
A method for switching a driving mode of a vehicle apparatus of an autonomous vehicle, the method comprising: a fully autonomous driving level controlled by a processor of the vehicle apparatus, and a semi-autonomous driving level controlled by the processor by a driver's operation of the vehicle; Performing autonomous driving according to any one of autonomous driving levels; A sensor unit which divides the outside of the vehicle into a plurality of areas and includes a plurality of sensors for detecting objects in the plurality of divided areas, and controls a plurality of systems for controlling the longitudinal and transverse driving of the vehicle. A driving unit, a position information unit for deriving position information of the vehicle using a GPS (Global Positioning System) signal, a differential GPS (DGPS) signal, and an inertial sensor signal, the sensor unit, the driving unit, and the position information unit. Determining whether at least one function of the control unit including two or more processors for controlling the vehicle is a fail state rather than a normal state; Performing a reboot if the determination result results in the failing state; And after the rebooting, depending on the state of the sensor unit, the driving unit, and the location information unit, whether the autonomous driving level is the full autonomous driving level or the semi-autonomous driving level, and the state of the plurality of items. 
Switching to one of a limited driving mode in which the driving of the vehicle is controlled by the driver and the processor, and a manual driving mode in which the driving of the vehicle is controlled by the driver's operation, wherein, in the step of switching from the autonomous driving mode of the semi-autonomous driving level to one of the limited driving mode and the manual driving mode after the rebooting, the control unit switches to the limited driving mode if all of the plurality of regions can be detected using some of the plurality of sensors of the sensor unit, and switches to the manual driving mode if any of the plurality of regions cannot be detected; switches to the manual driving mode if at least one of the longitudinal and lateral driving of the vehicle cannot be controlled through the driving unit, and switches to the limited driving mode if both the longitudinal and lateral driving of the vehicle can be controlled; switches to the limited driving mode when the accuracy of the location information is degraded according to the availability of the GPS signal, the DGPS signal, and the inertial sensor signal of the location information unit, and switches to the manual driving mode if the position information cannot be calculated; and switches to the limited driving mode if both processors of the control unit are normal, and to the manual driving mode if either of the two processors is disabled. A method for switching the driving mode, characterized in the above manner.\n | 8. A computer-readable recording medium having recorded thereon a program for performing the method for switching the driving mode according to claim 7.
A driving unit controls multiple systems for controlling longitudinal and lateral movements of the vehicle. A location information unit derives location information of the vehicle using a global positioning system (GPS) signal, a differential GPS (DGPS) signal and an inertial sensor signal, where a manual driving mode of a control unit is switched to a limited driving mode when the location information of the vehicle is not calculated. INDEPENDENT CLAIMS are also included for the following:a method for switching a driving mode of an autonomous vehiclea computer-readable recording medium for storing a set of instructions for performing a method for switching a driving mode of an autonomous vehicle. Apparatus for switching a driving mode of an autonomous vehicle. The manual driving mode of the control unit is switched to the limited driving mode when the location information of the vehicle is not calculated so as to perform autonomous driving of the vehicle and minimize risk of the driver by diagnosing states of multiple modules of the vehicle. The drawing shows a schematic view of an apparatus for switching a driving mode of an autonomous vehicle. '(Drawing includes non-English language text)' 10Vehicle100Apparatus for switching driving mode of autonomous vehicle200Roadside device300Traffic server", "instruction": "Please summarize the input" }, { "input": "AUTONOMOUS DRIVING GUIDANCE SYSTEM AND OPERATION METHOD THEREOFIn the control server for monitoring the traffic state of a plurality of road sections according to an embodiment of the present application, the control server communicates with a plurality of vehicles and a plurality of traffic lights, the plurality received from the plurality of traffic lights Based on the traffic information for each road section for each of the road sections, the driving route change of the plurality of vehicles is induced. \n\n|1. 
A control server for monitoring traffic conditions of a plurality of road sections, the control server communicating with a plurality of vehicles and a plurality of traffic lights, and a road section for each of the plurality of road sections received from the plurality of traffic lights A first vehicle communication unit for inducing a change in a driving route of the plurality of vehicles based on the traffic information, and each of the plurality of vehicles is in short-distance communication with the plurality of traffic lights; A time information recording unit storing time information communicated through the first vehicle communication unit;\nA second vehicle communication unit that transmits departure and destination information to the control server; And a driving route determining unit for driving the corresponding vehicle according to any one driving route selected by a passenger among at least one driving route received from the control server through the second vehicle communication unit. \n | 2. delete\n | 3. The method of claim 1, wherein the driving route determining unit calculates a passage time of the at least one driving route, and automatically determines a driving route in which the passing time of the at least one driving route is the shortest time according to whether or not the occupant is selected. Control server decided by.\n | 4. According to claim 1, The control server, A first server communication unit for wireless communication with the plurality of vehicles; And first and second traffic lights located within a predetermined distance in the starting and destination information received through the first server communication unit, and selecting a driving path through the plurality of road sections. Control server including a selection unit.\n | 5. 
The control server according to claim 4, wherein the driving route selection unit selects at least one link signal light located within a predetermined distance among a plurality of candidate signal lights located within a predetermined area between the first and second traffic lights.\n | 6. The method of claim 4, wherein the driving route selection unit maps the at least one link signal light to a map map stored in advance according to a preset search radius and azimuth range to derive a plurality of candidate routes, and the plurality of candidate routes. The control server selecting the at least one driving route for the departure and destination information in the order of the shortest distance.\n | 7. According to claim 1, The control server, A second server communication unit for wireless communication with the plurality of traffic lights;\nA traffic state determination unit for determining a traffic state for the plurality of road sections based on the traffic information for each road section; And a storage DB for classifying and storing the traffic information and the traffic conditions for each road section. \n | 8. The method of claim 7, wherein the traffic state determining unit sets a confidence interval for each road section according to an average and a standard deviation of the passage time for each road section included in the traffic information for each road section, and the passage time for each road section. The confidence intervals for each road section are compared, and when a passage time of any one of the road sections of the road section is included in the corresponding trust section, a weighted moving average method is used for the passage time of the one road section. A control server that calculates an average driving time and deletes the passing time of any one of the road sections from the storage DB when the passing time of any one of the road sections out of the corresponding confidence section.\n | 9. 
The control server according to claim 1, wherein each of the plurality of traffic lights comprises: a first infrastructure communication unit for receiving vehicle information through V2I communication with the plurality of vehicles and transmitting current traffic light information;\nA second infrastructure communication unit for wireless communication with the control server through a network; \nA section identification unit for identifying any one of the plurality of road sections according to the previous traffic light information included in the vehicle information; \nA passage time identification unit responsive to the previous traffic light information to identify a driving time for any one of the road sections; And a traffic volume identification unit for identifying traffic volume for any one of the road sections based on vehicle model information included in the vehicle information. \n | 10. The control server of claim 9, wherein the passage time identification unit calculates the driving time for any one of the road sections by summing the signal waiting time and the time interval between a first communication time included in the previous traffic light information and a second communication time with the current traffic light included in the current traffic light information.\n | 11. The control server of claim 9, wherein the traffic volume identification unit accumulates one in the number of traveling vehicles of a preset vehicle model when the vehicle model information corresponds to the preset vehicle model, acquires the traffic volume for the preset vehicle model in any one of the road sections according to the accumulated number of traveling vehicles, and acquires the traffic volume according to the total number of vehicles in any one of the road sections when the vehicle model information does not correspond to the preset vehicle model or the traffic volume for the preset vehicle model has been acquired.\n | 12.
A plurality of vehicles that autonomously drive from the origin to the destination along a plurality of road sections;\nA plurality of traffic lights located on one side and the other side of each of the plurality of road sections, and communicating with the plurality of vehicles to obtain traffic information for each road section; And a control server that induces a change of a driving route of the plurality of vehicles based on the traffic information for each road section, and each of the plurality of vehicles includes a first vehicle communication unit for short-range communication with the plurality of traffic lights. ; A time information recording unit storing time information communicated through the first vehicle communication unit; \nA second vehicle communication unit that transmits departure and destination information to the control server; And a driving route determining unit for driving the corresponding vehicle according to any one driving route selected by a passenger among at least one driving route received from the control server through the second vehicle communication unit. \n | 13. 
A method of operating an autonomous driving guidance system, comprising: autonomous driving of a plurality of vehicles from a source to a destination along a plurality of road sections;\nAcquiring traffic information for each road section through V2I communication with the plurality of vehicles by a plurality of traffic lights located at one side and the other of each road section among the plurality of road sections; \nMonitoring, by a control server, traffic conditions for a plurality of road sections according to traffic information for each road section, which is wirelessly transmitted from the plurality of traffic lights through a network; And inducing, by the control server, a change in a driving route of the plurality of vehicles based on traffic conditions for a plurality of road sections, and the autonomous driving of the plurality of vehicles comprises: Transmitting origin and destination information; \nReceiving traffic information for at least one driving route and at least one driving route determined according to the traffic state of the plurality of road sections from the control server; \nCalculating a passing time for the at least one driving route; \nProviding the passing time and the traffic information for the at least one driving route to a passenger; And automatically determining a driving route in which the passing time is the shortest time among the at least one driving route, according to whether or not the occupant selects the driving method. \n | 14. delete\n | 15.
The method of claim 13, wherein the step of inducing a route change for autonomous driving comprises: receiving departure and destination information from any one of the plurality of vehicles; Selecting at least one link traffic light based on the positions of the first and second traffic lights located within a predetermined distance to the departure and destination information;\nDeriving a plurality of candidate paths by mapping the at least one link signal light to a pre-stored map map according to a preset search radius and azimuth range; And selecting the at least one driving path and providing it to the vehicle in the shortest distance among the plurality of candidate paths. \n | 16. The method of claim 13, wherein the monitoring of traffic conditions of the plurality of road sections comprises: receiving traffic information for each road section from the plurality of traffic lights;\nComparing whether the number of driving vehicles for a predetermined vehicle type of one of the plurality of road sections corresponds to a first threshold; \nDetermining a certain road section as a caution state when the number of driving vehicles for a predetermined vehicle type of the road section corresponds to the first threshold; \nComparing whether the total number of vehicles in one of the road sections corresponds to a second threshold when the number of vehicles for a predetermined vehicle type in the one section does not correspond to the first threshold; And if the total number of vehicles in any one of the road sections corresponds to the second threshold, judges the one section of the road in the state of caution, and the total number of vehicles in the one section of the road does not correspond to the second threshold. If not, the operation method of the autonomous driving guidance system comprising the step of determining any one of the road section as a safe state. \n | 17. 
The method of claim 13, wherein the monitoring of traffic conditions of the plurality of road sections comprises: setting a confidence interval for each road section according to an average and a standard deviation of a passing time included in the traffic information for each road section;\nComparing the passage time for each road section and the confidence interval for each road section; \nCalculating an average driving time through a weighted moving average method for the passing time of any one of the road sections when the passing time of any one of the road sections is included in the corresponding confidence section; And deleting the passage time of the one road section when the passage time of the one road section out of the corresponding confidence section among the passage times for each road section. \n | 18. The method of claim 13, wherein the step of acquiring traffic information for each of the road sections by the plurality of traffic lights comprises: receiving vehicle information through the V2I communication from any one of the plurality of vehicles;\nTransmitting current traffic light information to the any one vehicle; \nDetecting previous traffic light information from the vehicle information; \nIdentifying one of the plurality of road sections according to the previous traffic light information and the current traffic light information; And transmitting the traffic information for any one of the road sections to the control server. \n | 19. The method of claim 18, wherein the step of acquiring traffic information for each of the road sections by the plurality of traffic lights comprises: identifying a first communication time with a previous traffic light and a second communication time with a current traffic light based on the vehicle information.
Determining whether a signal waiting time is included in a time period between the first and second communication times; \nIf a signal waiting time is included in the time period, summing the time period and the signal waiting time to determine a driving time for the one road section; And when the signal waiting time is not included in the time period, determining the time period as the driving time for the one road section. \n | 20. The method of claim 18, wherein the step of obtaining traffic information for each of the road sections by the plurality of traffic lights comprises: extracting vehicle model information from the vehicle information;\nComparing whether the vehicle model information corresponds to a preset vehicle model; \nAccumulating one in the number of traveling vehicles of the preset vehicle model for any one of the road sections when the vehicle model information corresponds to the preset vehicle model; \nAcquiring traffic volume for the preset vehicle model in any one of the road sections through a predetermined discrimination coefficient according to the accumulated number of traveling vehicles; \nObtaining traffic volume for the entire vehicles of the one road section through the predetermined discrimination coefficient according to the total number of vehicles traveling on the one road section, when the vehicle model information does not correspond to the preset vehicle model or the traffic volume for the preset vehicle model has been obtained; And transmitting the traffic volume for the preset vehicle model and the traffic volume for the entire vehicles to the control server at a predetermined time period.
A first vehicle communication unit induces change in a driving route of the vehicles based on traffic information, where the vehicles are in short-distance communication with the traffic lights. A time information recording unit stores the time information communicated through the first vehicle communication unit. A second vehicle communication unit transmits departure and destination information to the control server. A driving route determining unit drives a corresponding vehicle according to the driving route that is selected by a passenger and received from the control server through the second vehicle communication unit. An INDEPENDENT CLAIM is included for a method of operating an autonomous driving guidance system. Autonomous driving guidance system for a vehicle. The system induces autonomous driving from a departure point to an destination along an optimized road section for vehicles by reflecting the traffic condition of entire road. The drawing shows a block diagram of an autonomous driving guidance system. (Drawing includes non-English language text). 10Autonomous driving guidance system50Network100-1-100-NVehicles200-1-200-NTraffic lights300Control server", "instruction": "Please summarize the input" }, { "input": "Vehicle secondary accident prevention navigation, vehicle secondary accident prevention system and method using the sameThe vehicle secondary accident prevention system according to an embodiment of the present invention receives and stores driving information from an in-vehicle navigation or an in-vehicle information system, performs accident determination based on vehicle driving information, and determines whether an accident A control server that can generate an accident notification message and transmit it to the navigation of the corresponding vehicle or, if it is not an accident, generate a step-by-step driving warning notification message according to the degree of driving speed and transmit it to the navigation of the vehicle.|1. 
delete\n | 2. delete\n | 3. It receives and stores driving information from the in-vehicle navigation or in-vehicle information system, determines whether there is an accident risk according to the degree of accident or driving speed based on the driving information of the vehicle, and generates an accident notification message according to whether the accident is determined and a control server capable of transmitting to the navigation of the corresponding vehicle, or generating a step-by-step driving warning notification message according to the degree of driving speed in case of no accident and transmitting it to the navigation of the vehicle, wherein the control server transmits the driving information to the vehicle a collection unit for receiving data from my navigation or information system through a communication network and storing it in a database;\nA communication unit that provides a communication protocol compatible to be connected to a communication network and transmits an accident notification message according to whether the driving information is received or an accident determination or a driving warning notification message according to the degree of driving speed to the navigation device or the mobile terminal; \nIt is provided from the collecting unit that receives the driving information from the in-vehicle navigation or the in-vehicle information system, and based on the driving information of the vehicle, determines whether or not a vehicle accident occurs or sets the risk reference value for the driving speed step by step, according to the degree of the driving speed an accident judgment unit that determines whether there is an accident risk; \nA notification unit that generates an accident notification message according to whether the accident determination unit determines an accident or generates a step-by-step driving warning notification message according to the degree of driving speed, and transmits the generated accident notification 
message or driving warning notification message to a mobile terminal or navigation system; \nAnd a determination support unit that, if the vehicle is an autonomous vehicle, receives autonomous driving information through ITS traffic information or V2X communication of an autonomous driving server connected through an in-vehicle autonomous driving system or communication network, determines whether an accident occurs in connection with the driving information, connects with the ITS, the in-vehicle autonomous driving system or the autonomous driving server, receives traffic information or autonomous driving information, and provides the information to the accident determination unit, wherein the accident determination unit receives acceleration information included in the driving information in order to detect a sudden change in driving speed, sets a reference value for acceleration, and determines a vehicle accident depending on whether the reference value is exceeded, or receives driving information between adjacent vehicles as traffic information in connection with the ITS and compares the driving speed of the front vehicle and the rear vehicle included in the driving information with the normal average speed of the section to determine whether there is an accident between vehicles or an accident risk equivalent to an accident, and the control server can receive, as driving information serving as a basis for accident determination, autonomous driving information through ITS traffic information or V2X communication of the in-vehicle autonomous driving system or the autonomous driving server connected through a communication network, and determine whether an accident has occurred in connection with the driving information through the accident determination unit. A vehicle secondary accident prevention system characterized in the above manner. \n | 4. delete\n | 5. delete\n | 6. delete\n | 7. delete\n | 8.
A method for preventing secondary vehicle accidents using the system for preventing secondary vehicle accidents according to claim 3, comprising: periodically collecting vehicle driving information by connecting the control server to a navigation system or an in-vehicle information system through a communication network;\ndetermining, by the control server, whether a vehicle accident has occurred by checking a change in vehicle speed using acceleration information included in the collected vehicle information; \nAnd, when the control server determines that a vehicle accident has occurred, generating an accident notification message and transmitting the accident notification message to a mobile terminal set to receive notifications from the control server. A secondary vehicle accident prevention method comprising the above steps.", "output": "The system has an accident determination unit which determines whether a vehicle accident occurs according to whether the reference value is exceeded. The driving information between adjacent vehicles is received as traffic information in connection with an intelligent transportation system (ITS) (400). Determination is made whether there is an accident between vehicles or the risk of an accident equivalent to an accident by comparing the driving speed between the front and rear vehicles with the normal average speed of the section. A control server (200) receives autonomous driving information as driving information as a reference for accident determination, ITS traffic information, and autonomous driving information through vehicle-to-everything (V2X) communication of an autonomous driving server connected through an in-vehicle autonomous driving system or a communication network (500), and determines whether an accident has occurred in connection with the driving information. An INDEPENDENT CLAIM is included for a method for preventing secondary vehicle accidents using a system for preventing secondary vehicle accidents.
System for preventing secondary vehicle accidents, used for quickly transmitting information about a traffic accident to a disaster center or driver. The leakage of personal information is prevented by protecting data when collecting personal information, based on blockchain. The drawing shows a block diagram of the vehicle secondary accident prevention system. (Drawing includes non-English language text) 100Navigation system200Control server300Mobile terminal400ITS500Communication network", "instruction": "Please summarize the input" }, { "input": "Lateral Control Mode Decision Method for Truck PlatooningThe present invention relates to a method for determining a platooning lateral control operation mode for trucks, and more particularly to a control method for lateral control of a group of autonomous vehicles, such as heavy trucks, performing platooning according to the operation mode. According to the present invention, when a following vehicle joins a platooning rank, steering of the following vehicle is controlled in a following mode by receiving driving information of the preceding vehicle through a Platooning Control Unit (PCU) and a Platooning Lateral Control System (PLCS) mounted on the following vehicle; a plurality of events are detected by monitoring an operation state of the following vehicle in the Platooning Lateral Control System (PLCS); and at least one of releasing the following mode, leaving the queue, or transferring control to a driver is performed when the plurality of events occur. According to the present invention, there is an advantage in that an effective lateral control mode determination method can be presented according to the occurrence of an event during platooning.|1. When the following vehicle joins the platooning rank, the driving information of the preceding vehicle is transmitted through the Platooning Control Unit (PCU) and Platooning Lateral Control System (PLCS) mounted on the following vehicle. 
receiving the transmitted information and controlling steering of the following vehicle in a following mode;\ndetecting a plurality of events by monitoring an operation state of the following vehicle in the Platooning Lateral Control System (PLCS); and performing at least one of releasing the following mode, leaving the queue, or transferring control to a driver when the plurality of events occur. \n | 2. The method of claim 1, wherein controlling the steering of the following vehicle in the following mode comprises checking a vehicle-to-vehicle (V2V) communication state between the following vehicle and the preceding vehicle, or a state of a lane detection sensor.\n | 3. The method of claim 2, further comprising, when the communication state or the lane detection sensor state is operating normally, generating a route for following the preceding vehicle and maintaining a lane.\n | 4. The method of claim 1, wherein monitoring the operating state of the following vehicle further comprises determining whether the Platooning Lateral Control System (PLCS) of the following vehicle is operating normally.\n | 5. The method of claim 1, wherein the plurality of events include at least one of: a decrease in reliability of lane information of the preceding vehicle or the following vehicle, reception of a lane change request, occurrence of an emergency braking situation, reception of a stop signal from the Platooning Lateral Control System (PLCS), and a cut-in by another vehicle.\n | 6. 
The method of claim 5, further comprising: reconfirming the driving information of the preceding vehicle when the reliability of the lane information decreases or a lane change request is received; and determining whether the reliability is restored or the lane change is completed.\n | 7. The method of claim 5, further comprising maintaining the steering angle until the following vehicle stops when the emergency braking situation occurs.\n | 8. The method of claim 5, comprising maintaining the existing lane for a predetermined time when a stop signal is received from the Platooning Lateral Control System (PLCS).\n | 9. The method of claim 5, further comprising: maintaining the existing lane for a predetermined time when a cut-in by the other vehicle occurs; and determining whether the other vehicle departs within the predetermined time period.\n | 10. The method of claim 9, wherein when the other vehicle departs, the existing steering is maintained as it is, and when the other vehicle does not depart, the existing lane is maintained for a predetermined time.", "output": "The method involves transmitting the driving information of the preceding vehicle through the Platooning Control Unit (PCU) and Platooning Lateral Control System (PLCS) mounted on the following vehicle, receiving the information and controlling steering of the following vehicle, and detecting an event by monitoring an operating state of the following vehicle through sensors mounted on the following vehicle. When the event occurs, at least one of releasing the following mode, leaving the queue, or transferring control to a driver is performed. INDEPENDENT CLAIMS are included for the following: a computer readable recording medium; and an apparatus for determining a lateral control operating mode of a following vehicle. 
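The event-to-action logic of claims 5 through 10 can be sketched as a simple dispatch table. The event names follow claim 5, but the mapping itself is an illustrative assumption about how such a decision stage might be organized, not the patent's implementation.

```python
from enum import Enum, auto

class Event(Enum):
    """Monitored platooning events, as enumerated in claim 5."""
    LANE_INFO_UNRELIABLE = auto()
    LANE_CHANGE_REQUEST = auto()
    EMERGENCY_BRAKING = auto()
    PLCS_STOP_SIGNAL = auto()
    CUT_IN = auto()

class Action(Enum):
    """Responses drawn from claims 6-10 (names are assumptions)."""
    RECONFIRM_LEAD_INFO = auto()   # claim 6: recheck preceding-vehicle data
    HOLD_STEERING = auto()         # claim 7: keep steering angle until stopped
    KEEP_LANE_TIMED = auto()       # claims 8-9: hold the lane for a preset time
    RELEASE_FOLLOWING = auto()     # fallback: release following mode

def decide(event: Event) -> Action:
    """Map a detected event to a lateral-control response."""
    return {
        Event.LANE_INFO_UNRELIABLE: Action.RECONFIRM_LEAD_INFO,
        Event.LANE_CHANGE_REQUEST: Action.RECONFIRM_LEAD_INFO,
        Event.EMERGENCY_BRAKING: Action.HOLD_STEERING,
        Event.PLCS_STOP_SIGNAL: Action.KEEP_LANE_TIMED,
        Event.CUT_IN: Action.KEEP_LANE_TIMED,
    }.get(event, Action.RELEASE_FOLLOWING)
```

A dispatch table keeps each event's response in one place, which mirrors how the dependent claims each attach one response to one event.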
Method for determining a lateral control operation mode of platooning vehicles. The method presents an effective lateral control mode determination according to the occurrence of an event during platooning. The drawing shows a control process of a Platooning Control Unit (PCU) and a Platooning Lateral Control System (PLCS).", "instruction": "Please summarize the input" }, { "input": "Longitudinal queue associated vehicle system under influence of communication time delay and fuzzy control method thereofThe invention claims a longitudinal queue associated vehicle system under the influence of communication time delay and a fuzzy control method thereof. Signals are collected and processed to obtain the speed, acceleration, and position of each vehicle, together with the inter-vehicle error and error change rate. Starting from the one-way strong coupling, the longitudinal queue control problem with the CACC function is described as a discrete correlation system under the influence of communication time delay, and the correlation system model is constructed. A partial decomposition method and the Lyapunov function method are then used to obtain the communication time delay condition that keeps the queue-associated system stable; vehicle state information beyond the communication time delay is ignored, and only vehicle state information within the communication time delay range is used. Finally, a fuzzy PID control algorithm combines the vehicle distance with the communication delay upper bound to design the longitudinal queue cruise control strategy. The proposed control strategy can respond reasonably to the acceleration or deceleration behavior of the front vehicle, reach the expected CACC control performance, and effectively compensate for the influence of the communication delay.|1. 
A longitudinal queue associated vehicle system under influence of communication time delay, wherein it comprises a workshop communication module, an information collecting module, a cooperative decision module, a motion control module, wherein the workshop communication module, through the global positioning system (GPS) and the vehicle (V2V) communication, receiving and sending the state information of the vehicle; information collecting module, collecting the autonomous vehicle and queue vehicle comprises speed, position and acceleration information, calculating the actual distance between the front vehicle and the autonomous vehicle by the collected information, the relative speed of the front vehicle and the autonomous vehicle, the relative acceleration of the front vehicle and the autonomous vehicle and the error change rate; cooperative decision module, the relative vehicle distance and vehicle distance error of the front vehicle and the autonomous vehicle as input, according to the expected vehicle distance and actual vehicle distance error, combining the workshop kinematics model, ignoring the vehicle state information beyond the communication time delay; using the available vehicle state information in the communication time delay range ensuring the stable queue; a motion control module, comprising a control decision module, a fuzzy controller and a PID controller, for controlling the autonomous vehicle to follow the front vehicle at a desired speed and keep the safety distance; the control decision module determines the acceleration and deceleration behaviour of the autonomous vehicle according to the expected vehicle distance and the vehicle distance error; fuzzy controller according to the expected vehicle distance error and expected vehicle distance error change rate, dynamically outputting three PID parameters; PID controller according to the PID parameter output by the fuzzy controller to control the decision layer to determine the motion control 
mode, controlling the autonomous vehicle.\n | 2. The longitudinal queue associated vehicle system under the influence of communication time delay according to claim 1, wherein the relative speed of the front vehicle and the autonomous vehicle provided by the vehicle state collecting module according to the safety state collecting module, communication time delay and associated system characteristic, the design control target is vehicle distance error and speed error.\n | 3. The longitudinal queue associated vehicle system under the influence of communication time delay according to claim 1, wherein the workshop kinematics model is starting from the one-way strong coupling, and the longitudinal queue control problem with CACC function is described as the discrete state space equation under the influence of the communication time delay, constructing the longitudinal queue associated vehicle system under the influence of the communication time delay.\n | 4. The longitudinal queue associated vehicle system under the influence of communication time delay according to claim 1, wherein the fuzzy controller is based on the workshop kinematics model. using the membership function fuzzy quantization to obtain the fuzzy input quantity and output the coefficient value of the PID controller by using the deviation of the expected vehicle distance and the actual vehicle distance and the vehicle distance error change rate.\n | 5. A fuzzy control method of longitudinal queue associated vehicle system under the influence of communication time delay, applied to the longitudinal association system under the influence of communication time delay according to claim 1, wherein it comprises the following steps: a. 
collecting the movement state information of the front vehicle and the autonomous vehicle in the queue, using the sensor and wireless communication technology to obtain the vehicle driving, the braking process comprises position, vehicle speed and acceleration information, and calculating to obtain the speed error of other vehicle, acceleration error and error change rate; b, designing the control target, obtaining the position, vehicle speed and acceleration information, the communication time delay, association characteristic fusion and vehicle distance error is designed as the control target; c, obtaining communication time delay upper bound; d, designing the PID controller.\n | 6. The fuzzy control method of longitudinal queue associated vehicle system under the influence of communication time delay according to claim 5, wherein in the step c, obtaining the communication time delay upper limit comprises the following sub-steps: c-1, establishing a communication time delay influence the longitudinal queue associated system model, starting from the one-way strong coupling characteristic, the longitudinal queue control problem with CACC function is described as the discrete correlation system under the influence of communication time delay, constructing the communication time delay influence the queue vehicle CACC associated system model; c-2, stability analysis, the longitudinal correlation system model established in step c-1 uses partial decomposition method and lyapunov function method to perform stability analysis and obtain the communication time delay ensuring the stable association queue.\n | 7. 
The fuzzy control method of the longitudinal queue associated vehicle system under the influence of communication time delay according to claim 5, wherein the design of the PID controller in step d is specifically as follows: designing the input quantities of the fuzzy controller: membership-function fuzzy quantization is applied to the vehicle distance error and the vehicle distance error change rate to obtain two corresponding fuzzy input quantities; the fuzzy language values of the vehicle distance error and the vehicle distance error change rate are L, M, S and ZO, wherein L represents large, M represents moderate, S represents small, and ZO represents zero; the fuzzy controller performs fuzzy reasoning according to the designed fuzzy control rules and obtains the fuzzy output; combining the vehicle distance control and the communication time delay to control the stability of the whole CACC queue, the PID controller u is designed as: u(k) = KP*e(k) + KI*Σe(k) + KD*(e(k+1) - e(k)), wherein KP is the proportional coefficient, KI is the integral coefficient, and KD is the differential coefficient.\n | 8. The fuzzy control method of the longitudinal queue associated vehicle system under the influence of communication time delay according to claim 7, wherein a Gaussian membership function is adopted for the input quantities, and a triangular membership function is adopted for the output quantities.\n | 9. 
The fuzzy control method of the longitudinal queue associated vehicle system under the influence of communication time delay according to claim 7, wherein the fuzzy control rules comprise: when the vehicle distance error is large, accelerating the response of the rear vehicle while avoiding the excessive control action caused by the start-up of the CACC queue system; when the vehicle distance error is small, adjusting the proportional coefficient and the integral coefficient so that the CACC queue associated system has good steady-state performance, and at the same time adjusting the differential coefficient to avoid system oscillation around the balance point; and when the vehicle distance error is moderate, balancing the response of the CACC queue associated system while ensuring the response speed of the CACC queue system.", "output": "The system has a workshop communication module for receiving and sending state information of a vehicle through global positioning system (GPS) and vehicle-to-vehicle (V2V) communication. An information collecting module collects speed, position and acceleration information of an autonomous vehicle. A cooperative decision module determines acceleration and deceleration behavior of the autonomous vehicle according to the expected vehicle distance and the vehicle distance error. A fuzzy controller dynamically outputs three proportional-integral-derivative (PID) parameters according to the expected vehicle distance error and the expected vehicle distance error change rate. The PID controller controls the decision layer to determine a motion control mode according to the PID parameters output by the fuzzy controller, so as to control the autonomous vehicle. An INDEPENDENT CLAIM is included for a fuzzy control method of a longitudinal queue associated vehicle system under the influence of communication time delay. 
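The gain-scheduled PID of claim 7 can be sketched as below. Two hedges: the claim's derivative term is written over e(k+1), which is non-causal at time k, so this sketch uses the usual discrete form e(k) - e(k-1); and the full L/M/S/ZO fuzzy inference is replaced by a simple piecewise gain schedule standing in for it. All gain and threshold values are illustrative assumptions.

```python
def fuzzy_gains(e: float, de: float) -> tuple[float, float, float]:
    """Stand-in for the fuzzy inference stage: pick (KP, KI, KD) from the
    spacing error magnitude. The error rate `de` is available for a fuller
    rule base but is unused in this minimal sketch."""
    mag = abs(e)
    if mag > 5.0:               # 'L': large spacing error -> strong P action
        return 1.2, 0.01, 0.3
    if mag > 1.0:               # 'M'/'S': moderate error
        return 0.8, 0.05, 0.2
    return 0.4, 0.10, 0.1       # 'ZO': near the desired spacing -> more I

class FuzzyPID:
    """Discrete PID with fuzzy-scheduled gains:
    u(k) = KP*e(k) + KI*sum(e) + KD*(e(k) - e(k-1))."""

    def __init__(self) -> None:
        self.sum_e = 0.0
        self.prev_e = 0.0

    def step(self, e: float) -> float:
        kp, ki, kd = fuzzy_gains(e, e - self.prev_e)
        self.sum_e += e
        u = kp * e + ki * self.sum_e + kd * (e - self.prev_e)
        self.prev_e = e
        return u
```

Scheduling the gains from the spacing error reproduces the rule intent of claim 9: aggressive proportional action for large gaps, stronger integral action near the setpoint.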
Longitudinal queue associated vehicle system under the influence of communication time delay, for longitudinal cooperative adaptive cruise control (CACC) under vehicle-to-vehicle communication. The proposed control strategy responds reasonably to the acceleration or deceleration behavior of the front vehicle, reaches the expected CACC control performance, and effectively compensates for the influence of the communication delay. The drawing shows a flowchart illustrating the fuzzy control method of the longitudinal queue associated vehicle system. (Drawing includes non-English language text)", "instruction": "Please summarize the input" }, { "input": "System and Method for Automated Evaluation of Driving Ability of Self-Driving CarsA system and method for automatic driving ability evaluation of an autonomous driving system are provided. In the autonomous driving capability automatic evaluation system proposed in the present invention, which includes a plurality of sensors, an integrated sensor platform, a battery and a switch, the integrated sensor platform uses a corresponding small processor for each set of sensor data collected through the plurality of sensors. 
Autonomous driving processing unit that performs pre- and post-processing, receiving multiple post-processed sensor data from the autonomous driving processing unit, detecting lanes through sensor fusion, measuring vehicle speed, distance from the vehicle in front, whether emergency braking is activated, a sensor unit that detects objects and measures the relative position of objects in real time, and each scenario based on road traffic laws, including lane keeping scenarios and stability evaluations, lane change scenarios and stability evaluations to evaluate the results measured by the sensor unit It includes an evaluation unit that conducts star stability evaluation and a remote unit that transmits data to remotely monitor through an application to a driver who wants to evaluate the progress of the autonomous driving ability automatic evaluation evaluated by the evaluation unit.|1. In the autonomous driving ability automatic evaluation system including a plurality of sensors, an integrated sensor platform, a battery, and a switch, the integrated sensor platform pre-processes and post-processes sensor data collected through a plurality of sensors through respective corresponding small processors. an autonomous driving processing unit that performs; After receiving multiple sensor data post-processed by the autonomous driving processing unit, through sensor fusion, lane detection, vehicle speed measurement, distance to the vehicle in front, emergency braking operation, object detection, and object relative position are displayed in real time. 
a sensor unit to measure; an evaluation unit that performs stability evaluation for each scenario based on road traffic laws including lane keeping scenarios and stability evaluations, lane change scenarios and stability evaluations to be evaluated for results measured by the sensor unit; and a remote unit for transmitting data to remotely monitor the progress of the automatic evaluation of the autonomous driving ability evaluated by the evaluation unit through an application to a driver who wants to be evaluated, and the sensor unit includes a V2X OBU (Vehicle to Everything On-Board Unit)) The data transmitted from the GNSS (Global Navigation Satellite System) sensor and INS (Inertial Navigation System) sensor are combined using UDP (User Data Protocol) or CAN (Controller Area Network) communication, and the message output from the V2X sensor Among the three (Message Set), BSM (Basic Safety Message), SPAT (Signal Phase and Timing Message), TIM (Traveler Information Message), By combining RSA (Road Side Alert Message), data required for traffic light recognition (SPAT), data required for sensor fusion (BSM), TIM, and RSA are divided based on the distributed processing sensor combination method, and output from the GNSS sensor It receives latitude, longitude, and altitude, configures an environment for outputting own vehicle location and relative vehicle location data based on a distributed processing sensor combination method, and configures an environment for outputting roll, pitch, and yaw output from the INS sensor. 
(yaw), receives speed data and configures an environment for measuring the position and condition of its own vehicle based on a distributed processing sensor combination method, measuring speed, distance to other vehicle, emergency braking operation, object detection and object Performs sensor combination using an algorithm required to derive relative vehicle position data of, and the autonomous driving processing unit, Distributed processing method is used to perform pre-processing and post-processing through respective small processors that process multiple sensor data in order to generate data necessary for sensor fusion of distributed processing method for sensor data collected through multiple sensors. Autonomous driving capability automatic evaluation system.\n | 2. delete\n | 3. The autonomous driving capability of claim 1, wherein the evaluation unit receives an evaluation preparation signal from a driver who wants to be evaluated remotely through an application, and then based on a New Car Assessment Program (NCAP) stored in a road traffic law database and road traffic laws and regulations. The evaluation score is calculated using the results measured by the sensor unit according to the evaluation algorithm for each scenario for automatic evaluation, and the evaluation items of the automatic evaluation of autonomous driving ability include the lane keeping scenario and stability evaluation, and the lane change scenario and stability evaluation items. Autonomous driving ability evaluation system including.\n | 4. 
The driver of claim 1, wherein the remote unit remotely controls on/off of the autonomous driving ability automatic evaluation system, receives an evaluation item to be evaluated from the driver through an application in a TCP/IP method, and is evaluated An autonomous driving ability automatic evaluation system that transmits the evaluation preparation signal received remotely through the application from the evaluator to the evaluation unit, and displays the evaluation progress of the current vehicle and the evaluation score calculated through the evaluation unit together with the evaluation start notification to the driver.\n | 5. In the automatic evaluation method for autonomous driving capability of an automatic driving capability automatic evaluation system including a plurality of sensors, an integrated sensor platform, a battery, and a switch, the autonomous driving processing unit of the integrated sensor platform receives sensor data collected through a plurality of sensors, respectively. Performing pre-processing and post-processing through a corresponding small processor of; The sensor unit of the integrated sensor platform receives multiple sensor data post-processed by the autonomous driving processing unit and uses sensor fusion to detect lanes, measure vehicle speed, distance from vehicle in front, emergency braking operation, object detection, and object detection. 
Measuring the relative position of in real time; Performing stability evaluation for each scenario based on road traffic laws including lane keeping scenario and stability evaluation, lane change scenario and stability evaluation to be evaluated by the evaluation unit of the integrated sensor platform for the results measured by the sensor unit; and transmitting data so that the remote unit of the integrated sensor platform remotely monitors the progress of the automatic evaluation of autonomous driving capability evaluated by the evaluation unit through an application to a driver who wants to be evaluated, and the sensor unit of the integrated sensor platform performs autonomous driving. Receives multiple sensor data post-processed by the processing unit and measures lane detection, vehicle speed measurement, distance to the vehicle in front, emergency braking operation, object detection, and object relative position through sensor fusion in real time. In the step, the data transmitted from the V2X OBU (Vehicle to Everything On-Board Unit) sensor, GNSS (Global Navigation Satellite System) sensor, and INS (Inertial Navigation System) sensor is transmitted through UDP (User Data Protocol) or CAN (Controller Area Network) Combined using communication, Data required for traffic light recognition by combining BSM (Basic Safety Message), SPAT (Signal Phase and Timing Message), TIM (Traveler Information Message), and RSA (Road Side Alert Message) among message sets output from V2X sensors (SPAT), data required for sensor fusion (BSM), TIM, and RSA are divided based on the distributed processing sensor combination method, and the latitude, longitude, and altitude output from the GNSS sensor are received and based on the distributed processing sensor combination method to configure an environment for outputting the position data of own vehicle and the position of the other vehicle, receive the roll, pitch, yaw, and speed data output from the INS sensor, and use the 
distributed processing sensor combination method. Based on this, it configures an environment for measuring the location and status of its own vehicle, and measures speed, distance from the other vehicle, emergency braking operation, The self-driving processing unit of the integrated sensor platform, which performs sensor coupling using an algorithm necessary to detect an object and derive relative vehicle position data of the object, through each corresponding small processor for sensor data collected through a plurality of sensors. The step of performing pre-processing and post-processing includes pre-processing and post-processing through respective small processors that process a plurality of sensor data in order to generate data necessary for sensor fusion of a distributed processing method with respect to sensor data collected through a plurality of sensors. An automatic evaluation method for autonomous driving capability using a distributed processing method that performs post-processing.\n | 6. delete\n | 7. The method of claim 5, wherein the evaluation unit of the integrated sensor platform performs stability evaluation for each scenario based on road traffic laws, including lane keeping scenarios and stability evaluations, lane change scenarios and stability evaluations to be evaluated with respect to the results measured by the sensor unit. The step to be performed is to receive the evaluation preparation signal remotely through the application from the driver who wants to be evaluated, and then to automatically evaluate the autonomous driving ability based on the NCAP (New Car Assessment Program) and road traffic laws stored in the road traffic law database. 
The evaluation score is calculated using the results measured by the sensor unit according to the evaluation algorithm for each scenario, and the evaluation items for the automatic evaluation of autonomous driving ability are autonomous driving including lane keeping scenario and stability evaluation, lane change scenario and stability evaluation items. How to automatically evaluate your abilities.\n | 8. The autonomous driving ability automatic evaluation system according to claim 5, wherein the remote unit of the integrated sensor platform transmits data to remotely monitor through an application to a driver who wants to evaluate the progress of the autonomous driving ability automatic evaluation evaluated by the evaluation unit Remotely control the on/off of the vehicle, receive the evaluation item to be evaluated from the driver through the application in TCP/IP method, and receive the evaluation preparation signal remotely input through the application from the driver who wants to be evaluated to the evaluation unit When transmitted, an automatic evaluation method for autonomous driving capability that displays the evaluation progress of the current vehicle and the evaluation score calculated by the evaluation unit along with an evaluation start notification to the driver.", "output": "The system has an integrated sensor platform (100) for pre-processing and post-processing sensor data collected through multiple sensors through a small processor. An autonomous driving processing unit (110) receives the post-processed sensor data from the system. An evaluation unit (130) performs stability evaluation for each scenario based on road traffic laws. A remote unit (140) transmits data to a driver to receive an evaluation of the progress of automatic evaluation of autonomous driving ability evaluated by evaluation unit. An INDEPENDENT CLAIM is also included for a method for automatic evaluation of autonomous driving capability of self-driving car. 
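The message-routing step this entry describes — splitting the V2X OBU's message set so SPAT feeds traffic-light recognition, BSM feeds sensor fusion, and TIM/RSA are handled separately — can be sketched as a partitioning function. The route names and message representation are illustrative assumptions; only the message types (BSM, SPAT, TIM, RSA) come from the source.

```python
# Consumer subsystems per message type, following the split described in the
# claims. The destination names are assumptions for illustration.
ROUTES = {
    "SPAT": "traffic_light_recognition",  # Signal Phase and Timing Message
    "BSM": "sensor_fusion",               # Basic Safety Message
    "TIM": "traveler_information",        # Traveler Information Message
    "RSA": "road_side_alert",             # Road Side Alert Message
}

def route_v2x(messages: list[dict]) -> dict[str, list[dict]]:
    """Partition incoming V2X messages by the subsystem that consumes them;
    messages of unknown type are dropped."""
    buckets: dict[str, list[dict]] = {dest: [] for dest in ROUTES.values()}
    for msg in messages:
        dest = ROUTES.get(msg.get("type", ""))
        if dest is not None:
            buckets[dest].append(msg)
    return buckets
```

Partitioning by type up front matches the distributed-processing idea in the claims: each small processor then sees only the message stream it needs.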
The system is useful for automatically evaluating the autonomous driving capability of a self-driving car. The driver can evaluate the safety and reliability of the autonomous driving system without restriction of location. The development of autonomous driving systems that comply with road traffic regulations and provisional driving permit regulations is promoted. The drawing shows a diagram illustrating the configuration of an integrated sensor platform of an automatic driving ability evaluation system of an autonomous driving system (Drawing includes non-English language text).100Integrated sensor platform110Autonomous driving processing unit130Evaluation unit140Remote unit", "instruction": "Please summarize the input" }, { "input": "A following control method of automatic driving vehicle, system and terminal and storage mediumThe invention claims a following control method of an automatic driving vehicle, together with a system, terminal and storage medium. The following control method establishes a communication connection between the pilot vehicle and the following vehicle, and sends a request to the interactive terminal to obtain user authorization, so as to generate a pilot signal based on the pilot vehicle information transmitted to the following vehicle and control the following vehicle to follow the driving state of the pilot vehicle according to the pilot signal. The running state control of the pilot vehicle comprises the longitudinal control and the transverse control of the vehicle. In the following control method, real-time information is shared and the sensing ability of member vehicles with respect to surrounding vehicles is expanded, so that queue control accuracy and traffic efficiency are improved. At the same time, the vehicle has a queue driving function, which improves aerodynamic performance, effectively reduces the wind resistance area of the following vehicle, and reduces the vehicle power output. 
shortening the distance between the vehicle and the vehicle on the basis of ensuring the safety, reducing the speed fluctuation, which can effectively improve the traffic efficiency and reduce the energy consumption.|1. A following control method of automatic driving vehicle, wherein it is used for controlling the pilot vehicle in a preset fleet to guide and follow the vehicle, the following control method comprises the following steps: S1, establishing the communication connection of the pilot vehicle and the following vehicle, the establishing method comprises the following steps: S11, determining that the pilot vehicle is in the V2V communication range; S12, obtaining the pilot command of the pilot and controlling the following vehicle to enter the following mode; S13, judging whether the following vehicle is based on predetermined information to lock the pilot vehicle; is, executing S14; S14, based on the position information of the vehicle near the vehicle identifying the potential vehicle list, and receiving response of V2V information from the potential vehicle list, S15, judging whether the navigation vehicle is authorized and the communication of the following vehicle is successful; if so, executing S16; S16, when the following vehicle is in the normal vehicle tracking distance, judging whether the following vehicle is in the V2V communication range; if so, controlling the pilot vehicle to directly transmit the pilot vehicle information to the following vehicle; otherwise, using the redundant V2V signal to transmit the pilot vehicle information to the following vehicle; S2, sending a request to the interactive end to obtain the user authorization; S3, generating a pilot signal based on the pilot vehicle information transmitted to the following vehicle, so as to control the following vehicle to follow the driving state of the pilot vehicle according to the pilot signal; the driving state control of the pilot vehicle comprises longitudinal control and 
transverse control of the vehicle; wherein the longitudinal control logic of the pilot vehicle is: collecting in real time the speed deviation between the actual speed of the pilot vehicle and a predetermined speed, and calculating, according to the speed deviation, the accelerator pedal control quantity, gear shifting control quantity and brake control quantity of the pilot vehicle, so as to use a switching logic to select one of the accelerator pedal control, the gear shifting control and the brake control to compensate the speed deviation; the transverse control logic of the pilot vehicle is: collecting in real time the steering deviation between the expected steering and the actual steering of the pilot vehicle, and calculating, according to the steering deviation, the steering control quantity and gear shifting control quantity, so as to use the switching logic to select one of the gear shifting control and the steering control to compensate the steering deviation; wherein the sum of the current steering and the steering pre-estimation of the pilot vehicle is used as the steering feedback quantity, and the difference between the expected steering and the steering feedback quantity is used as the input steering deviation.\n | 2. The following control method of an automatic driving vehicle according to claim 1, wherein the running state control of the following vehicle comprises the longitudinal following control, the transverse following control and the signal lamp following control of the vehicle; wherein the signal lamp following control logic of the following vehicle is: obtaining the predetermined light state according to the guide signal, and calculating the light state deviation between the current light state of the following vehicle and the predetermined light state, so as to obtain the lamp signal control information according to the light state deviation and control the current light state of the following vehicle.\n | 3.
The following control method of the automatic driving vehicle according to claim 2, wherein the control objects of the signal lamp following control comprise a steering lamp, a brake lamp, an energy recovery lamp and a warning lamp of the vehicle.\n | 4. The following control method of an automatic driving vehicle according to claim 1, wherein in S13, when the following vehicle has not locked the pilot vehicle based on the predetermined information, executing S21; S21, controlling the following vehicle to continuously receive V2V information from the nearby vehicles.\n | 5. The following control method of an automatic driving vehicle according to claim 1, wherein in S15, when the pilot vehicle is not authorized or the communication with the following vehicle is unsuccessful, returning to S14 to re-confirm the V2V communication signals of said potential pilot vehicle list.\n | 6. A pilot-following vehicle V2V cooperative guiding system, wherein it uses the following control method of the automatic driving vehicle according to any one of claims 1 to 5; the pilot-following vehicle V2V cooperative guiding system comprises: a pilot vehicle cooperative guide module, which is set in the pilot vehicle, comprising a pilot V2V communication interface, a pilot navigation control module and a pilot information analysis processing module; the pilot V2V communication interface sends the V2V information of the pilot vehicle and receives the V2V information of the following vehicle; the pilot navigation control module is used for controlling the driving state of the pilot vehicle; the pilot information analysis processing module is used for analyzing the V2V information received through the pilot V2V communication interface, so as to identify each nearby vehicle and/or identify the nearby vehicle associated with the specific passenger identification data; and a following vehicle cooperative guide module, which is set in the following vehicle, comprising a following guide control module,
a following V2V communication interface and a following information analysis processing module; the following guide control module is used for controlling the running state of the following vehicle; the following V2V communication interface is used for sending the V2V information of the following vehicle and receiving the V2V information of the pilot vehicle; the following information analysis processing module is used for analyzing the V2V information received by the following V2V communication interface, so as to identify the pilot vehicle.\n | 7. The pilot-following vehicle V2V cooperative guiding system according to claim 6, wherein the pilot vehicle cooperative guide module further comprises a pilot redundant communication interface; the following vehicle cooperative guide module further comprises a following redundant communication interface; wherein the pilot redundant communication interface and the following redundant communication interface correspond to each other for transmitting and receiving V2V information between each other, the V2V information comprising information using the DSRC communication type.\n | 8. The pilot-following vehicle V2V cooperative guiding system according to claim 7, wherein the pilot redundant communication interface and the following redundant communication interface adopt one or more of the following communication interface types: mobile phone network, Wi-Fi network, ZigBee, Z-Wave communication, vehicle-to-infrastructure communication, vehicle-to-pedestrian communication, vehicle-to-device communication and vehicle-to-network communication.\n | 9. A computer terminal, comprising a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor, when executing the program, implements the following control method of the automatic driving vehicle according to any one of claims 1 to 5.\n | 10.
A computer readable storage medium, on which a computer program is stored, wherein when the program is executed by a processor, the following control method of the automatic driving vehicle according to any one of claims 1 to 5 is realized.", "output": "The method involves establishing (S1) a communication connection of a pilot vehicle and a following vehicle. A request is sent (S2) to the interactive end to obtain the user authorization. A pilot signal is generated (S3) based on the pilot vehicle information transmitted to the following vehicle, so as to control the following vehicle to follow the driving state of the pilot vehicle according to the pilot signal. The steering deviation between the expected steering and the actual steering of the pilot vehicle is calculated in real time, and the steering control quantity and gear shifting control quantity are obtained according to the steering deviation, so as to use the switching logic to select one of the gear shifting control and the steering control to compensate the steering deviation. The difference between the expected steering and the steering feedback quantity is used as the input steering deviation. INDEPENDENT CLAIMS are included for: 1. a pilot-following vehicle V2V cooperative guiding system; 2. a computer terminal; and 3. a computer readable storage medium storing a computer program for performing the process for controlling an automatic driving vehicle. Following control method for controlling an automatic driving vehicle. The sensing ability of the member vehicles to the surrounding vehicles is expanded, so that the queue control accuracy and the traffic efficiency are improved.
The vehicle has a queue driving function, which improves the aerodynamic performance, effectively reduces the wind resistance area of the following vehicle, reduces the vehicle power output, shortens the inter-vehicle distance on the basis of ensuring safety, and reduces the speed fluctuation, which can effectively improve traffic efficiency and reduce the energy consumption. The drawing shows a flow diagram illustrating the process for controlling an automatic driving vehicle. (Drawing includes non-English language text)S1Step for establishing communication connection of pilot vehicle and following vehicle S2Step for sending request to interactive end to obtain user authorization S3Step for generating pilot signal based on pilot vehicle information ", "instruction": "Please summarize the input" }, { "input": "PHEV hybrid vehicle group optimization control method of queue management and adaptive cruise controlThe invention claims a PHEV hybrid vehicle group optimization control method of queue management and self-adaptive cruise control, wherein the control method comprises the following steps: step one, clustering the driving data of multiple working conditions by multiple characteristic parameters to determine multiple driving styles; step two, determining the vehicle lane-changing safety area according to the obtained vehicle state information and the surrounding environment information; step three, in the self-adaptive cruise control, based on safety and comfort, restraining the vehicle distance, vehicle speed and acceleration, and reasonably recovering the braking energy when braking; when approaching the crossroad, recombining the queue again to make the vehicle group pass through in order; step four, based on the sample data in the training environment of the Soft Actor-Critic reinforcement learning algorithm, continuously and iteratively updating according to the set loss function, finally obtaining the optimal strategy capable of making the vehicle
group rearrange the queue according to the driving styles of different drivers on a road of not more than three lanes; the invention can rationally plan the vehicle queue according to the driving intention of the driver so that the vehicle queue can drive efficiently.|1. A PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control, wherein the mixed vehicle group is a vehicle group composed of intelligent network vehicles and conventional manually driven vehicles; step one, clustering the driving data of multiple working conditions by multiple characteristic parameters to determine multiple driving styles; step two, determining the vehicle lane-changing safety area according to the obtained vehicle state information and the surrounding environment information; step three, in the self-adaptive cruise control, based on safety and comfort, restraining the vehicle distance, vehicle speed and acceleration, and reasonably recovering the braking energy when braking; when approaching the crossroad, recombining the queue again to make the vehicle group pass through in order; step four, based on the sample data in the Soft Actor-Critic reinforcement learning algorithm training environment, continuously and iteratively updating according to the set loss function, finally obtaining the optimal strategy capable of making the vehicle group rearrange the queue according to the driving styles of different drivers on a road of not more than three lanes.\n | 2.
The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 1, wherein the vehicle at the head of the vehicle queue of the vehicle group is an intelligent network vehicle, and the queue length is limited to not more than 8 vehicles; in step one, the specific classification method of the different driving styles is to reduce the dimension of the characteristic parameters from the historical data of a large number of manually driven vehicles through the principal component analysis method, and then obtain three classified driving styles by K-means algorithm clustering, the different driving styles being embodied in the following characteristic parameters: the average longitudinal vehicle speed, the maximum longitudinal vehicle speed vmax, the minimum longitudinal vehicle speed vmin, the longitudinal vehicle speed standard deviation σv, the longitudinal acceleration average value, the maximum longitudinal acceleration axmax, the minimum longitudinal acceleration axmin, the longitudinal acceleration standard deviation σx, the transverse acceleration average value, the maximum transverse acceleration aymax, the minimum transverse acceleration aymin, the transverse acceleration standard deviation σy, the vehicle head time distance (time headway) THW, the collision time parameter TTC, and the minimum vehicle head distance DHWmin; the standard deviation σv of the longitudinal vehicle speed is calculated from the sampled longitudinal speeds in the standard way; the vehicle head time interval THW is the time taken from the vehicle head of the main vehicle to the vehicle tail of the front vehicle at the current vehicle speed; the collision time parameter TTC is the time needed for a collision between the main vehicle and the front vehicle in the current state, and the calculation formula is as follows: vrel = vp - vf, TTC = drel/vrel,
wherein drel is the relative distance between the two vehicles, vp is the speed of the main vehicle, and vf is the speed of the front vehicle; the vehicle head distance is the distance between the vehicle head of the main vehicle and the vehicle head of the front vehicle in the same lane; the larger its value, the larger the distance between the two vehicles and the smaller the possibility of a collision accident between them; conversely, the smaller its value, the larger the accident possibility; the smaller the vehicle head distance, the more aggressive the driving of the driver, so the minimum vehicle head distance is selected as the characteristic parameter index of the driving style.\n | 3. The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 1, wherein in step two, the driving road of the vehicle queue is three lanes, the vehicle group recombination process relates to the lane changing operation, the network-connected vehicle obtains the surrounding vehicle and environment information through V2V communication and vehicle-mounted sensors, the information comprising vehicle speed, vehicle body length, torque, power, vehicle distance, vehicle position information and intersection signal lamp phase; the method for detecting the lane-changing safety area used in the lane-changing operation is as follows: the method is based on vehicle kinematics; when the vehicle is at a certain position, the vertex coordinates [xp1(t), yp1(t)] of the right front end are expressed as: wherein vp(t) and θp(t) are respectively the speed and yaw angle of the main vehicle, tm is the initial time point of the main vehicle lane changing, and tn is the final time point; similarly, the coordinates of the left front vertex [xp2(t), yp2(t)], the left rear vertex [xp3(t), yp3(t)] and the right rear vertex [xp4(t), yp4(t)]
of the main vehicle at time t are expressed in the formula, where a is the length of the vehicle and b is the width of the vehicle; in the lane changing process, the main vehicle analyzes the reasonable lane changing safety area according to the condition that it does not collide with the surrounding vehicles; assuming that the main vehicle changes to the left at a certain time point in the future, the collision condition with the front vehicle is that the speed of the main vehicle is greater than that of the front vehicle, the vehicle distance gradually shortens, and a collision between the right front vertex of the main vehicle and the left rear vertex of the front vehicle will occur; the collision point is set as S1, and at the collision time point the coordinate of the collision point S1 is represented in the formula, where vf(t) is the speed of the front vehicle of the current lane at time t and D1 is the distance between the main vehicle and the front vehicle of the lane; if there is a collision with the rear vehicle of the lane-changing target lane, the speed of the main vehicle is less than that of the rear vehicle and the distance between the two vehicles decreases with time; in this case, when changing to the target lane, the left vertex of the main vehicle collides with the rear vehicle of the target lane; setting the collision point as S2 and, at the collision time point, according to the vehicle structure size and vehicle kinematics theory, the coordinate of the collision point S2 is expressed in the formula, where vr(t) is the speed of the vehicle behind in the target lane at time t, D1 is the relative distance between the autonomous vehicle and the vehicle behind in the target lane, and l is the width of the lane; according to the coordinates of the collision points S1 and S2, the lane changing safety domain is determined to avoid collision; the vehicle is changed to the right
side in the same manner; the intelligent network vehicle judges whether the safety domain satisfies the lane changing condition according to the detected surrounding vehicle information.\n | 4. The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 1, wherein in step three, each intelligent network vehicle is taken as an agent, the n network vehicles in the vehicle group are regarded as n agents, the number n of network vehicles in the vehicle group is limited to the range allowed by the computing power, and the n network vehicles are controlled by n parallel agents to realize interaction; the n agents share the same neural network and parameters; through the parameter sharing structure of the neural network algorithm, an improvement in the driving state of any network-connected vehicle contributes to the reward gain of the vehicle group; the intelligent network vehicle in the vehicle queue interacts with the vehicles in the adjacent vehicle queue, and at the same time the intelligent network vehicles in the vehicle queue and the manually driven vehicles also maintain interactive cooperation, the interactive cooperation method comprising: Method A, on an ordinary lane, the vehicle queue keeps a reasonable inter-vehicle distance through adaptive cruise with regenerative braking cooperative control, that is, two consecutive vehicles must continuously keep a safe longitudinal gap; the deviation from the safe distance, that is, the distance error, is kept as small as possible to reduce the collision risk and to exploit the low fuel consumption and high traffic throughput advantages of the vehicle queue; to be compatible with the randomness of the driving of ordinary vehicles, a larger distance is needed between an intelligent network vehicle and an ordinary vehicle; when the vehicle brakes, the motor recovers part of the braking
energy; Method B, when the vehicles approach an intersection with a signal lamp, the vehicle queue is split and recombined to reduce the energy consumption and driving time, so that part of the queue vehicles pass through the intersection in order before the green light signal ends, and the remaining vehicles wait before the stop line.\n | 5. The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 4, wherein in step four, the Soft Actor-Critic reinforcement learning algorithm is the SAC algorithm, an off-policy, model-free deep reinforcement learning algorithm combining maximum entropy learning with the Actor-Critic framework; the learning content of the SAC reinforcement learning algorithm comprises a state s, an action a, a reward r and an environment model p; the state comprises the fuel consumption of the vehicle, the battery state of charge, speed, acceleration, yaw angle and vehicle distance; the action comprises torque and steering angle; the reward comprises fuel consumption, driving time, comfort and the adaptive cruise cost function; in step four, the SAC algorithm trains and learns on the sample data from the environment and continuously updates and optimizes to finally obtain the optimal strategy, so that the intelligent network vehicles in the mixed vehicle group can be distributed in different lanes according to the driving styles of the drivers, and the mixed vehicles in the same lane form queues of different lengths; the driving styles of different vehicles are classified into aggressive, stable and cautious types; when the vehicle queue is running on the road and different vehicles are distributed among lanes, the aggressive-style vehicles tend to be arranged on the leftmost lane, the stable vehicles tend to be on the middle lane, and the cautious vehicles on the rightmost lane; the gain degree of the vehicle group is thereby determined; the SAC algorithm adjusts the final distribution
result of the vehicle running lanes according to the gain degree of the vehicle group.\n | 6. The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 5, wherein the SAC algorithm is composed of one actor neural network and four critic neural networks; the input of the actor neural network is the state and the output is the action probability distribution parameter P(x); the four critic neural networks are divided into the state value estimation V critic and V critic target networks and the action-state value estimation Q1 and Q2 critic neural networks; the input of the V critic neural network is the state and its output is v(s), representing the estimation of the state value; the output of the Q critic neural network is q(s, a), representing the estimation of the value of the action-state pair; the n agents share the same neural network and parameters; through the parameter sharing structure of the neural network algorithm, an improvement in the driving state of any network-connected vehicle contributes to the reward gain of the vehicle group; in the algorithm, entropy is defined as H(P) = E_{x~P(x)}[-log P(x)], wherein x follows the probability density function P(x) distribution; the introduction of the maximum entropy makes the output actions more dispersed, which avoids excessive concentration of the output actions, so as to improve the exploration ability of the algorithm and the learning ability and stability on new tasks; the optimal policy in the SAC algorithm framework is expressed as π* = argmax_π Σt E[r(st, at) + αH(π(·|st))], wherein π represents the strategy adopted by the agent, a is the action, s represents the state, r represents the reward, and α is the temperature parameter determining the relative importance of the entropy relative to the reward, so as to ensure the randomness of the optimal strategy; the state space S of the SAC is defined by its elements, wherein the first element is the driving style, soc is the battery state of charge, vp is the
vehicle speed, ap is the vehicle acceleration, tdri is the driving time, θ is the yaw angle, and ddes is the distance from the front vehicle; the action space A is defined as: A = (Tp, δp) (formula 13); wherein Tp is the torque of the vehicle and δp is the steering wheel angle of the vehicle; the reward function is defined as: R = w1·mfuel·cfuel + w2·Pbatt·celec + w3·(tdri - tref) + w4·Prec + w5·Jmin (formula 14); w1, w2, w3, w4 and w5 are proportion coefficients, mfuel represents the fuel consumption of the current intelligent network-connected vehicle, cfuel is the fuel price, Pbatt is the motor power, celec is the price of electricity, tref is the reference driving time, Prec is the brake energy recovery power, and Jmin is the adaptive cruise comprehensive value function; the vehicle driving action is independently executed by each network vehicle, and the corresponding reward value is optimized by collecting the control experience of the network vehicles into a centralized replay buffer; for a specific state st and action at, the soft value function Qsoft(st, at) of the algorithm is expressed as in formula 15, wherein γ ∈ [0, 1] is a scale factor; in order to avoid the overestimation when the Q value is maximized and the further overestimation when the target is calculated using the target network, the SAC algorithm introduces two online networks Q1 and Q2 with their respective parameters, and two target networks V and V target with their respective parameters,
and selecting the minimum function value output by the target networks as the target value of the target framework; the soft value network parameters are updated by minimizing the loss function; in the stochastic policy, the strategy is expressed by a Gaussian distribution, that is, the state is mapped by the parameters into the mean value and variance of a Gaussian distribution, and the action is obtained by sampling from the Gaussian distribution; if the state st is used as the input, a Gaussian distribution with mean μ(st) and standard deviation σ(st) is output; then the re-parameterization technique is used to obtain the action at = μ(st) + σ(st)·εt, wherein εt is the noise signal sampled from the standard normal distribution; the relationship between the policy function and the soft value function is established, and the policy network parameters are updated by minimizing the Kullback-Leibler divergence; the smaller the Kullback-Leibler divergence, the smaller the difference between the rewards corresponding to the output actions and the better the convergence of the strategy; in the update rule of the policy network, Z(st) is a distribution function for normalizing the distribution; finally, the policy network parameters are updated according to the gradient descent method; the adjustment of the temperature coefficient is important to the training effect of the SAC algorithm; the optimal temperature coefficient differs with the reinforcement learning task and the training period, so an automatic temperature coefficient adjusting mechanism is used; under this mechanism, a constrained optimization problem is constructed, and the optimal temperature
coefficient of each step is obtained by minimizing the objective function, wherein H0 is a predefined minimum policy entropy threshold.\n | 7. The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 5, wherein in step four, the SAC reinforcement learning algorithm trains and learns on the data of vehicle group queue recombination, adaptive cruise and efficient crossroad passage, so as to obtain the optimal control strategy; the SAC considers the states of the network vehicles in the vehicle group and finds the optimal adaptive cruise control strategy, which is fed back to the torque and steering angle of each network vehicle corresponding to the adaptive cruise control, controlling the vehicle running track; in the adaptive cruise control of step three, the needed vehicle distance is influenced by the driving style of the driver, the road commuting efficiency and the vehicle safety; considering the uncertainty of the driving intention of manually driven vehicles, the distance between a networked vehicle and a manually driven vehicle is made larger than the distance between two networked vehicles; if the vehicle distance is too narrow, the commuting efficiency is improved, but the anxiety of the driver may cause a collision accident; on the contrary, a larger vehicle distance guarantees the safety of the vehicle, but the road commuting efficiency deteriorates and vehicles from adjacent lanes can easily cut in; the constant time headway (CTH) is used for the vehicle spacing algorithm, as shown below: ddes = τh·vh + d0 (formula 22); wherein τh is the nominal time headway and d0 is the safe standstill distance; the following constraint formulas hold for vehicle-following safety: dmin < d < dmax, Δd = d - ddes, Δvmin < Δv < Δvmax, Δv = vp - vf, wherein d is the
actual vehicle distance between the autonomous vehicle and the front vehicle, and dmin and dmax are the minimum and maximum vehicle distances; Δv is the speed difference between the autonomous vehicle and the front vehicle, and Δvmin and Δvmax are the minimum and maximum speed differences; the comfort constraint bounds the acceleration difference Δa = ap - af, wherein af is the acceleration of the front vehicle; the adaptive cruise comprehensive value function is: Jmin = w6·Δd² + w7·Δv² + w8·Δa² (formula 23).\n | 8. The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 5, wherein when the brake force distribution is limited according to the ECE rule, the following brake force distribution strategy is adopted: when the braking strength z is less than z1, the braking force is provided only by the front axle; when the braking strength is between z1 and z2, the braking forces of the front and rear axles are distributed along the line stipulated by the ECE regulation; when the braking strength is between z2 and z3, the braking force of the front axle is unchanged and the braking force of the rear axle is increased; when the braking strength exceeds z3, the motor braking is stopped and the braking forces of the front and rear axles are distributed along the β line; in the whole braking process, if the motor braking force is insufficient, the hydraulic braking force compensates the loss of the total braking force; in the brake force distribution formula and the boundaries of z, Fbf represents the front axle braking force, Fbr is the rear axle braking force, Fb is the total required braking force, L is the total wheelbase, k is the rear wheelbase, hg is the height of the mass centre, Tbmax is the maximum motor braking torque, β is the braking force distribution coefficient, θ is the correction coefficient of the rotating mass,
rw is the radius of the wheel, i_t is the vehicle transmission ratio, and η is the transmission efficiency.\n | 9. The PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control according to claim 5, wherein in step four, when the vehicle group approaches an intersection with a signal lamp, the optimal strategy adjusts the driving torque, steering angle and braking force of each networked vehicle according to the traffic light signal timing, performing queue recombination and queue length planning, so that part of the vehicles form a queue to pass through the intersection during the green light period while the remaining vehicles wait before the stop line, reducing the energy consumption of the whole vehicle group and realizing better economy and passing efficiency.", "output": "The method involves clustering the driving data of multiple working conditions by multiple characteristic parameters to determine multiple driving styles. The vehicle lane-changing safety area is determined according to the obtained vehicle state information and the surrounding environment information. The vehicle distance, vehicle speed and acceleration are restrained in the self-adaptive cruise control based on safety and comfort. The braking energy is recovered during braking. The queue is recombined again to make the vehicle group pass through in order when it is close to the crossroad. The optimal strategy capable of making the vehicle group rearrange the queue is obtained according to the driving styles of different drivers on a road of not more than three lanes. PHEV hybrid vehicle group optimization control method for queue management and self-adaptive cruise control. Can also be used in hybrid power vehicles and electric vehicles. The method enables rationally planning the vehicle queue according to the driving intention of the driver, so that the vehicle queue can drive efficiently.
The drawing shows a flow diagram of a PHEV hybrid vehicle group optimization control method for queue management and adaptive cruise control. (Drawing includes non-English language text)", "instruction": "Please summarize the input" }, { "input": "DEVICE AND METHOD FOR SELF-AUTOMATED PARKING LOT FOR AUTONOMOUS VEHICLES BASED ON VEHICULAR NETWORKING: The present disclosure relates to a device and a method for self-automated parking lots for autonomous vehicles based on vehicular networking, advantageous in reducing parking movements and space. A device for a self-automated parking lot for autonomous vehicles based on vehicular networking is described, comprising: a vehicle electronic module for receiving, executing and reporting vehicle movements, and a parking lot controller for managing and coordinating a group of vehicles in parking and unparking maneuvers, the vehicle module and controller comprising a vehicular ad hoc networking communication system. A method is also described, comprising moving autonomously in platoon one or more rows of already parked vehicles in order to make available a parking space for a vehicle arriving at the parking space; and moving autonomously in platoon one or more rows of parked vehicles in order to make a parked vehicle able to exit the parking space.|1. 
A device for self-automated parking lot for autonomous vehicles based on vehicular networking, comprising: \na parking lot controller for managing and coordinating a group of vehicles in parking and unparking maneuvers in said parking lot; \neach of said vehicles comprising a vehicle electronic module for receiving, executing and reporting vehicle movements, \nwherein said vehicle movements are sent by, and reported to, the parking lot controller, \nthe parking lot controller comprising a vehicular networking communication system for communicating with the communication system of the vehicle module, \nwherein the parking lot controller is configured for: \nmoving autonomously in platoon one or more rows of already parked vehicles in order to make available a parking space for a vehicle arriving to the parking space; and \nmoving autonomously in platoon one or more rows of parked vehicles in order to make a parked vehicle able to exit the parking space. \n | 2. The device according to claim 1, wherein said vehicular communication system comprises a dedicated short-range communication protocol.\n | 3. The device according to claim 1, wherein said vehicular communication system is a mobile communications system.\n | 4. The device according to claim 1, wherein said vehicular communicating is a vehicle-to-infrastructure communication system.\n | 5. The device according to claim 1, wherein said controller is further configured for: \nmanaging parking infrastructure access based on space availability; \nmanaging vehicle movements upon entering parking infrastructure until the designated parking space is reached; \ncoordinating vehicle or vehicles movements to allow enter or exit of vehicle or vehicles in the parking area; and \nusing a communication module for sending data describing said vehicle movements. \n | 6. 
The device according to claim 5, wherein said parking lot controller is configured for also performing as vehicle module, when the parking lot controller functions are assumed by an elected vehicle where this vehicle module is placed.\n | 7. The device according to claim 1, wherein said vehicle module is configured for transferring said parking lot controller functions to another vehicle module just before the exit of the parking lot of the controller.\n | 8. The device according to claim 1, further comprising a positioning system for positioning the vehicle, a user interface for receiving and displaying user interactions, a connection to the vehicle actuators, computer readable memory and a computer processor.\n | 9. The device according to claim 1, wherein said parking lot controller is a local or remote server.\n | 10. The device according to the claim 9, further comprising a user interface for receiving and displaying user interactions, computer readable memory and a computer processor.\n | 11. A method for operating a self-automated parking lot for autonomous vehicles based on vehicular networking, \nsaid self-automated parking lot comprising a parking lot controller for managing and coordinating the vehicles in parking and unparking maneuvers in said parking lot, and \neach vehicle comprising a vehicle electronic module for receiving, executing and reporting vehicle movements, wherein said vehicle movements are received from, and reported to, said parking lot controller by a communications system, said method comprising: \nmoving autonomously in platoon one or more rows of already parked vehicles in order to make available a parking space for a vehicle arriving to the parking space; and \nmoving autonomously in platoon one or more rows of parked vehicles in order to make a parked vehicle able to exit the parking space. \n | 12. 
The method according to claim 11, further comprising: \nmoving autonomously in platoon two rows of vehicles such that vehicles move in carousel between the two rows, transferring vehicles of a first end of the first row of vehicles to a first end of the second row of vehicles, and transferring vehicles of the second end of the second row of vehicles to the second end of the first row of vehicles. \n | 13. The method according to claim 11, further comprising: \nmoving autonomously in platoon one row of vehicles such that an empty parking space is obtained at one end of said row for receiving a vehicle entering the parking lot. \n | 14. The method according to claim 11, further comprising: \nmoving autonomously in platoon two rows of vehicles such that vehicles move in carousel between the two rows, transferring vehicles of a first end of the first row of vehicles to a first end of the second row of vehicles, and transferring vehicles of the second end of the second row of vehicles to the second end of the first row of vehicles, \nsuch that a vehicle exiting the parking lot is moved to one of the ends of one of the vehicle rows. \n | 15. The method according to claim 11, further comprising: \non approaching the parking lot, the vehicle module communicating with the parking lot controller to signal the vehicle arrival and receiving a designated parking area; \nsubsequently, the parking lot controller generating, from a data map of the parking lot vehicles, a number of movements from one or more rows of vehicles to one or more rows of vehicles of the parking lot, then calculating the least costly movement and executing said movement by communicating said movement to the vehicle modules. \n | 16. The method according to claim 11, further comprising: \nthe parking lot controller receiving vehicle position and sensor status data from the vehicle modules, creating a data map of the parking lot vehicles, periodically broadcasting vehicle modules with updates of said data. 
\n | 17. The method according to claim 11, wherein the vehicle rows are linear, circular, elliptical, spiral, or combinations thereof.\n | 18. The method according to claim 11, wherein the vehicle rows are grouped in cascading or interlinking parking zones such that only a part of the vehicle rows of one zone are able to exchange vehicles with the vehicle rows of another zone.\n | 19. The method according to claim 11, wherein the parking lot controller is carried out by one of the vehicle electronic modules, in particular by electing a vehicle module by the vehicle modules by a set of predefined criteria, further in particular by resolving a conflict of tied vehicle modules by a set of predefined criteria.\n | 20. A non-transitory storage media including program instructions for implementing a method for operating a self-automated parking lot for autonomous vehicles based on vehicular ad hoc networking, the program instructions including instructions executable to carry out the method of claim 11.", "output": "The device has a parking lot controller to manage/coordinate a group of vehicles (xx0, xx9) in parking/un-parking maneuvers in a parking lot. A vehicle electronic module is provided in the vehicle to receive, execute and report vehicle movements. A vehicular networking communication system (xx1) is provided for communicating with a communication system of the module. The controller is configured for moving autonomously in platoon/rows of already parked vehicles in order to make available a parking space for a vehicle arriving at the space and to make a parked vehicle able to exit the space. INDEPENDENT CLAIMS are included for the following: a method for operating a self-automated parking lot for autonomous vehicles based on vehicular networking; and a non-transitory medium storing a program for operating a self-automated parking lot for autonomous vehicles based on vehicular networking. 
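The platoon "carousel" movement of claims 12-14 (the head of one row feeds the head of the other, and the tail of the second row feeds the tail of the first) can be sketched with two deques. The function names and the stopping rule are illustrative assumptions, not from the patent.

```python
from collections import deque

def carousel_step(row_a, row_b):
    """One platoon carousel move over two parking rows (claim 12):
    the head of row A shifts to the head of row B, and the tail of
    row B shifts to the tail of row A, so row lengths are preserved."""
    row_b.appendleft(row_a.popleft())
    row_a.append(row_b.pop())

def rotate_until_exitable(row_a, row_b, target):
    """Rotate the carousel until `target` sits at an end of a row,
    where it can drive out (claim 14); returns the number of moves."""
    for steps in range(len(row_a) + len(row_b) + 1):
        if target in (row_a[0], row_a[-1], row_b[0], row_b[-1]):
            return steps
        carousel_step(row_a, row_b)
    raise ValueError("vehicle not found in carousel")
```

Because the two rows form a single cycle, any parked vehicle reaches an end of a row within one full rotation, which matches the claim's goal of freeing a parked vehicle without moving cars out of the two-row block.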
Device for self-automated parking lot for autonomous vehicles by vehicular networking. Parking a car in the parking space requires only minimal travel by the cars already in the parking lot, even where no optimization based on the estimated exit time is used. The total travelled distance is significantly reduced even with such a non-optimized strategy. The self-automated parking lots for autonomous vehicles based on vehicular networking are thus beneficial in reducing parking movements and space. The drawing shows a schematic view of a collaborative parking system. x10 Computing system; xx0, xx9 Vehicle; xx1 Vehicular networking communication system; xx2 Positioning system; xx7 Vehicle actuator", "instruction": "Please summarize the input" }, { "input": "DRIVING SYSTEM IN AREA OUTSIDE ENHANCED AUTONOMOUS VALET PARKING LOT, AND APPLICATION METHOD THEREOF: A driving system in an area outside an enhanced autonomous valet parking (E-AVP) lot, and an application method thereof, relating to the technical fields of autonomous driving, vehicle-infrastructure cooperation, and intelligent transportation systems. The system comprises a client, an E-AVP cloud, a road side unit, and a 5G-V2X vehicle end; the client communicates with the E-AVP cloud by means of 5G; the E-AVP cloud communicates with the client, the road side unit, and the 5G-V2X vehicle end by means of 5G; the road side unit communicates with the 5G-V2X vehicle end by means of V2X communication; the 5G-V2X vehicle end achieves inter-vehicle interaction by means of a CAN bus. The present application achieves interaction among an E-AVP management system, a user terminal, and a vehicle-mounted terminal, and solves the existing problems that parking and vehicle searching are difficult for users, thereby achieving the objectives of unmanned supervision of remote valet parking and improvement of the parking experience of the users.|1. 
A driving system for the area outside an enhanced autonomous valet parking (E-AVP) lot, wherein it comprises a client, an E-AVP cloud, a roadside unit and a 5G-V2X vehicle end, wherein: the client is used for performing identity authentication, sending request instructions, communicating with the E-AVP cloud through 5G and feeding back the vehicle trajectory; the E-AVP cloud is used for performing identity authentication, performing global trajectory planning through 5G communication with the client, the roadside unit and the 5G-V2X vehicle end, issuing trajectory slices to the roadside units according to the range covered by each roadside unit, and interacting with the AVP system; the roadside unit is used for performing identity authentication, communicating with the 5G-V2X vehicle end through V2X communication, and guiding the vehicle according to the trajectory slice sent by the E-AVP cloud; the 5G-V2X vehicle end is used for identity authentication, inter-vehicle interaction through the CAN bus, obstacle identification and obstacle-avoidance trajectory generation.\n | 2. 
The driving system for the area outside an enhanced autonomous valet parking lot according to claim 1, wherein, when the client sends a request instruction to the E-AVP cloud, on-chain identity authentication is required; if the identity authentication fails or the request instruction sent by the client is invalid, the E-AVP cloud does not respond and returns a response-refused message; if the identity authentication succeeds, the E-AVP cloud sends a request response to the client and sends the request instruction to the 5G-V2X vehicle end and the roadside unit; the request instructions sent by the client to the E-AVP cloud comprise a parking lot request, a parking space request, a parking request or a vehicle retrieval request; the request response received by the client from the E-AVP cloud comprises whether the E-AVP cloud is valid, whether parking is finished, whether the vehicle has finished exiting the lot, and the current position information of the vehicle.\n | 3. The driving system for the area outside an enhanced autonomous valet parking lot according to claim 1, wherein the 5G-V2X vehicle end collects the position information and driving information of its own vehicle and the trajectory information of other vehicles, sends the position and driving information of its own vehicle to the E-AVP cloud, and the E-AVP cloud generates the global trajectory according to the received information.\n | 4. The driving system for the area outside an enhanced autonomous valet parking lot according to claim 1 or 3, wherein the E-AVP cloud generates the global trajectory according to the request instruction, slices the global trajectory according to the coverage of the roadside units, and distributes the cut trajectory slices to each corresponding roadside unit.\n | 5. 
The driving system for the area outside an enhanced autonomous valet parking lot according to claim 1, wherein, after the global trajectory planning, the E-AVP cloud slices the global trajectory according to the range corresponding to each roadside unit and sends each slice to the corresponding roadside unit; the slice sent by the E-AVP cloud to a roadside unit comprises the planned start and end positions of the vehicle within the range of that roadside unit, and the road-section IDs passed in order from the start position to the end position; each road-section ID comprises the lane ID used when the vehicle travels on that section, each lane ID comprises the road centre line of the lane, and the vehicle travels according to the position of the road centre line.\n | 6. The driving system for the area outside an enhanced autonomous valet parking lot according to claim 1, wherein the E-AVP cloud sends the generated global trajectory to the 5G-V2X vehicle end, and the 5G-V2X vehicle end shares its own global trajectory with other vehicles through V2X; the 5G-V2X vehicle end communicates with the in-vehicle network through the CAN bus, including sending the global trajectory to the in-vehicle network, realizing real-time control of the vehicle through the in-vehicle network.\n | 7. 
An application method of the driving system for the area outside an enhanced autonomous valet parking lot, wherein it comprises the driving system for the area outside an enhanced autonomous valet parking lot according to claim 1, the autonomous valet parking (AVP) system and the client; the client comprises identity authentication, request instructions, client control, data interaction between the client and the E-AVP cloud, and vehicle trajectory feedback; the driving method comprises the following steps: before the client sends a parking request, the E-AVP cloud performs identity authentication on the client through blockchain technology; after the client passes the identity authentication, the client sends a request instruction to the E-AVP cloud; when the E-AVP cloud receives the request instruction, it obtains the vehicle information from the client and obtains the parking space information and parking lot information from the vehicle networking cloud; the E-AVP cloud performs global path planning according to the obtained information, slices the planned path into local path plans, and sends the local path plans to the corresponding roadside units through 5G; the E-AVP cloud performs identity authentication on the roadside units using blockchain technology, and after the authentication passes, sends the slice information to the roadside units; each roadside unit guides the vehicle forward according to the received local path plan; the roadside unit performs identity authentication on the 5G-V2X controller using blockchain technology, and after the identity authentication passes, sends the path plan to the 5G-V2X controller, and the 5G-V2X controller guides the vehicle to the entrance of the parking lot according to the 
received information.\n | 8. The application method of the driving system for the area outside an enhanced autonomous valet parking lot according to claim 7, wherein, when the E-AVP cloud performs identity authentication on the client through blockchain technology, if the client does not pass the identity authentication, or if the request sent by the client after authentication is invalid, the E-AVP cloud does not accept the client's request instruction.", "output": "The system has a roadside unit for performing identity authentication, where a trajectory slice is sent to the roadside unit according to its corresponding range. A fifth-generation (5G)-V2X vehicle end guides the vehicle according to the trajectory slice sent by an E-AVP cloud. A client is used for performing identity authentication and sending request instructions. The vehicle end performs identity authentication and realizes inter-vehicle interaction through a Controller Area Network (CAN) bus, obstacle recognition and obstacle-avoidance trajectory generation. An INDEPENDENT CLAIM is included for an application method of the driving system for the area outside an enhanced autonomous valet parking lot. Driving system for the area outside an enhanced autonomous valet parking lot. The system realizes interaction between the E-AVP management system, the user terminal and the vehicle terminal, solving the problem that parking and vehicle retrieval are difficult for users, achieving unmanned supervision of remote valet parking and improving the user parking experience. 
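The track slicing of claims 4-5 (cut the global route at roadside-unit coverage boundaries, keeping the entry point, exit point and ordered road-section IDs of each slice) can be sketched as follows; the data shapes and field names are assumed for illustration.

```python
def slice_global_track(track_sections, rsu_coverage):
    """Cut a global route (ordered road-section ids) into per-RSU
    slices, as the E-AVP cloud does in claims 4-5.  Each slice keeps
    the entry section, exit section and the ordered ids in between.
    `rsu_coverage` maps an RSU id to the set of section ids it covers."""
    slices = []
    current_rsu, run = None, []
    for sec in track_sections:
        # find which roadside unit covers this section (None if unmanaged)
        rsu = next((r for r, cov in rsu_coverage.items() if sec in cov), None)
        if rsu != current_rsu and run:
            # coverage boundary reached: close the current slice
            slices.append({"rsu": current_rsu, "start": run[0],
                           "end": run[-1], "sections": run})
            run = []
        current_rsu = rsu
        run.append(sec)
    if run:  # close the final slice
        slices.append({"rsu": current_rsu, "start": run[0],
                       "end": run[-1], "sections": run})
    return slices
```

Each returned slice is the payload the cloud would push to one roadside unit: its entry/exit positions within that unit's range plus the ordered section IDs in between.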
The drawing shows a schematic block diagram of the driving system in the area outside an enhanced autonomous valet parking lot (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Methods and software for managing vehicle priority in a self-organizing traffic control system: Methods and software for managing vehicle priority proximate to a potential travel-priority conflict zone, such as a roadway intersection, where travel conflicts, such as crossing traffic, can arise. Coordination involves forming an ad-hoc network in a region containing the conflict zone using, for example, vehicle-to-vehicle communications and developing a dynamic traffic control plan based on information about vehicles approaching the conflict zone. Instructions based on the dynamic traffic control plan are communicated to devices aboard vehicles in the ad-hoc network, which display one or more virtual traffic signals to the operators of the vehicles and/or control the vehicles (for example, in autonomous vehicles) in accordance with the dynamic traffic control plan, which may account for a priority level associated with one or more of the vehicles. What is claimed is: \n | 1. 
A method of managing vehicle priority proximate to a potential travel-priority conflict zone, the method being executed in a dynamic traffic control system and comprising: \ncommunicating with a first component of the dynamic traffic control system located on-board a vehicle proximate to the potential travel-priority conflict zone so as to establish a dynamic traffic control plan for avoiding a travel-priority conflict in the potential travel-priority conflict zone; \ncoordinating with the first component of the dynamic traffic control system via said communicating to elect a dynamic traffic controller as a temporary coordinator vehicle responsible for temporarily coordinating the dynamic traffic control plan; \nreceiving a priority-request message from a priority vehicle; \ndetermining a travel direction of the priority vehicle; \ncomparing the travel direction of the priority vehicle to a travel direction of a non-priority vehicle proximate to the potential travel-priority conflict zone; \ntransmitting a priority-granted message to the priority vehicle when the travel direction of the priority vehicle and the travel direction of the non-priority vehicle proximate to the potential travel-priority conflict zone differ; and \nproviding traffic control instructions to an operator of the priority vehicle via a visual or audio indication produced in the priority vehicle as a function of the priority-granted message. \n | 2. 
A method of managing vehicle priority proximate to a potential travel-priority conflict zone, the method being executed in a dynamic traffic control system and comprising: \ncommunicating with a first component of the dynamic traffic control system located on-board a vehicle proximate to the potential travel-priority conflict zone so as to establish a dynamic traffic control plan for avoiding a travel-priority conflict in the potential travel-priority conflict zone; \ncoordinating with the first component of the dynamic traffic control system via said communicating to elect a first dynamic traffic controller as a first temporary coordinator vehicle responsible for temporarily coordinating the dynamic traffic control plan; \nreceiving a priority-request message from a priority vehicle; \ndetermining a travel direction of the priority vehicle; \ncomparing the travel direction of the priority vehicle to a travel direction of a non-priority vehicle proximate to the potential travel-priority conflict zone; and \nwhen the travel direction of the priority vehicle and the travel direction of the non-priority vehicle proximate to the potential travel-priority conflict zone are the same, coordinating with the first component of the dynamic traffic control system via said communicating to hand over responsibility for coordinating the dynamic traffic control plan to a new temporary coordinator vehicle by electing a second dynamic traffic controller as a second temporary coordinator vehicle responsible for temporarily coordinating the dynamic traffic control plan. \n | 3. A method according to claim 1 or 2, wherein said receiving a priority-request message includes receiving a priority-request message from an emergency vehicle.\n | 4. A method according to claim 1 or 2, further comprising receiving a priority-clear message from the priority vehicle.\n | 5. 
A method according to claim 1 or 2, wherein at least a portion of said communicating is performed via vehicle-to-vehicle communication.\n | 6. A method according to claim 1 or 2, further comprising revoking priority for the priority vehicle if no transmissions are received from the priority vehicle for a predetermined period of time.", "output": "The managing method involves coordinating with the first component of the dynamic traffic control system through communication to elect a dynamic traffic controller as a temporary coordinator vehicle responsible for temporarily coordinating the dynamic traffic control plan. A priority-request message is received from the priority vehicle before the priority-granted message is transmitted to the priority vehicle. An INDEPENDENT CLAIM is also included for a machine-readable storage medium. Managing method for vehicle priority proximate to a potential travel-priority conflict zone. Uses include, but are not limited to, buses, trains, trolleys and streetcars. The dynamic traffic control system may weight the travel directions and lanes containing mass transit vehicles in a manner that allows each of those travel directions and lanes to clear more quickly than they would if a non-priority vehicle were present in place of each mass transit vehicle. The drawing shows a flow diagram of the managing method for vehicle priority in a self-organizing traffic control system from the perspective of a priority vehicle.", "instruction": "Please summarize the input" }, { "input": "Methods and software for managing vehicle priority in a self-organizing traffic control system: Methods and software for managing vehicle priority proximate to a potential travel-priority conflict zone, such as a roadway intersection, where travel conflicts, such as crossing traffic, can arise. 
Coordination involves forming an ad-hoc network in a region containing the conflict zone using, for example, vehicle-to-vehicle communications and developing a dynamic traffic control plan based on information about vehicles approaching the conflict zone. Instructions based on the dynamic traffic control plan are communicated to devices aboard vehicles in the ad-hoc network, which display one or more virtual traffic signals to the operators of the vehicles and/or control the vehicles (for example, in autonomous vehicles) in accordance with the dynamic traffic control plan, which may account for a priority level associated with one or more of the vehicles. What is claimed is: \n | 1. A non-transitory machine-readable storage medium containing machine-executable instructions for performing a method of managing vehicle priority proximate to a potential travel-priority conflict zone, the method being executed in a dynamic traffic control system, said machine-executable instructions comprising: \na first set of machine-executable instructions for communicating with a first component of the dynamic traffic control system located on-board a vehicle proximate to the potential travel-priority conflict zone so as to establish a dynamic traffic control plan for avoiding a travel-priority conflict in the potential travel-priority conflict zone; \na second set of machine-executable instructions for coordinating with the first component of the dynamic traffic control system via said communicating to elect a dynamic traffic controller as a temporary coordinator vehicle responsible for temporarily coordinating the dynamic traffic control plan; \na third set of machine-executable instructions for receiving a priority-request message from a priority vehicle; \na fourth set of machine-executable instructions for determining a travel direction of the priority vehicle; \na fifth set of machine-executable instructions for comparing the travel direction of the priority vehicle to a travel 
direction of a non-priority vehicle proximate to the potential travel-priority conflict zone; \na sixth set of machine-executable instructions for transmitting a priority-granted message to the priority vehicle when the travel direction of the priority vehicle and the travel direction of the non-priority vehicle proximate to the potential travel-priority conflict zone differ; and \na seventh set of machine-executable instructions for providing traffic control instructions to an operator of the priority vehicle via a visual or audio indication produced in the priority vehicle as a function of the priority-granted message. \n | 2. A non-transitory machine-readable storage medium according to claim 1, wherein said third set of machine-executable instructions includes machine-executable instructions for receiving a priority-request message from an emergency vehicle.\n | 3. A non-transitory machine-readable storage medium according to claim 1, further comprising machine-executable instructions for receiving a priority-clear message from the priority vehicle.\n | 4. A non-transitory machine-readable storage medium according to claim 1, further comprising machine-executable instructions for implementing vehicle-to-vehicle communication.\n | 5. A non-transitory machine-readable storage medium according to claim 1, further comprising machine-executable instructions for revoking priority for the priority vehicle if no transmissions are received from the priority vehicle for a predetermined period of time.\n | 6. 
A non-transitory machine-readable storage medium containing machine-executable instructions for performing a method of managing vehicle priority proximate to a potential travel-priority conflict zone, the method being executed in a dynamic traffic control system, said machine-executable instructions comprising: \na first set of machine-executable instructions for communicating with a first component of the dynamic traffic control system located on-board a vehicle proximate to the potential travel-priority conflict zone so as to establish a dynamic traffic control plan for avoiding a travel-priority conflict in the potential travel-priority conflict zone; \na second set of machine-executable instructions for coordinating with the first component of the dynamic traffic control system via said communicating to elect a dynamic traffic controller as a temporary coordinator vehicle responsible for temporarily coordinating the dynamic traffic control plan; \na third set of machine-executable instructions for receiving a priority-request message from a priority vehicle; \na fourth set of machine-executable instructions for determining a travel direction of the priority vehicle; \na fifth set of machine-executable instructions for comparing the travel direction of the priority vehicle to a travel direction of a non-priority vehicle proximate to the potential travel-priority conflict zone; and \na sixth set of machine-executable instructions for, when the travel direction of the priority vehicle and the travel direction of the non-priority vehicle proximate to the potential travel- priority conflict zone are the same, coordinating with the first component of the dynamic traffic control system via said communicating to hand over responsibility for coordinating the dynamic traffic control plan to a new temporary coordinator vehicle by electing a second dynamic traffic controller as a second temporary coordinator vehicle responsible for temporarily coordinating the dynamic 
traffic control plan. \n | 7. A non-transitory machine-readable storage medium according to claim 6, wherein said third set of machine-executable instructions includes machine-executable instructions for receiving a priority-request message from an emergency vehicle.\n | 8. A non-transitory machine-readable storage medium according to claim 6, further comprising machine-executable instructions for receiving a priority-clear message from the priority vehicle.\n | 9. A non-transitory machine-readable storage medium according to claim 6, further comprising machine-executable instructions for implementing vehicle-to-vehicle communication.\n | 10. A non-transitory machine-readable storage medium according to claim 6, further comprising machine-executable instructions for revoking priority for the priority vehicle if no transmissions are received from the priority vehicle for a predetermined period of time.", "output": "The storage medium comprises a set of machine-executable instructions for communicating with a component of a dynamic traffic control system located on-board a vehicle proximate to the potential travel-priority conflict zone. A dynamic traffic control plan is established for avoiding a travel-priority conflict in the potential travel-priority conflict zone. Another set of machine-executable instructions coordinates with the component of the dynamic traffic control system to elect a dynamic traffic controller. Storage medium storing instructions for a method for managing vehicle priority proximate to a potential travel-priority conflict zone in a dynamic traffic control system. The dynamic traffic control plan is established for avoiding a travel-priority conflict in the potential travel-priority conflict zone, making it easy to manage vehicle priority proximate to such a zone. The drawing shows a block diagram of a computing system. 
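The decision step shared by the two independent claims (grant priority when the priority vehicle's travel direction differs from the non-priority traffic, otherwise elect a new temporary coordinator vehicle) can be sketched as below. The election rule, direction encoding and return values are illustrative assumptions, since the patent leaves the election to "a set of predefined criteria".

```python
def elect_coordinator(candidates):
    """Toy election rule: the lowest vehicle id wins.  The patent
    only requires election by a set of predefined criteria."""
    return min(candidates)

def handle_priority_request(current_coord, candidates,
                            priority_dir, non_priority_dir):
    """Grant priority when travel directions differ; otherwise hand
    the coordinator role to a newly elected vehicle (claims 1 and 6).
    Directions are plain strings here, e.g. 'NS' / 'EW'."""
    if priority_dir != non_priority_dir:
        # directions differ: transmit priority-granted, keep coordinator
        return "priority-granted", current_coord
    # same direction: hand over coordination to a new temporary coordinator
    new_coord = elect_coordinator(v for v in candidates if v != current_coord)
    return "handover", new_coord
```

A real implementation would also carry the priority-clear and timeout-revocation paths of the dependent claims; this sketch isolates only the grant/hand-over branch.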
824 Storage device, 832 Input device, 836 Display, 848 Remote device, 852 Display adapter", "instruction": "Please summarize the input" }, { "input": "Self-driving rule learning method based on deep reinforcement learningThe invention claims an autonomous driving rule learning method based on deep reinforcement learning. In the vehicle-networking environment, there are two types of vehicles in the road network: autonomous driving vehicles and networked vehicles. The autonomous driving vehicle obtains the driving state of the networked vehicles in the road network in real time through vehicle-to-vehicle (V2V) communication, learns the autonomous driving rule, and adjusts the vehicle-queue driving distance, so as to maximize the average speed of the road network and improve its passing efficiency. The invention lays the foundation for further improving the autonomous decision-making ability of the vehicle by deep reinforcement learning.|1. 
An autonomous driving rule learning method based on deep reinforcement learning, wherein the specific implementation steps of the method are as follows: step 1: obtaining self-driving vehicle information; during driving, the information to be acquired by the autonomous driving vehicle comprises: the position x and the speed v of the networked vehicles in the road network; the driving state of the current autonomous driving vehicle comprises its speed, acceleration and position; the self-driving vehicle adopts a driving action according to the driving state of the networked vehicles; the driving state of the networked vehicles is used as the input of the driving strategy model; step 2: the self-driving vehicle driving rule; the defined driving behaviour of the autonomous driving vehicle is the acceleration a of the vehicle; the speeds of the autonomous driving vehicle at times t and t+1 are velocity_t and velocity_{t+1}; the equation by which the autonomous driving vehicle updates its motion state is: step 3: a reward and punishment mechanism for the driving rule of the autonomous driving vehicle; setting the acceleration threshold of the autonomous driving vehicle as accel_threshold, calculating the average value a_avg of the stored driving behaviours a of the autonomous driving vehicle, and comparing a_avg with accel_threshold: if a_avg > accel_threshold, then r_accel = r + δ * (accel_threshold − a_avg), a_avg > accel_threshold, wherein r represents the reward value obtained before the occurrence of a vehicle collision behaviour, and δ is a hyperparameter; there is a negative reward value r_collide = −500 when a vehicle collision occurs; u_i(t) and h_i(t) are respectively the speed and time headway of vehicle i at time step t; the form of the reward equation is as follows: wherein v_des is the expected speed, h_max is the time-headway threshold value, and α is a gain; step 4: a self-driving vehicle driving strategy model; 
the autonomous driving vehicle driving strategy model selects the multi-layer perceptron (MLP); the driving strategy model of the autonomous driving vehicle is composed of a four-layer network, comprising an input layer, hidden layers and an output layer; the number of hidden layers is 3 and the number of output layers is 1; step 5: learning the driving rule of the self-driving vehicle; in learning the driving rule, the autonomous driving vehicle can obtain the position and speed of the networked vehicles in the road network at each time step, and the probability value of the driving behaviour is output by the driving strategy model of the autonomous driving vehicle; storing, for each round, the position and speed of the networked vehicles in the road network, the driving action and reward value adopted by the autonomous driving vehicle, and the speed and position of the networked vehicles at the next time step; after collecting the networked-vehicle driving state data, sampling a MiniBatch from the data for training; wherein the adjustment of the self-driving vehicle driving strategy model is realized by the PG (policy gradient) algorithm; in the PG algorithm, J(θ) is used to represent the objective function, representing the expected return of the whole round; the expected return of the trajectory is expanded to obtain J(θ) = Σ_τ π_θ(τ) r(τ), where π_θ(τ) represents the probability of selecting trajectory τ, and r(τ) represents the reward value obtained in the round; the objective of the PG algorithm is to maximize the expected return value, and the maximization is realized by gradient calculation to obtain the final form of the solved gradient. 
taking the probability distribution p_a_old of the driving action a_old of the autonomous driving vehicle as the expected output probability distribution; combining the driving states s of the networked vehicles into a matrix and inputting it to the neural network, and outputting the probability distribution p_a_new of the driving action after Softmax as the actual output probability distribution; calculating the proximity of the two probability distributions and judging whether the calculated gradient is trustworthy according to the size of the reward value; the cross-entropy loss function uses the reward value; a discount processing is performed prior to back-propagation, indicating that the current reward value is more important than the future reward value, R_discount = r_1 + γ·r_2 + γ²·r_3 + ..., wherein γ represents a discount factor, and the final form of the cross-entropy loss function is as follows: next, the parameter update is performed, wherein learning_rate represents the learning rate and θ represents the driving strategy model of the autonomous driving vehicle before updating; the invention claims the updated driving strategy model of the autonomous driving vehicle.\n | 2. 
The autonomous driving rule learning method based on deep reinforcement learning according to claim 1, wherein the network structure of step 4 is as follows: an input layer: the input layer has two neurons; firstly, according to the input element x_i of the input layer, a weight and a bias, the input element f of the hidden layer is solved; in the formula, p is the number of elements of the input layer, q is the number of hidden-layer elements, and i represents an input-layer neuron; the neural network input is the position and speed [v_N, x_N] of the networked vehicles in the road network sensed by the autonomous driving vehicle, where N represents the number of networked vehicles in the road network; a hidden layer: the input element of the hidden layer is put into the activation function to calculate the output element z_j of the hidden layer, and the activation function selects the tanh function; the function expression of the output element z_j of the hidden layer is given; an output layer: from the output element z_j of the hidden layer, a weight and a bias in its activation function, the input element f of the output layer is solved; in the formula, j is the number of elements of the output layer and n is the number of hidden layers; the output layer is the driving action adopted by the self-driving vehicle; from the input element of the output layer, the output element y_k of the output layer is solved in its activation function, and the activation function uses the Softmax function.", "output": "The method involves providing the autonomous driving vehicle with vehicle-to-vehicle communication in a vehicle queue during the driving process. The position and speed of the connected vehicles in the road network are obtained. The autonomous driving vehicle needs to adopt driving behaviors according to the driving status of the connected vehicles. The defined driving behavior of the autonomous driving vehicle is the acceleration of the vehicle. The speed of the autonomous driving vehicle is used to update the movement state. 
The basic goal of autonomous vehicle driving is to dissipate stop-and-go waves in the road network. The acceleration threshold of autonomous driving vehicles is set to accel_threshold. The driving strategy model of the autonomous driving vehicle selects a multi-layer perceptron (MLP). The driving rules of the autonomous driving vehicle are learned. The probability value of driving behavior is outputted through the driving strategy model of the autonomous driving vehicle. Autonomous driving rule learning based on deep reinforcement learning. The utilization of deep reinforcement learning improves the autonomous decision-making ability of vehicles. The influence of non-standard and erroneous driver operations on the running safety of the automobile is reduced, which improves the driving safety of the vehicle. The drawing shows a flowchart illustrating the implementation method for autonomous driving rule learning based on deep reinforcement learning. (Drawing includes non-English language text)", "instruction": "Please summarize the input" }, { "input": "Multi-agent cruise control method, device, electronic device and storage mediumThe invention claims a multi-agent cruise control method, device, electronic device and storage medium, wherein the intelligent cruise control of each automatic driving vehicle is realized by inputting the current-time vehicle signals of the sub-vehicle queue collected by each automatic driving vehicle to the intelligent optimization control model; wherein the intelligent optimization control model is obtained by performing centralized neural-network parameter training on the partially observable Markov game model based on real-time collection state samples of the vehicle queue built by multiple automatic driving vehicles. 
The invention continuously interacts with the environment, and continuously and intelligently learns and adjusts the optimized control strategy of networked cruise control, so as to adapt to the actual complex and variable network dynamic scene. The invention solves the problems of the current network-based cruise control method: the unpredictability of the complex traffic environment and the unreliability of the network.|1. A multi-agent cruise control method, wherein the method comprises: determining the current-time vehicle signals of the sub-vehicle queue collected by each automatic driving vehicle; respectively inputting the present state signals of each automatic driving vehicle to the corresponding intelligent optimization control model to realize the intelligent cruise control of multiple automatic driving vehicles; wherein the intelligent optimization control model is obtained by performing centralized neural-network parameter training on the partially observable Markov game model based on real-time collection state samples of the vehicle queue built by the automatic driving vehicles; the construction process of the partially observable Markov game model comprises the following steps: dividing the vehicle queue composed of all vehicles into several sub-queues according to the positions of the automatic driving vehicles: taking a certain automatic driving vehicle as reference, the nearest automatic driving vehicle in front of it is the head vehicle, the nearest automatic driving vehicle behind it is the tail vehicle, and all the vehicles between the head and the tail form the sub-queue system of that automatic driving vehicle; obtaining the sub-queue state information built by each automatic driving vehicle, and establishing the dynamic equation of the sub-queue system according to the sub-queue state information; according to the dynamic equation of the sub-queue system, taking the minimum state error and input as the objective function to 
construct the quadratic optimization control equation; constructing a partially observable Markov game model of network control according to the dynamic equation of the sub-queue system and the quadratic optimization control equation; the step of obtaining the sub-queue state information established by each automatic driving vehicle, and establishing a dynamic equation of the sub-queue system according to the sub-queue state information, comprises the following steps: obtaining the vehicle distance, vehicle speed and acceleration information of each vehicle in the sub-queue through vehicle-to-vehicle communication; according to the vehicle distance, vehicle speed and acceleration information of each vehicle in the vehicle sub-queue, establishing the dynamic equation of each vehicle in the sub-queue; setting the real-time vehicle speed of the head vehicle as the expected vehicle speed, obtaining the expected vehicle distance of each vehicle corresponding to the expected vehicle speed based on the preset range strategy, and establishing the state error equation of each vehicle according to the expected vehicle speed of the head vehicle, the expected vehicle distance of each vehicle and the current vehicle speed and vehicle distance of each vehicle; combining the state error equations of each vehicle in the sub-queue, and based on the continuous-time state equation of each vehicle in the sub-queue, after discretization processing, obtaining the dynamic equation of the sub-queue system; the preset range policy is defined as follows: wherein v_d(l) represents a desired speed based on vehicle distance l, l_min represents a preset minimum vehicle distance, l_max represents a preset maximum vehicle distance, and v_max represents a preset maximum vehicle speed; the dynamic equation of the sub-queue system obtained by the discretization process is as follows: wherein k is the sampling-interval sequence number, and a state variable and an acceleration control 
strategy respectively represent the i-th sub-queue at the k-th moment, B_i = [0_{1×2n}, 0, 1, 0_{1×2m}]^T, p_i(t) = [f_i(−n), ..., f_i(−1), 0, 0, f_{i1}, ..., f_{im}]^T, f_{ij} = [0, δ_{ij}(t)], j ∈ [−n, −1] ∪ [1, m], wherein T is the sampling interval, τ_k is the network-induced time delay at the k-th moment, the current automatic driving vehicle is labelled 0, the vehicles are labelled 1, 2, ..., m, m+1 in turn from this vehicle towards the head vehicle and −1, −2, ..., −(n−1), −n in turn from this vehicle towards the tail vehicle, ij represents the j-th vehicle in the i-th sub-queue, ξ_{ij} and η_{ij} represent artificial-driver parameters, namely the partial derivatives at the desired vehicle distance, and δ_{ij}(t) is an additional acceleration interference term due to the time-varying characteristic of the desired vehicle distance; according to the dynamic equation of the sub-queue system, taking the minimum state error and input as the objective function, the quadratic optimization control equation is constructed as follows: wherein N is the number of sampling intervals, and D_{0i} and V_i are coefficient matrices; the intelligent optimization control model performs neural-network parameter training on the partially observable Markov game model based on the real-time collection state samples of the vehicle sub-queue established by each automatic driving vehicle, comprising: each agent constructs a deep deterministic policy gradient algorithm comprising a current actor network, a current critic network, a target actor network and a target critic network to update the parameters of the partially observable Markov game model; the observations of all the agents constitute the environment state S = (o_1, ..., o_C), wherein C is the number of agents, namely the total number of automatic driving vehicles; in each time slot k, each agent i, according to its input observation, outputs the corresponding action strategy through the current actor network; 
executing the strategy, the actions of all the agents constitute a global action A = (a_1, ..., a_C); according to the environment state and all the agents' actions, the environment state S' at the next time is obtained according to the state transfer matrix, and the reward function of each agent yields the corresponding reward r_ik, wherein the reward value is composed of a local reward r_ip and a global reward r_g; the reward values of all the agents are R = (r_1, ..., r_C); storing (S_k, A_k, R_k, S'_k) as a sample in an experience replay buffer to obtain state samples; wherein the parameter of the actor network is given, and random noise is added in the continuous action space for effective exploration; each current critic network obtains the global information and updates its parameters by centralized training minimizing the following mean-square-error loss function, wherein U is the number of samples of the mini-batch sampling and t is the mini-batch sampling sequence number; the current Q value is obtained by inputting the global state and action information to the current critic network of each agent; the target Q value is expressed as follows: in the formula, r_it is the reward-function value corresponding to agent i; the next Q value is generated by the target critic network of agent i; a'_j = μ'_j(o_j) is the next action strategy generated by the target actor network according to its input observation o_j; γ is the discount factor; the current actor network of agent i updates its parameters through the following policy gradient function, wherein ∇ is a gradient operator; each target actor network and target critic network respectively update their parameters in the following way: wherein ε is a fixed constant, and 0 < ε < 1.\n | 2. 
The multi-agent cruise control device realized based on the multi-agent cruise control method according to claim 1, wherein it comprises a signal collecting unit and an intelligent control unit; the signal collecting unit is used for obtaining the current vehicle speed and vehicle distance information of the vehicles in the sub-queue; the intelligent control unit is used for inputting the current vehicle signals of the sub-queue vehicles to the intelligent optimization control model to realize the intelligent cruise control of the automatic driving vehicle; the intelligent optimization control model is obtained by training the partially observable Markov game model based on real-time collection state samples of the mixed vehicle queue composed of the multiple automatic driving vehicles and the manually driven vehicles.\n | 3. An electronic device realized based on the multi-agent cruise control method according to claim 1, comprising a memory, a processor and a computer program stored on the memory and runnable on the processor, wherein the processor realizes the multi-agent cruise control method when executing the program.\n | 4. A non-transitory computer-readable storage medium realized based on the multi-agent cruise control method according to claim 1, wherein the non-transitory computer-readable storage medium stores a computer program; the computer program is executed by the processor to realize the multi-agent cruise control method.", "output": "The method involves obtaining vehicle distance, vehicle speed and acceleration information of a vehicle in a sub-queue through vehicle-to-vehicle communication. The dynamic equation of the vehicle is established according to the vehicle distance. A state error equation of each vehicle is established based on the expected vehicle speed of a head vehicle. The expected vehicle distance of the vehicle is obtained corresponding to the expected vehicle speed based on a preset range strategy. 
An intelligent optimization control model is obtained by performing centralized neural-network parameter training on a partially observable Markov game model based on real-time collection state samples of a vehicle queue. An INDEPENDENT CLAIM is included for a device for performing multi-agent cruise control for automatic driving vehicles. Method for performing multi-agent cruise control for automatic driving vehicles. The method enables continuously and intelligently learning and adjusting an optimized control strategy of networked cruise control, so as to adapt to the actual complex and variable network dynamic scene, thus reducing the unpredictability of the complex traffic environment and the unreliability of the network. The drawing shows a flow diagram illustrating a method for performing multi-agent cruise control for automatic driving vehicles. (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "A linked automatic driving automobile auxiliary sensing street lamp system based on the intelligent network of V2IThe invention claims an intelligent-network-based V2I-linked automatic driving automobile auxiliary sensing street lamp system, comprising a street lamp and a data centre set in the road; the street lamp is equipped with a camera image sensor, a millimetre-wave radar sensor and a laser radar sensor; each sensor is connected with the data centre so that the data collected by each sensor are transmitted to the data centre; after receiving the data, the data centre performs data extraction and fusion processing based on computer vision and data fusion technology so as to form real-time data of the road; the data centre combines with the 5G network service provider so as to distribute the corresponding real-time data of the road through the 5G network to each intelligent-network-connected vehicle-mounted terminal, to realize the V2I signal according to the requirement of the automatic driving automobile. 
Through the road, the system can detect all kinds of information covered by the traffic network, making it convenient for the vehicle terminal to obtain instant data information about its path, extending the sensing range without limit, and avoiding potential safety hazards and traffic jams.|1. An intelligent-network-based V2I-linked automatic driving automobile auxiliary sensing street lamp system, wherein it comprises a street lamp and a data centre set in the road; the street lamp is provided with a camera image sensor, a millimetre-wave radar sensor and a laser radar sensor; the camera image sensor collects the image data in the coverage area, the millimetre-wave radar sensor collects the object speed data under the street lamp, and the laser radar sensor collects the road point cloud data; each sensor is connected with the data centre so that the data collected by each sensor are transmitted to the data centre; after the data centre receives the data, it performs data extraction and fusion processing based on computer vision and data fusion technology so as to form real-time data of the road; the data centre combines with the 5G network service provider so as to distribute the corresponding real-time data of the road network through the 5G network to each vehicle-mounted terminal of the intelligent network according to the requirement of the automatic driving automobile, realizing the V2I communication.\n | 2. The intelligent-network-linked automatic driving automobile auxiliary sensing street lamp system according to claim 1, wherein the data centre comprises a lower data centre and a high-grade data centre. 
the lower data centre is connected with each sensor, receives the data collected by the corresponding sensors, and performs data extraction and fusion processing based on computer vision and data fusion technology to form the road data of its area, which is then transmitted to the high-grade data centre; the high-grade data centre fuses data over a larger range than the lower data centres so as to form real-time data of the road; the high-grade data centre combines with the 5G network service provider so as to realize the V2I communication signal of the intelligent network.\n | 3. The intelligent-network-linked automatic driving automobile auxiliary sensing street lamp system according to claim 2, wherein the road data of the area comprises the road video data, the point cloud data, and the category, size, outline, moving speed and direction of travel of each of the traffic participants.\n | 4. The intelligent-network-linked automatic driving automobile auxiliary sensing street lamp system according to any one of claims 1 to 3, wherein the street lamp has a unique device ID, and the data collected by each sensor are transmitted to the data centre after the frames are stamped with time, position and the device ID.\n | 5. The intelligent-network-linked automatic driving automobile auxiliary sensing street lamp system according to any one of claims 1 to 3, wherein the camera image sensor is a high-speed camera image sensor, and the laser radar sensor is a solid-state laser radar sensor.\n | 6. The intelligent-network-linked automatic driving automobile auxiliary sensing street lamp system according to claim 2 or 3, wherein several street lamps of the road network are connected many-to-one to a lower data centre, each lower data centre is connected many-to-one to the high-grade data centre, and the high-grade data centre fuses the data uploaded by the lower data centre of each area, forming the real-time data of the whole road network.\n | 7. 
The intelligent-network-linked automatic driving automobile auxiliary sensing street lamp system according to claim 6, wherein the street lamp and the corresponding lower data centre form a high-speed network, each street lamp and the lower data centre are networked through different levels of optical-fibre connections, and the high-grade data centre and the 5G network service provider jointly build the wireless network.\n | 8. The intelligent-network-linked automatic driving automobile auxiliary sensing street lamp system according to claim 4, wherein, after the high-grade data centre receives the planned route and the current position sent from the automatic driving vehicle, the real-time data of the street lamps near the automatic driving automobile are distributed to the vehicle terminal through the 5G network.", "output": "The system has a street lamp provided with a camera image sensor, a millimeter wave radar sensor and a laser radar sensor. The camera image sensor collects image data in a coverage area. The laser radar sensor collects road point cloud data. A sensor is connected with a data center. The data center receives data and performs data extraction and fusion processing based on computer vision technology and data fusion technology to form real-time data of a road. The data center cooperates with a Fifth Generation (5G) network service provider to distribute real-time data corresponding to a road network to an automobile vehicle terminal through the 5G network according to the requirement of an automatic driving automobile to realize intelligent network communication of V2I. V2I intelligent network based linked automatic driving automobile auxiliary sensing street lamp system. The system detects multiple kinds of information through the road covered by a traffic network for obtaining instant data information to the automobile terminal in a convenient manner, and avoids potential safety hazards and traffic jams. 
The drawing shows a schematic view of a V2I intelligent network based linked automatic driving automobile auxiliary sensing street lamp system. (Drawing includes non-English language text)", "instruction": "Please summarize the input" }, { "input": "The medium access method for the V2V communication.In the medium access method for V2V (Vehicle-to-Vehicle) communication, the base station obtains the driving-path information of each of multiple vehicles; the base station receives a V2V connect-request signal from the connection request vehicle among the multiple vehicles; based on the driving-path information of each of the multiple vehicles, the base station selects, as the connection target vehicle, the vehicle among the multiple vehicles whose future driving path coincides with that of the connection request vehicle for the longest distance; the base station assigns a resource block to the pair of the connection target vehicle and the connection request vehicle; and the V2V communication is performed by the connection request vehicle and the connection target vehicle using the allocated resource block.|1. 
A medium access method for the V2V communication, wherein it comprises: a step in which the base station obtains the driving-path information of each of multiple vehicles; a step in which the base station receives a V2V (Vehicle-to-Vehicle) connect-request signal from the connection request vehicle among the multiple vehicles; a step in which the base station selects, as the connection target vehicle, the vehicle among the multiple vehicles whose future driving path coincides with that of the connection request vehicle for the longest distance, based on the driving-path information of each of the multiple vehicles; a step in which the base station assigns a resource block to the pair of the connection target vehicle and the connection request vehicle; and a step in which the connection request vehicle and the connection target vehicle perform the V2V communication using the above-mentioned allocated resource block; and wherein the step in which the base station assigns the resource block to the pair of the connection target vehicle and the connection request vehicle comprises: a step of grasping the carrier frequency of the resource block allocated to the adjacent vehicle pair adjacent to the pair of the connection target vehicle and the connection request vehicle; a step of determining the size of the separation frequency based on the moving direction of the pair of the connection request vehicle and the connection target vehicle and the moving direction of the adjacent vehicle pair; and a step of assigning, to the pair of the connection target vehicle and the connection request vehicle, a resource block having a carrier frequency separated from the carrier frequency of the resource block allocated to the adjacent vehicle pair by at least the separation frequency.\n\n | 2. 
The medium access method for the V2V communication of claim 1, wherein the step in which the base station obtains the driving-path information of each of the multiple vehicles comprises a step in which the base station receives the driving-path information of the corresponding vehicle from a navigation device included in each of the multiple vehicles. \n\n | 3. The medium access method for the V2V communication of claim 1, wherein the step in which the base station obtains the driving-path information of each of the multiple vehicles comprises a step in which the base station receives the driving-path information of each of the multiple vehicles from a navigation server guiding the multiple vehicles along routes to their destination locations. \n\n | 4. The medium access method for the V2V communication of claim 1, wherein the step in which the base station obtains the driving-path information of each of the multiple vehicles comprises a step in which the base station receives the driving-path information of each of the multiple vehicles from an autonomous driving server which controls the multiple vehicles so that the multiple vehicles move in the autonomous driving mode. \n\n | 5. 
The medium access method for the V2V communication of claim 1, wherein the step in which the base station selects the connection target vehicle comprises: a step in which the base station determines, as candidate vehicles, the vehicles among the multiple vehicles within the V2V-communication-possible distance from the connection request vehicle; and a step in which the base station selects, as the connection target vehicle, the candidate vehicle whose future driving path coincides with that of the connection request vehicle for the longest distance, based on the driving-path information of each of the candidate vehicles. \n\n | 6. The medium access method for the V2V communication of claim 5, wherein the step in which the base station selects the connection target vehicle from among the candidate vehicles comprises: a step in which the base station decides, as standard bifurcations, the bifurcations existing in the driving path of the connection request vehicle based on the driving-path information of the connection request vehicle; and a step in which the base station selects, as the connection target vehicle, the candidate vehicle whose driving path forward from the current position includes the most standard bifurcations, based on the driving-path information of each of the candidate vehicles. \n\n | Deletion. \n | Deletion.\n | 9. 
The medium access method for V2V communication of claim 1, wherein the step of determining the size of the separation frequency based on the moving directions of the pair of the connection request vehicle and the connection object vehicle and of the adjacent vehicle pair comprises: the step of determining the separation frequency to be a first frequency when the moving direction of the pair of the connection request vehicle and the connection object vehicle and the moving direction of the adjacent vehicle pair are identical; and the step of determining the separation frequency to be a second frequency, greater than the first frequency, when the moving direction of the pair of the connection request vehicle and the connection object vehicle and the moving direction of the adjacent vehicle pair are opposite to each other.\n\n | 10. The medium access method for V2V communication of claim 1, further comprising the step of the base station selecting and connecting, as a candidate pair, vehicles among the multiple vehicles whose future driving paths coincide the most while they are within a V2V-communicable distance of each other, based on the driving path information of each of the multiple vehicles. \n\n | 11. The medium access method for V2V communication of claim 10, wherein the step of the base station selecting, as the connection object vehicle, the vehicle whose future driving path coincides the most with that of the connection request vehicle from among the multiple vehicles based on the driving path information of each of the multiple vehicles comprises the step of the base station determining, as the connection object vehicle, the vehicle among the multiple vehicles that is connected to the connection request vehicle as a candidate pair. \n", "output": "The method involves receiving a vehicle-to-vehicle (V2V) connection request signal from a connection request vehicle i.e. car by a base station. 
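The direction-dependent separation rule of claim 9 above can be sketched minimally as follows (the function name, the angle convention, and the concrete frequency values are illustrative assumptions, not taken from the patent):

```python
def separation_frequency(pair_dir_deg: float, adjacent_pair_dir_deg: float,
                         f_same_hz: float = 180e3,
                         f_opposite_hz: float = 360e3) -> float:
    """Return the carrier separation between two V2V vehicle pairs.

    Same moving direction -> the smaller first frequency; opposite
    directions -> the larger second frequency (claim 9: the second
    frequency is greater than the first).
    """
    # Fold the angular difference into [0, 180] degrees.
    diff = abs(pair_dir_deg - adjacent_pair_dir_deg) % 360
    diff = min(diff, 360 - diff)
    return f_opposite_hz if diff > 90 else f_same_hz
```

With the default assumed values, two pairs driving the same way would be separated by 180 kHz and oncoming pairs by 360 kHz.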
A connection object vehicle whose driving path coincides with that of the connection request vehicle is selected (S300) by the base station. A resource block is assigned to the connection object vehicle and the connection request vehicle by the base station. The carrier frequency of the resource block is identified. A carrier frequency separated from that of an adjacent vehicle pair is assigned to the pair of the connection object vehicle and the connection request vehicle. Method for accessing a V2V communication medium by utilizing a base station. The method enables assigning the resource block to the connection object vehicle and the connection request vehicle by the base station so as to increase network lifetime. The method enables reducing communication overhead and inter-carrier interference. The drawing shows a flowchart illustrating a method for accessing a V2V communication medium by utilizing a base station. '(Drawing includes non-English language text)' S300Step for selecting connection request vehicle by base station", "instruction": "Please summarize the input" }, { "input": "APPARATUS AND METHOD FOR DETECTING VIOLATION OF TRAFFIC SIGNAL USING SENSOR OF AUTONOMOUS VEHICLE AND V2I COMMUNICATIONDisclosed are an apparatus and method for regulating signal-violating objects using a sensor of an autonomous vehicle and V2I communication. A method for regulating signal-violating objects using a sensor of an autonomous vehicle and V2I communication according to an embodiment of the present invention comprises: (a) determining, based on surrounding information collected through a target vehicle which is an autonomous vehicle, whether the target vehicle satisfies a predefined reference state in which a signal-violating object can be regulated through the surrounding information; (b) if the target vehicle satisfies the reference condition, detecting the signal-violating object based on the surrounding 
information; and (c) transmitting a report image, captured so as to include the signal-violating object based on the detection result, through V2I communication to a predetermined infrastructure interlocked with the autonomous vehicle.|1. In the method for regulating signal-violating objects using autonomous vehicle sensors and V2I communication: (a) determining, based on surrounding information collected through a target vehicle which is an autonomous vehicle, whether the target vehicle satisfies a predefined reference state in which a signal-violating object can be regulated through the surrounding information;\n(b) when the target vehicle meets the reference state, detecting the signal-violating object based on the surrounding information; and (c) transmitting, based on the detection result, a report video captured to include the signal-violating object to a predetermined infrastructure linked to the autonomous vehicle through V2I communication, wherein step (a) determines whether the target vehicle meets the reference state based on the location of the target vehicle, the condition of the road on which the target vehicle is located, and the arrangement of surrounding objects with respect to the target vehicle, and wherein step (a) includes: (a1) determining whether signal information of the road where the target vehicle is located is a stop signal;\n(a2) determining, based on location information of the target vehicle on the road, whether the target vehicle is located within a preset reference distance from the stop line of the road, considering the size of a single vehicle; and (a3) determining the presence or absence of surrounding objects around the target vehicle, and wherein step (b) activates the detection operation of the signal-violating object when it is determined that the state of the target vehicle meets the reference state, and deactivates the detection operation of the signal-violating object when it is determined that the state of the target vehicle does not meet the reference state. \n | 2. delete\n | 3. The enforcement method according to claim 1, wherein step (a1) derives lighting information of a traffic light structure disposed in front of the target vehicle from an image acquired through a vision sensor mounted on the target vehicle.\n | 4. The method of claim 1, wherein step (a1) receives lighting information of a traffic light structure of the road where the target vehicle is located through the V2I communication.\n | 5. The method of claim 1, wherein step (a2) determines whether the target vehicle is located within the reference distance based on precision road map (Local Dynamic Map, LDM) information provided in the target vehicle and GPS information of the target vehicle.\n | 6. The method of claim 1, wherein the surrounding information includes object movement information including at least one of a relative speed, a relative distance, and a relative acceleration of an object traveling around the target vehicle, and wherein step (b) detects the signal violation of the object based on the object movement information.\n | 7. The enforcement method of claim 6, wherein the object movement information is obtained based on at least one of a lidar sensor and a vision sensor mounted on the target vehicle.\n | 8. The method of claim 1, wherein the report video is a partial image extracted from an original video obtained time-sequentially through a vision sensor mounted on the target vehicle, so as to correspond to a time period set in advance based on the time point at which the signal-violating object is detected in step (b). \n | 9. 
The enforcement method according to claim 1, wherein the report image is a partial image extracted from an original image obtained time-sequentially through a vision sensor mounted on the target vehicle, so that the rear license plate of the traffic-violating object is identified.\n | 10. The enforcement method of claim 1, wherein step (c) transmits vehicle number information of the traffic-violating object identified from the report image together with the report image.\n | 11. An apparatus for regulating signal-violating objects using a sensor of an autonomous vehicle and V2I communication, comprising: a state analyzer that determines, based on surrounding information collected through a target vehicle which is an autonomous vehicle, whether the target vehicle meets a predefined reference state in which a signal-violating object can be regulated through the surrounding information;\na detection unit that detects the signal-violating object based on the surrounding information when the target vehicle meets the reference state; and a communication unit that transmits a report image, captured to include the signal-violating object based on the detection result, to a predetermined infrastructure linked to the autonomous vehicle through V2I communication, wherein the state analyzer determines whether the target vehicle meets the reference state based on the location of the target vehicle, the condition of the road on which the target vehicle is located, and the arrangement of surrounding objects with respect to the target vehicle, wherein the state analyzer determines whether the signal information of the road where the target vehicle is located is a stop signal, determines, based on location information of the target vehicle on the road, whether the target vehicle is located within a preset reference distance from the stop line of the road considering the size of a single vehicle, and determines the presence or absence of surrounding objects around the target vehicle, and wherein the detection unit activates the detection operation of the signal-violating object when it is determined that the state of the target vehicle meets the reference state, and deactivates the detection operation of the signal-violating object when it is determined that the state of the target vehicle does not meet the reference state. \n | 12. delete\n | 13. The apparatus of claim 11, wherein the state analyzer derives lighting information of a traffic light structure placed in front of the target vehicle from an image acquired through a vision sensor mounted on the target vehicle, or receives lighting information of the traffic light structure of the road where the target vehicle is located through the V2I communication, and wherein the state analyzer determines whether the target vehicle is located within the reference distance based on precision road map (Local Dynamic Map, LDM) information provided in the target vehicle and GPS information of the target vehicle.\n | 14. The apparatus of claim 11, wherein the surrounding information includes object movement information including at least one of a relative speed, a relative distance, and a relative acceleration of an object traveling in the vicinity of the target vehicle, and wherein the detection unit detects the signal violation of the object based on the object movement information.\n | 15. 
The apparatus of claim 11, wherein the communication unit transmits license plate information of the signal-violating object identified from the report image together with the report image.", "output": "The method involves determining, based on surrounding information collected through a target vehicle (1) which is an autonomous vehicle, whether the target vehicle meets a predefined reference state. A signal-violating object is detected based on the surrounding information when the target vehicle meets the reference state. A report video captured to include the signal-violating object is transmitted to a predetermined infrastructure (200) linked with the self-driving vehicle based on the detection result. A determination is made whether signal information of the road is a stop signal and whether the target vehicle is located within a preset reference distance from a stop line of the road. An INDEPENDENT CLAIM is included for an apparatus for regulating signal-violating objects using a sensor of an autonomous vehicle and V2I communication. Method for regulating signal-violating objects using autonomous vehicle sensors and V2I communication. The self-driving vehicle detects a traffic-violating vehicle at an intersection and transmits a report image of the violating object, so that signal violations can be regulated using sensors and V2I communication. The method enables actively cracking down on traffic-violating vehicles in different locations using communication with the autonomous vehicle, without installing a separate information collection device around the road. The drawing shows a schematic configuration diagram of the enforcement system using the autonomous vehicle including the signal violation object enforcement device using the sensor of the autonomous vehicle and V2I communication. 
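The reference-state check of steps (a1)-(a3) above can be sketched minimally as follows (all names and the 5 m threshold are hypothetical; the claims leave the exact surrounding-object condition open, and this sketch assumes that at least one surrounding object must be present for enforcement to be meaningful):

```python
from dataclasses import dataclass


@dataclass
class TargetVehicleState:
    signal_is_stop: bool             # (a1) from vision sensor or V2I lighting info
    distance_to_stop_line_m: float   # (a2) from LDM precision map + GPS
    has_surrounding_objects: bool    # (a3) from lidar/vision detections


def meets_reference_state(state: TargetVehicleState,
                          reference_distance_m: float = 5.0) -> bool:
    """Enable violation detection only when the stopped autonomous
    vehicle is positioned to observe a signal-violating object."""
    return (state.signal_is_stop
            and state.distance_to_stop_line_m <= reference_distance_m
            and state.has_surrounding_objects)
```

When this predicate is false, the detection operation would be deactivated, mirroring the activation/deactivation logic in step (b).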
1Target vehicle10Enforcement system12Vision sensor100Signal violation object enforcement device200Infrastructure", "instruction": "Please summarize the input" }, { "input": "DYNAMIC AUTONOMOUS DRIVING ASSISTANT SYSTEM USING V2I COMMUNICATIONA dynamic autonomous driving support system using V2I communication is disclosed. A dynamic autonomous driving support method using V2I communication according to an embodiment of the present application may include receiving driving data including an operation design domain (ODD) and a driving route of a vehicle and sensor data collected by a sensor installed in the vehicle, collecting driving environment data related to the driving route, dynamically changing the driving design area based on the driving data, the sensor data, and the driving environment data, and transmitting information on the changed driving design area to the vehicle.|1. A method for supporting dynamic autonomous driving using V2I communication, the method comprising: receiving driving data including an operation design domain (ODD) and a driving route of a vehicle and sensor data collected by a sensor installed in the vehicle;\ncollecting driving environment data associated with the driving route; \ndynamically changing the driving design area based on the driving data, the sensor data, and the driving environment data; and transmitting information on the changed driving design area to the vehicle, wherein the dynamically changing of the driving design area includes dividing the driving route into link units and generating determination information on whether or not autonomous driving is permitted for each section of the driving route based on the driving data, the sensor data, and the driving environment data in consideration of the divided links, wherein, in the step of generating the determination information: first determination information, for determining that the vehicle is in a state capable of autonomous driving based on the sensor data, is generated for a link for which the vehicle is determined to be capable of autonomous driving with the sensor data alone, without the driving environment data; second determination information, for determining that the vehicle is in a state capable of autonomous driving based on the sensor data and the driving environment data, is generated for a link for which the vehicle cannot drive autonomously with the sensor data alone but is determined to be capable of autonomous driving by fusing the sensor data and the driving environment data; and third determination information, for determining that autonomous driving of the vehicle is impossible, is generated for a link for which autonomous driving is determined to be impossible even if the vehicle fuses the sensor data and the driving environment data, wherein the first to third determination information are individually generated for each unit section including the link or at least one link, wherein, in the step of collecting the driving environment data, the driving environment data associated with the driving route of the vehicle is selected from among the driving environment data obtained from infrastructure, and when the driving environment data for one link is redundantly received from a plurality of infrastructures, the data corresponding to one of the plurality of infrastructures is allocated as the driving environment data based on reliability information of each of the plurality of infrastructures, wherein, if an infrastructure is of a photographing device type, higher reliability information is given to a photographing device installed to photograph the road corresponding to the link over a relatively wide area, wherein an interface visualizing the first to third determination information is visualized through a display unit provided in a dynamic autonomous driving device provided in the vehicle or through a user terminal interworking with the dynamic autonomous driving device, wherein whether or not to perform autonomous driving for each link is determined based on an autonomous driving selection input for each link received through the interface, and wherein, if an autonomous driving setting input preset to perform autonomous driving for a section in which a specific type of determination information among the first to third determination information is generated is received in advance, the autonomous driving function is activated even if the autonomous driving selection input is not received for at least one of the link and the unit section in which the determination information of the specific type is generated. \n | 2. delete\n | 3. delete\n | 4. delete\n | 5. The method of claim 1, further comprising, after the transmitting step, transmitting the driving environment data collected for the link for which the second determination information is generated to the vehicle.\n | 6. 
The method of claim 1, wherein the generating of the determination information comprises: predicting road surface condition information and shadow information for a link through which the vehicle will pass based on the driving environment data; and generating the determination information for the corresponding link based on the predicted road surface condition information and shadow information.\n | 7. The method of claim 6, wherein the driving environment data includes time-series data on the road surface condition of the link, and the predicting comprises predicting the road surface condition information based on a Long Short-Term Memory (LSTM)-based model.\n | 8. The method of claim 6, wherein the driving environment data includes image data of the road surface of the link, and the predicting comprises predicting the shadow information based on a You Only Look Once (YOLO)-based model trained to predict the presence or absence of a shadow on the road surface and a change in the length of the shadow when the image data is input.\n | 9. 
In a dynamic autonomous driving method using V2I communication: transmitting, to a dynamic autonomous driving support device, driving data including an operation design domain (ODD) and a driving route of a vehicle and sensor data collected by a sensor installed in the vehicle;\nreceiving, from the dynamic autonomous driving support device, determination information on whether or not autonomous driving is possible for each section of the driving route, generated based on the driving data, the sensor data, and driving environment data collected in association with the driving route; and determining whether or not to perform autonomous driving for each section of the vehicle based on the determination information, wherein the dynamic autonomous driving support device divides the driving route into link units and, in consideration of the divided links, generates the determination information on autonomous driving for each section of the driving route based on the driving data, the sensor data, and the driving environment data, wherein the dynamic autonomous driving support device generates first determination information, for determining that the vehicle is in a state capable of autonomous driving based on the sensor data, for a link for which the vehicle is determined to be capable of autonomous driving with the sensor data alone, without the driving environment data; generates second determination information, for determining that the vehicle is in a state capable of autonomous driving based on the sensor data and the driving environment data, for a link for which the vehicle cannot drive autonomously with the sensor data alone but is determined to be capable of autonomous driving by fusing the sensor data and the driving environment data; and generates third determination information, for determining that the vehicle is in a state in which autonomous driving is impossible, for a link for which autonomous driving is determined to be impossible even if the vehicle fuses the sensor data and the driving environment data, wherein the first to third determination information are individually generated for each unit section including the link or at least one link, wherein the dynamic autonomous driving support device selects the driving environment data associated with the driving route of the vehicle from among the driving environment data obtained from infrastructure and, when the driving environment data for one link is redundantly received from a plurality of infrastructures, allocates the data corresponding to one of the plurality of infrastructures as the driving environment data based on reliability information of each of the plurality of infrastructures, wherein, if an infrastructure is of a photographing device type, higher reliability information is given to a photographing device installed to photograph the road corresponding to the link over a relatively wide area, the method further comprising, after the receiving step: displaying an interface visualizing the first to third determination information through a display unit provided in a dynamic autonomous driving device provided in the vehicle or through a user terminal interworking with the dynamic autonomous driving device; and receiving an autonomous driving selection input and an autonomous driving setting input for each link through the interface, wherein the step of determining whether or not to perform autonomous driving determines whether or not to perform autonomous driving for each link based on the autonomous driving selection input for each link, and, if an autonomous driving setting input preset to perform autonomous driving for a section in which a specific type of determination information among the first to third determination information is generated is received in advance, activates the autonomous driving function even if the autonomous driving selection input is not received for at least one of the link and the unit section in which the specific type of determination information is generated. \n | 10. delete\n | 11. The method of claim 9, wherein the determination information is individually generated for each link included in the driving route, and the determining step determines whether to perform autonomous driving for each link based on the determination information received for each link.\n | 12. The method of claim 11, further comprising: transmitting a driving environment data request signal for the link for which the second determination information is generated to the dynamic autonomous driving support device;\nreceiving the driving environment data from the dynamic autonomous driving support device; and performing autonomous driving on the corresponding link based on the sensor data and the driving environment data. \n | 13. 
A dynamic autonomous driving support device using V2I communication, comprising: a communication unit configured to receive driving data including an operation design domain (ODD) and a driving route of a vehicle and sensor data collected by a sensor installed in the vehicle;\na collection unit that collects driving environment data associated with the driving route; and a determination unit that dynamically changes the driving design area based on the driving data, the sensor data, and the driving environment data, wherein the determination unit divides the driving route into link units and, in consideration of the divided links, generates determination information on autonomous driving for each section of the driving route based on the driving data, the sensor data, and the driving environment data, wherein the determination unit generates first determination information, for determining that the vehicle is in a state capable of autonomous driving based on the sensor data, for a link for which the vehicle is determined to be capable of autonomous driving with the sensor data alone, without the driving environment data; generates second determination information, for determining that the vehicle is in a state capable of autonomous driving based on the sensor data and the driving environment data, for a link for which the vehicle cannot drive autonomously with the sensor data alone but is determined to be capable of autonomous driving by fusing the sensor data and the driving environment data; and generates third determination information, for determining that the vehicle is in a state in which autonomous driving is impossible, for a link for which autonomous driving is determined to be impossible even if the vehicle fuses the sensor data and the driving environment data, wherein the first to third determination information are individually generated for each unit section including the link or at least one link, wherein the communication unit transmits information on the changed driving design area to the vehicle, wherein the collection unit selects the driving environment data associated with the driving route of the vehicle from among the driving environment data obtained from infrastructure and, when the driving environment data for one link is redundantly received from a plurality of infrastructures, allocates the data corresponding to one of the plurality of infrastructures as the driving environment data based on reliability information of each of the plurality of infrastructures, wherein, if an infrastructure is of a photographing device type, higher reliability information is given to a photographing device installed to photograph the road corresponding to the link over a relatively wide area, wherein an interface visualizing the first to third determination information is visualized through a display unit provided in a dynamic autonomous driving device provided in the vehicle or through a user terminal interworking with the dynamic autonomous driving device, wherein whether or not to perform autonomous driving for each link is determined based on an autonomous driving selection input for each link received through the interface, and wherein, if an autonomous driving setting input preset to perform autonomous driving for a section in which a specific type of determination information among the first to third determination information is generated is received in advance, the autonomous driving function is activated even if the autonomous driving selection input is not received for at least one of the link and the unit section in which the determination information of the specific type is generated. \n | 14. delete\n | 15. The device of claim 13, wherein the determination unit generates the determination information for each divided link, and the communication unit transmits the driving environment data collected for the link for which the second determination information is generated to the vehicle, the dynamic autonomous driving support device.\n | 16. The device of claim 15, wherein the determination unit predicts road surface condition information and shadow information for a link through which the vehicle will pass based on the driving environment data, and generates the determination information for the link based on the predicted road surface condition information and shadow information, the dynamic autonomous driving support device.", "output": "The method involves receiving driving data including an operation design domain (ODD) and a driving route of a vehicle (1) and sensor data collected by a sensor installed in the vehicle. Driving environment data associated with the driving route is collected. The driving design area is changed dynamically based on the driving data, the sensor data, and the driving environment data. The information on the changed driving design area is transmitted to the vehicle. Determination information on whether autonomous driving is possible for each section of the driving route is generated based on the driving data, the sensor data, and the driving environment data. An INDEPENDENT CLAIM is included for a dynamic autonomous driving support device. Method for supporting dynamic autonomous driving using vehicle-to-infrastructure communication. The method enables supporting autonomous driving by actively determining the ODD for an autonomous vehicle according to road conditions. The method enables providing a dynamic autonomous driving support system using vehicle-to-infrastructure (V2I) communication so as to support safe and highly reliable autonomous driving. The drawing shows a schematic view of a dynamic autonomous driving support system using V2I communication. 
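The three-tier per-link determination and the reliability-based choice among duplicate infrastructure reports described above can be sketched as follows (names and the coverage-area field are illustrative assumptions; the patent only states that wider-coverage photographing devices receive higher reliability):

```python
from enum import Enum


class Determination(Enum):
    SENSOR_ONLY = 1      # first determination: sensor data alone suffices
    SENSOR_PLUS_V2I = 2  # second: autonomous driving needs fused V2I data
    NOT_POSSIBLE = 3     # third: autonomous driving impossible on this link


def classify_link(sensor_ok: bool, fused_ok: bool) -> Determination:
    """Generate the per-link determination information for one link."""
    if sensor_ok:
        return Determination.SENSOR_ONLY
    if fused_ok:
        return Determination.SENSOR_PLUS_V2I
    return Determination.NOT_POSSIBLE


def pick_infrastructure(candidates: list[dict]) -> dict:
    """When several camera-type infrastructures report the same link,
    keep the one covering the widest road area (its reliability proxy)."""
    return max(candidates, key=lambda infra: infra["coverage_area_m2"])
```

A vehicle would then request the collected driving environment data only for links classified as `SENSOR_PLUS_V2I`, matching the request/receive steps of claim 12.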
(Drawing includes non-English language text) 1Vehicle10Dynamic autonomous driving support system20Network100Dynamic autonomous driving support device300Infrastructure", "instruction": "Please summarize the input" }, { "input": "The vehicle charging and communication systemVehicle-to-vehicle charging and communication systems are provided. The vehicle-to-vehicle charging and communication system includes a first electric vehicle including a first battery unit and a first communication terminal unit, a second electric vehicle including a second battery unit and a second communication terminal unit, a server unit connected to the first and second communication terminal units through a mesh network, and a blockchain network in which the first electric vehicle corresponds to a first node, the second electric vehicle corresponds to a second node, and which connects the first and second nodes in a blockchain structure. The battery units and the communication terminal units are interlocked to provide the power requirement of each battery unit to the server unit through the corresponding communication terminal unit; the second electric vehicle approves the power charging request of the first electric vehicle; the first and second electric vehicles transmit and receive location information through the mesh network; and compensation is paid to the second electric vehicle through the blockchain network. \n\n|1. A first electric vehicle including a first battery unit and a first communication terminal unit; a second electric vehicle including a second battery unit and a second communication terminal unit; a server unit connected to the first and second communication terminal units through a mesh network; and a blockchain network in which the first electric vehicle corresponds to a first node, the second electric vehicle corresponds to a second node, and which connects the first and second nodes in a blockchain structure. 
wherein the battery unit and the communication terminal unit are interlocked to provide the power requirement of the battery unit to the server unit through the communication terminal unit; wherein, when the second electric vehicle approves the power charging request of the first electric vehicle, the first and second electric vehicles transmit and receive location information to and from each other through the mesh network, and compensation is paid to the second electric vehicle through the blockchain network; and wherein, when a power charging request is provided to a third electric vehicle within the communication range of the first electric vehicle and the third electric vehicle fails to provide power charging for the first electric vehicle, the power charging request is transmitted to the second electric vehicle within the communication range of the third electric vehicle, the vehicle-to-vehicle charging and communication system.\n | 2. The vehicle-to-vehicle charging and communication system of claim 1, wherein the first electric vehicle and the second electric vehicle move to a position where wireless charging is performed through autonomous driving control.\n | 3. The vehicle-to-vehicle charging and communication system of claim 1, wherein the first electric vehicle and the second electric vehicle exchange electric power through a mutual magnetic resonance method.\n | 4. The vehicle-to-vehicle charging and communication system of claim 1, wherein a wireless power receiver is installed at a front end of the first electric vehicle, and a wireless power transmitter is installed at a rear end of the second electric vehicle.\n | 5. delete\n | 6. delete\n | 7. delete\n | 8. delete\n | 9. delete\n | 10. delete\n | 11. delete\n | 12. delete\n | 13. delete", "output": "The system has a first electric vehicle (10) including a first battery unit and a first communication terminal unit in a power equipment management system. 
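Claim 1's relay behavior, forwarding a declined charging request to vehicles within the declining vehicle's communication range, can be sketched as a breadth-first search over the mesh network (identifiers, data shapes, and the hop limit are hypothetical assumptions):

```python
from collections import deque


def find_charger(requester_id, in_range, can_charge, max_hops=3):
    """Relay a power-charging request over the mesh network.

    in_range: vehicle id -> ids within its communication range.
    can_charge: predicate returning True if a vehicle approves the request.
    Returns the first approving vehicle id, or None if none is reachable.
    """
    seen = {requester_id}
    queue = deque((vid, 1) for vid in in_range.get(requester_id, []))
    while queue:
        vid, hops = queue.popleft()
        if vid in seen:
            continue
        seen.add(vid)
        if can_charge(vid):
            return vid  # this vehicle approves the power charging request
        if hops < max_hops:
            # Relay: forward the request within the declining vehicle's range.
            queue.extend((n, hops + 1) for n in in_range.get(vid, []))
    return None
```

In claim 1's scenario the "third" vehicle is in range of the requester but declines, so the request propagates one more hop to the "second" vehicle, which approves and is later compensated through the blockchain network.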
A second electric vehicle (20) includes a second battery unit and a second communication terminal unit. A server unit (30) is connected to the first and second communication terminal units through a mesh network. A block chain network (100) is provided in which the first electric vehicle corresponds to a first node, the second electric vehicle corresponds to a second node, and the first and second nodes are connected in a block chain structure. The battery unit and the communication terminal unit are interlocked so that the required power of the battery unit is provided to the server unit through the communication terminal unit. The first and second electric vehicles transmit and receive location information with each other through the mesh network when the second electric vehicle approves the request for power charging of the first electric vehicle. Inter-vehicle charging and communication system. The vehicle having a sufficient battery level can be searched for by sharing information between electric vehicles, and the vehicle requiring battery charging can receive power from that vehicle. The convenience of charging the electric vehicle is increased since power can be exchanged while the vehicle is running. The surplus power of the electric vehicle can be easily bought and sold, and the reward can be easily paid using the blockchain technology for the electric vehicle that provided the electric power. The drawing shows a block diagram illustrating an inter-vehicle charging and communication system. (Drawing includes non-English language text) 10 First electric vehicle, 20 Second electric vehicle, 30 Server unit, 100 Block chain network", "instruction": "Please summarize the input" }, { "input": "Systems and Methods for Training Machine-Learned Models with Deviating Intermediate RepresentationsSystems and methods for vehicle-to-vehicle communications are provided. An adverse system can obtain sensor data representative of an environment proximate to a targeted system. 
The adverse system can generate an intermediate representation of the environment and a representation deviation for the intermediate representation. The representation deviation can be designed to disrupt a machine-learned model associated with the target system. The adverse system can communicate the intermediate representation modified by the representation deviation to the target system. The target system can train the machine-learned model associated with the target system to detect the modified intermediate representation. Detected modified intermediate representations can be discarded before disrupting the machine-learned model.What is claimed is: \n | 1. A computer-implemented method, the method comprising: \nobtaining, by a computing system comprising one or more computing devices, sensor data representative of a secondary environment proximate to an autonomous vehicle; \ngenerating, by the computing system, an intermediate representation for the autonomous vehicle based, at least in part, on the sensor data, wherein the intermediate representation is descriptive of at least a portion of the secondary environment; \ndetermining, by the computing system, an intermediate representation deviation for the intermediate representation based, at least in part, on the intermediate representation and a machine-learned model associated with the autonomous vehicle; \ngenerating, by the computing system, data indicative of a modified intermediate representation based, at least in part, on the intermediate representation and the intermediate representation deviation; and \ncommunicating, by the computing system, the data indicative of the modified intermediate representation to a vehicle computing system associated with the autonomous vehicle. \n | 2. 
The computer-implemented method of claim 1, wherein the machine-learned model associated with the autonomous vehicle comprises a machine-learned model utilized by the vehicle computing system to detect one or more objects within a surrounding environment of the autonomous vehicle.\n | 3. The computer-implemented method of claim 2, wherein the machine-learned model is configured to output one or more bounding box proposals indicative of one or more objects within the surrounding environment of the autonomous vehicle based, at least in part, on one or more intermediate representations.\n | 4. The computer-implemented method of claim 3, wherein determining the intermediate representation deviation for the intermediate representation based, at least in part, on the intermediate representation and the machine-learned model associated with the autonomous vehicle comprises: \nobtaining, by the computing system via a second machine-learned model, one or more ground truth bounding box proposals based, at least in part, on the intermediate representation, wherein the second machine-learned model is the same as the machine-learned model; \nobtaining, by the computing system via the second machine-learned model, one or more deviating bounding box proposals based, at least in part, on the modified intermediate representation; and \nmodifying, by the computing system, the intermediate representation deviation for the intermediate representation based, at least in part, on a comparison between the one or more ground truth bound box proposals and the one or more deviating bounding box proposals. \n | 5. 
The computer-implemented method of claim 4, wherein modifying the intermediate representation deviation for the intermediate representation based, at least in part, on the comparison between the one or more ground truth bound box proposals and the one or more deviating bounding box proposals comprise: \ndetermining, by the computing system, an adversarial loss for the intermediate representation deviation based, at least in part, on the one or more ground truth bound box proposals and the one or more deviating bounding box proposals; and \nmodifying, by the computing system, the intermediate representation deviation based, at least in part, on adversarial loss, wherein the intermediate representation deviation is modified to minimize the adversarial loss over the one or more deviating bounding box proposals. \n | 6. The computer-implemented method of claim 5, wherein each respective ground truth bounding box proposal of the one or more ground truth bounding box proposals comprises a respective ground truth class score indicative of respective ground truth object classification and one or more respective ground truth bounding box parameters indicative of a respective ground truth spatial location and one or more respective ground truth dimensions of the respective ground truth object classification, and \nwherein each respective deviating bounding box proposal of the one or more respective deviating bounding box proposals comprises a respective deviating class score indicative of a respective deviating object classification and one or more respective deviating bounding box parameters indicative of a respective deviating spatial location and one or more respective deviating dimensions of the respective deviating object classification. \n | 7. 
The computer-implemented method of claim 6, wherein the adversarial loss is determined based, at least in part, on a difference between a ground truth class score corresponding to at least one ground truth bounding box proposal and a deviating class score corresponding to a deviating bounding box proposal corresponding to the at least one ground truth bounding box proposal.\n | 8. The computer-implemented method of claim 6, wherein the adversarial loss is determined based, at least in part, on a difference between one or more ground truth bounding box parameters corresponding to at least one ground truth bounding box proposal and one or more deviating bounding box parameters corresponding to a deviating bounding box proposal corresponding to the at least one ground truth bounding box proposal.\n | 9. The computer-implemented method of claim 4, wherein the computing system is onboard a transmitting autonomous vehicle physically located proximate to the autonomous vehicle, and wherein the intermediate representation deviation is associated with a first time.\n | 10. The computer-implemented method of claim 9, further comprising: \nobtaining, by the computing system, movement data indicative of a motion of the transmitting autonomous vehicle from the first time to a second time; \nobtaining, by the computing system, second sensor data representative of the secondary environment proximate to the autonomous vehicle at the second time; \ngenerating, by the computing system, a second intermediate representation for the autonomous vehicle based, at least in part, on the second sensor data; and \ndetermining, by the computing system, a second intermediate representation deviation for the second intermediate representation based, at least in part, on the intermediate representation deviation associated with the first time and the movement data. \n | 11. 
A computing system comprising: \none or more processors; and \none or more non-transitory computer-readable media that collectively store instructions that, when executed by the one or more processors, cause the system to perform operations, the operations comprising: \nobtaining a plurality of intermediate representations associated with an autonomous vehicle, wherein each intermediate representation is descriptive of at least a portion of a secondary environment proximate to the autonomous vehicle at a plurality of times; \ngenerating a surrogate machine-learned model based, at least in part, on the plurality of intermediate representations; \nobtaining a target intermediate representation from the plurality of intermediate representations; \ndetermining an intermediate representation deviation for the target intermediate representation based, at least in part, on the target intermediate representation and the surrogate machine-learned model; \ngenerating data indicative of a modified intermediate representation based, at least in part, on the target intermediate representation and the intermediate representation deviation; and \ncommunicating the data indicative of the modified intermediate representation to a vehicle computing system associated with the autonomous vehicle. \n | 12. The computing system of claim 11, wherein each intermediate representation of the plurality of intermediate representations is generated by a first portion of a machine-learned model associated with the autonomous vehicle, and wherein a first portion of the surrogate machine-learned model is trained to output a surrogate intermediate representation substantially similar to the plurality of intermediate representations.\n | 13. 
The computing system of claim 11, wherein generating the surrogate machine-learned model based, at least in part, on the plurality of intermediate representations comprises: \nobtaining sensor data representative of surrogate environment proximate to the autonomous vehicle; and \ngenerating the surrogate machine-learned model based, at least in part, on the plurality of intermediate representations and the sensor data. \n | 14. The computing system of claim 13, wherein generating the surrogate machine-learned model based, at least in part, on the plurality of intermediate representations further comprises: \ngenerating, via a first portion of the surrogate machine-learned model, a surrogate intermediate representation based, at least in part, on the sensor data; \ngenerating, via a machine-learned discriminator model, a discriminator loss based, at least in part, on the surrogate intermediate representation and at least one of the plurality of intermediate representations; and \ntraining the surrogate machine-learned model to minimize the discriminator loss. \n | 15. The computing system of claim 14, wherein the discriminator loss is indicative of a difference between the surrogate intermediate representation and the at least one intermediate representation.\n | 16. The computing system of claim 11, wherein a second portion of the surrogate machine-learned model is configured to output one or more bounding box proposals indicative of one or more objects within the secondary environment proximate to the autonomous vehicle based, at least in part, on the target intermediate representation.\n | 17. 
The computer-implemented method of claim 16, wherein determining the intermediate representation deviation for the target intermediate representation based, at least in part, on the target intermediate representation and the surrogate machine-learned model comprises: \nobtaining, via the second portion of the surrogate machine-learned model, one or more ground truth bounding box proposals based, at least in part, on the target intermediate representation; \nobtaining, via the second portion of the surrogate machine-learned model, one or more deviating bounding box proposals based, at least in part, on the modified intermediate representation; and \nmodifying the intermediate representation deviation for the target intermediate representation based, at least in part, on a comparison between the one or more ground truth bound box proposals and the one or more deviating bounding box proposals. \n | 18. An autonomous vehicle comprising: \none or more sensors; \none or more processors; and \none or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the one or more processors to perform operations, the operations comprising: \nobtaining, via the one or more sensors, sensor data representative of a surrounding environment of the autonomous vehicle; \ngenerating, via a first portion of a machine-learned model, an intermediate representation based, at least in part, on the sensor data, wherein the intermediate representation is descriptive of at least a portion of the surrounding environment of the autonomous vehicle; \ndetermining an intermediate representation deviation for the intermediate representation based, at least in part, on the intermediate representation and the machine-learned model; \ngenerating, data indicative of a modified intermediate representation based, at least in part, on the intermediate representation and the intermediate representation deviation; and 
\ncommunicating the data indicative of the modified intermediate representation to one or more devices associated with a target autonomous vehicle. \n | 19. The autonomous vehicle of claim 18, wherein the target autonomous vehicle is configured to utilize a second portion of the machine-learned model to detect one or more objects within a surrounding environment of the target autonomous vehicle.\n | 20. The autonomous vehicle of claim 18, wherein the machine-learned model is trained to detect the modified intermediate representation.", "output": "The method (800) involves obtaining (802) sensor data representative of a secondary environment proximate to an autonomous vehicle by a computing system, where the computing system is provided with computing devices. An intermediate representation for the autonomous vehicle is generated (804) based on the sensor data by the computing system, where the intermediate representation is descriptive of a portion of the secondary environment. Data indicative of the modified intermediate representation is generated (808) based on the intermediate representation and intermediate representation deviation by the computer system. The data indicative of the modified intermediate representation is communicated (810) to a vehicle computing system associated with the autonomous vehicle by the computing system. An INDEPENDENT CLAIM is included for a system for training a machine-learned model for performing operations of an autonomous vehicle. Method for training a machine-learned model for performing operations of an autonomous vehicle (claimed) i.e. car (from drawings). The method enables improving safety of passengers of an autonomous vehicle, safety of surroundings of the autonomous vehicle and experience of a rider and/or operator of the vehicle, and reducing traffic congestion in communities as well as providing alternate forms of transportation that provide environmental benefits. 
The drawing shows a flow diagram illustrating a method for training a machine-learned model for performing operations of an autonomous vehicle. 800 Method for training machine-learned model for performing operations of autonomous vehicle, 802 Obtaining sensor data representative of secondary environment proximate to autonomous vehicle by computing system, 804 Generating intermediate representation for autonomous vehicle based on sensor data by computing system, 808 Generating data indicative of modified intermediate representation, 810 Communicating data indicative of modified intermediate representation to vehicle computing system", "instruction": "Please summarize the input" }, { "input": "Telecommunications Network For VehiclesSystems and methods for facilitating communication with autonomous vehicles are provided. In one example embodiment, the autonomous vehicle telecommunications network system includes a first point-of-presence (POP) interface configured to obtain a first communication associated with a first autonomous vehicle and to route the first communication associated with the first autonomous vehicle. The autonomous vehicle telecommunications network system includes a first security system configured to authenticate the first autonomous vehicle. The first POP interface is separate from the first security system. The autonomous vehicle telecommunications network system includes a first data center system configured to receive the first communication associated with the first autonomous vehicle that is authenticated and to provide data to the first autonomous vehicle. The first POP interface is separate from the first data center system. The autonomous vehicle telecommunications network system can include a similarly configured second POP interface, second security system, and second data center system.|1. 
A vehicle telecommunications network system comprising: \na first point-of-presence interface configured to obtain a first communication associated with a first vehicle and to route the first communication, wherein the first point-of-presence interface is remote from the first vehicle; \na first security system configured to authenticate the first vehicle, wherein the first security system is remote from the first vehicle; \na first data center system that is remote from the first vehicle, the first data center system configured to receive the first communication associated with the first vehicle that is authenticated and to provide data to the first vehicle; \na second point-of-presence interface configured to obtain a second communication associated with a second vehicle and to route the second communication, wherein the second point-of-presence interface is remote from the second vehicle and the first vehicle; \na second security system configured to authenticate the second vehicle, wherein the second security system is remote from the second vehicle and the first vehicle; and \na second data center system that is remote from the first vehicle and the second vehicle, the second data center system configured to receive the second communication associated with the second vehicle that is authenticated and to provide data to the second vehicle, wherein the first point-of-presence interface, the first security system, and the first data center are remote from the second vehicle. \n | 2. The vehicle telecommunications network system of claim 1, wherein the first point-of-presence interface is separate from the first security system and the first data center system, and wherein the second point-of-presence interface is separate from the second security system and the second data center system.\n | 3. 
The vehicle telecommunications network system of claim 1, wherein the first point-of-presence interface, the first security system, and the first data center system are associated with a first geographic region, and \nwherein the second point-of-presence interface, the second security system, and the second data center system are associated with a second geographic region that is different from the first geographic region. \n | 4. The vehicle telecommunications network system of claim 3, wherein the first vehicle is located within the first geographic region and wherein the second vehicle is located within the second geographic region.\n | 5. The vehicle telecommunications network system of claim 1, wherein the first point-of-presence interface is further configured to obtain the second communication associated with the second vehicle when the second point-of-presence interface is unavailable.\n | 6. The vehicle telecommunications network system of claim 1, wherein the second point-of-presence interface is further configured to obtain the first communication associated with the first vehicle when the first point-of-presence interface is unavailable.\n | 7. The vehicle telecommunications network system of claim 1, further comprising: \na first vehicle assistance system configured to facilitate a provision of assistance to at least one of the first vehicle or a first user of the first vehicle; and \na second vehicle assistance system configured to facilitate a provision of assistance to at least one of the second vehicle or a second user of the second vehicle. \n | 8. 
A vehicle telecommunications network system comprising: \na first point-of-presence interface configured to obtain a first communication associated with a first vehicle and to route the first communication, wherein the first point-of-presence interface is remote from the first vehicle; \na first security system configured to authenticate the first vehicle, wherein the first point-of-presence interface is separate from the first security system, wherein the first security system is remote from the first vehicle; and \na first data center system that is remote from the first vehicle, the first data center system configured to receive the first communication associated with the first vehicle that is authenticated and to provide data to the first vehicle, wherein the first point-of-presence interface is separate from the first data center system. \n | 9. The vehicle telecommunications network system of claim 8, further comprising: \na second point-of-presence interface configured to obtain a second communication associated with a second vehicle and to route the second communication associated with the second vehicle, wherein the second point-of-presence interface is remote from the second vehicle and the first vehicle; \na second security system configured to authenticate the second vehicle, wherein the second point-of-presence interface is separate from the second security system, wherein the second security system is remote from the second vehicle and the first vehicle; and \na second data center system that is remote from the first vehicle and the second vehicle, the second data center system configured to receive the second communication associated with the second autonomous vehicle that is authenticated and to provide data to the second autonomous vehicle, wherein the second point-of-presence interface is separate from the second data center system, wherein the first point-of-presence interface, the first security system, and the first data center are remote from 
the second vehicle. \n | 10. The vehicle telecommunications network system of claim 9, wherein the first point-of-presence interface, the first security system, and the first data center system are associated with a first geographic region, and \nwherein the second point-of-presence interface, the second security system, and the second data center system are associated with a second geographic region that is different from the first geographic region. \n | 11. The vehicle telecommunications network system of claim 10, wherein the first vehicle is located within the first geographic region, and wherein the second vehicle is located within the second geographic region.\n | 12. The vehicle telecommunications network system of claim 9, wherein the first point-of-presence interface is further configured to obtain the second communication associated with the second vehicle when the second point-of-presence interface is unavailable, and wherein the second point-of-presence interface is further configured to obtain the first communication associated with the first vehicle when the first point-of-presence interface is unavailable.\n | 13. The vehicle telecommunications network system of claim 8, wherein the first point-of-presence interface is further configured to route the first communication associated with the first vehicle to vehicle assistance system.\n | 14. The vehicle telecommunications network system of claim 8, wherein the first point-of-presence interface is configured to allow the first vehicle to access a public internet network.\n | 15. The vehicle telecommunications network system of claim 8, wherein the first point-of-presence interface is physically separate from the first security system and the first data center system.\n | 16. The vehicle telecommunications network system of claim 8, wherein the first point-of-presence interface is logically separate from the first security system and the first data center system.\n | 17. 
The vehicle telecommunications network system of claim 8, wherein the vehicle telecommunications network system does not utilize internet protocol security.\n | 18. A computer-implemented method for facilitating communication with vehicles comprising: \nobtaining, by a point-of-presence interface, a communication associated with a vehicle, wherein the point-of-presence interface is remote from the vehicle; \ndetermining, by the point-of-presence interface, a recipient computing system that is remote from the vehicle based at least in part on the communication from the vehicle, wherein the recipient computing system is separate from the point-of-presence interface; \nfacilitating, by the point-of-presence interface, an authentication of the vehicle by a security system, wherein the point-of-presence interface is separate from the security system; and \nproviding, by the point-of-presence interface, the communication associated with the vehicle that is authenticated to the recipient computing system. \n | 19. The computer-implemented method of claim 18, further comprising: \nobtaining, by the point-of-presence interface from the recipient computing system, data in response to the communication; and \nproviding, by the point-of-presence interface, the data to the vehicle. \n | 20. 
The computer-implemented method of claim 18, wherein the point-of-presence interface is a first point-of-presence interface associated with a first geographic region, wherein the vehicle is located in a second geographic region that is different from the first geographic region, wherein a second point-of-presence interface associated with the second geographic region is unavailable, and wherein obtaining the communication associated with the vehicle comprises: \nobtaining, by the point-of-presence interface, a communication from the vehicle located in the second geographic region based at least in part on the second point-of-presence interface being unavailable.", "output": "The vehicle telecommunication network system (100) has a first point-of-presence interface and a second point-of-presence interface that obtains first and second communication associated with first and second vehicles and routes the first and second communication. A first security system and a second security system authenticates the first and second vehicles. The second security system is remote from the second vehicle and the first vehicle. A first data center system is remote from the first vehicle. A second data center system that is remote from the first vehicle and the second vehicle. The first and second data center systems receive the first and second communication associated with the first and second vehicles that are authenticated and to provide data to the first and second vehicles. The first point-of-presence interface, the first security system and the first data center are remote from the second vehicle. An INDEPENDENT CLAIM is included for a computer-implemented method for facilitating communication with vehicles. Vehicle telecommunication network system for communicating data to and from vehicle e.g. autonomous vehicle. 
The autonomous vehicle telecommunication network system provides an improved infrastructure to facilitate communication between an autonomous vehicle and computing systems that are remote from the vehicle. The system provides redundant regional telecommunication systems that decrease transmission latency and offer better reliability and scalability. The drawing shows a schematic view of an autonomous vehicle system. 100 Vehicle telecommunication network system, 105 Vehicle computing system, 110 Vehicle, 115 Operation computing system, 120 User", "instruction": "Please summarize the input" }, { "input": "vehicle management systemThe invention claims a system, method and vehicle for taking a vehicle out of service. In one example embodiment, a method includes obtaining, by one or more computing devices onboard an autonomous vehicle, data indicative of one or more parameters associated with the autonomous vehicle, wherein the autonomous vehicle is configured to provide a vehicle service to one or more users of the vehicle service. The method includes determining, by the computing devices, based at least in part on the one or more parameters associated with the autonomous vehicle, that a fault associated with the autonomous vehicle exists. The method includes determining, by the computing devices, based at least in part on the existence of the fault, one or more actions to be performed by the autonomous vehicle. The method includes performing, by the computing devices, the one or more actions based at least in part on the fault such that the autonomous vehicle stops providing the vehicle service.|1. 
A computer-implemented method for taking a vehicle out of service, comprising: obtaining, by one or more computing devices onboard an autonomous vehicle, data indicative of one or more parameters associated with the autonomous vehicle, wherein the autonomous vehicle is configured to provide a vehicle service to one or more users of the vehicle service; determining, by the one or more computing devices, based at least in part on the one or more parameters associated with the autonomous vehicle, that a fault associated with the autonomous vehicle exists; determining, by the one or more computing devices, based at least in part on the existence of the fault, one or more actions to be performed by the autonomous vehicle; and performing, by the one or more computing devices, the one or more actions based at least in part on the fault such that the autonomous vehicle stops providing the vehicle service.\n | 2. The computer-implemented method of claim 1, wherein the autonomous vehicle is associated with a state indicating whether the autonomous vehicle is available or unavailable to provide the vehicle service, wherein at least one of the actions comprises adjusting the state associated with the autonomous vehicle, and wherein performing, by the one or more computing devices, the one or more actions such that the vehicle stops service comprises: adjusting, by the one or more computing devices, the state associated with the autonomous vehicle to indicate that the autonomous vehicle is unavailable to provide the vehicle service.\n | 3. 
The computer-implemented method of claim 2, wherein adjusting, by the one or more computing devices, the state associated with the autonomous vehicle to indicate that the autonomous vehicle is unavailable to provide the vehicle service comprises: providing, by the one or more computing devices, data indicating that the autonomous vehicle is unavailable to provide the vehicle service to one or more remote computing devices that are remote from the autonomous vehicle.\n | 4. The computer-implemented method of claim 3, wherein performing, by the one or more computing devices, the one or more actions based at least in part on the fault such that the autonomous vehicle stops service comprises removing, by the one or more computing devices, the autonomous vehicle from a service queue associated with the vehicle service.\n | 5. The computer-implemented method of claim 2, wherein the autonomous vehicle does not accept a request for the vehicle service unless the state associated with the vehicle indicates that the autonomous vehicle is available to provide the vehicle service.\n | 6. The computer-implemented method of claim 1, wherein determining, by the one or more computing devices, the one or more actions comprises: determining, by the one or more computing devices, a level of severity of the fault based at least in part on one or more characteristics of the fault; determining, by the one or more computing devices, an operational state of the autonomous vehicle based at least in part on the level of severity of the fault, wherein the operational state indicates whether the autonomous vehicle is in a condition to provide the vehicle service; and determining, by the one or more computing devices, the one or more actions based at least in part on the operational state of the autonomous vehicle.\n | 7.
The computer-implemented method of claim 6, wherein the operational state indicates that the autonomous vehicle is in a condition to provide the vehicle service to one or more current users of the autonomous vehicle, and wherein the autonomous vehicle is configured to stop service after completing the vehicle service currently being provided to the one or more current users.\n | 8. The computer-implemented method of claim 6, wherein the operational state indicates that the vehicle is not in a condition to provide the vehicle service.\n | 9. The computer-implemented method of claim 8, wherein the autonomous vehicle is configured to stop providing the vehicle service to the one or more current users.\n | 10. The computer-implemented method of claim 1, wherein at least one of the actions comprises traveling to and arriving at a maintenance location, and wherein the method further comprises: providing, by the one or more computing devices, one or more control command signals to one or more systems onboard the autonomous vehicle to cause the autonomous vehicle to travel to and arrive at the maintenance location.\n | 11. The computer-implemented method of claim 10, wherein determining, by the one or more computing devices, based at least in part on the one or more parameters associated with the autonomous vehicle, that the fault associated with the autonomous vehicle exists comprises comparing, by the one or more computing devices, at least one of the parameters associated with the autonomous vehicle to a threshold.\n | 12.
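As an illustration only, the severity-to-action flow described in claims 6 through 10 (score the fault, derive an operational state, then pick out-of-service actions) might be organized as in the sketch below. This is not the patented implementation; every function name, severity characteristic, state label, and action label is invented for the example.

```python
# Illustrative sketch of a severity-driven action planner (all names and
# thresholds are assumptions, not taken from the patent claims).

def severity(fault: dict) -> int:
    """Score fault severity from its characteristics (0 = minor, 2 = critical)."""
    score = 0
    if fault.get("safety_critical"):
        score += 2
    if fault.get("recurring"):
        score += 1
    return min(score, 2)

def plan_actions(fault: dict) -> list[str]:
    """Derive an operational state from severity, then choose actions."""
    sev = severity(fault)
    if sev == 0:
        # In condition to finish serving current users before going out of service.
        state = "finish_current_service"
        actions = ["mark_unavailable", "travel_to_maintenance"]
    elif sev == 1:
        state = "stop_after_dropoff"
        actions = ["mark_unavailable", "remove_from_service_queue",
                   "travel_to_maintenance"]
    else:
        # Not in a condition to provide the vehicle service at all.
        state = "stop_immediately"
        actions = ["mark_unavailable", "remove_from_service_queue",
                   "notify_users", "stop_motion"]
    return [state] + actions
```

In this toy version the operational state is just the first element of the returned plan; a real system would track it separately and report it to the remote service-provider computing devices.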
The computer-implemented method of claim 11, further comprising: obtaining, by the one or more computing devices, data indicative of the maintenance location, wherein the data indicative of the maintenance location indicates at least a geographic location of the maintenance location; determining, by the one or more computing devices, a travel route to the maintenance location based at least in part on the geographic location of the maintenance location; obtaining, by the one or more computing devices, data indicating one or more travel factors associated with the travel route; and determining, by the one or more computing devices, the threshold based at least in part on the travel route and the one or more travel factors, wherein the threshold indicates a level of the at least one parameter needed for the autonomous vehicle to traverse the travel route and arrive at the geographic location of the maintenance location.\n | 13. A computing system for taking a vehicle out of service, the system comprising: one or more processors onboard an autonomous vehicle; and one or more memory devices onboard the autonomous vehicle, the one or more memory devices storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations, the operations comprising: obtaining data indicating one or more parameters associated with the autonomous vehicle, wherein the autonomous vehicle is configured to provide a vehicle service to one or more users of the vehicle service,
and wherein the autonomous vehicle is associated with a state indicating whether the autonomous vehicle is available or unavailable to provide the vehicle service; determining that a fault associated with the autonomous vehicle exists based at least in part on a comparison of the one or more parameters associated with the autonomous vehicle to one or more thresholds; determining one or more actions to be performed by the autonomous vehicle based at least in part on the existence of the fault, wherein the one or more actions comprise adjusting the state associated with the autonomous vehicle; and adjusting, based at least in part on the fault, the state associated with the autonomous vehicle to indicate that the autonomous vehicle is unavailable to provide the vehicle service.\n | 14. The computing system of claim 13, wherein adjusting the state associated with the autonomous vehicle to indicate that the autonomous vehicle is unavailable to provide the vehicle service comprises: providing data indicating that the autonomous vehicle is to be removed from a service queue associated with the vehicle service to one or more remote computing devices that are remote from the autonomous vehicle.\n | 15. The computing system of claim 13, wherein at least one of the actions comprises traveling to and arriving at a maintenance location, and wherein the operations further comprise: providing one or more control command signals to one or more systems onboard the autonomous vehicle to cause the autonomous vehicle to travel to and arrive at the maintenance location.\n | 16. The computing system of claim 15, wherein at least one of the parameters indicates an amount of available data storage of a data storage device onboard the autonomous vehicle.\n | 17.
The computing system of claim 16, wherein at least one threshold indicates a threshold amount of available data storage of the data storage device, and wherein the threshold is based at least in part on an amount of data storage required for the autonomous vehicle to travel to and arrive at the maintenance location.\n | 18. An autonomous vehicle, comprising: one or more systems onboard the autonomous vehicle; one or more processors onboard the autonomous vehicle; and one or more memory devices onboard the autonomous vehicle, the one or more memory devices storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations, the operations comprising: obtaining data indicating one or more parameters associated with the autonomous vehicle, wherein at least a portion of the data is provided by the one or more systems onboard the autonomous vehicle, wherein the autonomous vehicle is included in a plurality of vehicles associated with a service provider, and wherein the autonomous vehicle is configured to provide a vehicle service of the service provider to one or more users of the vehicle service; determining that a fault associated with the autonomous vehicle exists based at least in part on the one or more parameters associated with the autonomous vehicle; and performing one or more actions based at least in part on the fault such that the autonomous vehicle stops service and is unavailable to provide the vehicle service.\n | 19. The autonomous vehicle of claim 18, wherein one or more computing devices associated with the service provider do not provide requests for the vehicle service to the autonomous vehicle while the autonomous vehicle is out of service.\n | 20.
The autonomous vehicle of claim 18, wherein the one or more systems onboard the autonomous vehicle comprise one or more image capture devices configured to obtain image data to be used for operation of the autonomous vehicle in an autonomous mode, and wherein the fault is associated with storing the image data.\n | 21. A computer-implemented method of stopping the motion of a vehicle, comprising: obtaining, by one or more computing devices onboard an autonomous vehicle, data indicating one or more parameters associated with the autonomous vehicle, wherein the autonomous vehicle is configured to provide a vehicle service to one or more users of the vehicle service; determining, by the one or more computing devices, based at least in part on the one or more parameters associated with the autonomous vehicle, that a fault associated with the autonomous vehicle exists; determining, by the one or more computing devices, one or more actions to be performed by the autonomous vehicle based at least in part on the existence of the fault, wherein at least one of the actions comprises stopping the motion of the autonomous vehicle; and providing, by the one or more computing devices, one or more control command signals to one or more systems onboard the autonomous vehicle to facilitate stopping the motion of the autonomous vehicle in response to the existence of the fault.\n | 22.
The computer-implemented method of claim 21, wherein, to facilitate stopping the motion of the vehicle, the one or more onboard systems determine at least one of a stopping location of the autonomous vehicle, a rate of deceleration, and a time delay of deceleration based at least in part on the fault associated with the autonomous vehicle and one or more traveling conditions, and wherein, to facilitate stopping the motion of the autonomous vehicle, the one or more systems onboard the autonomous vehicle cause the autonomous vehicle to decelerate.\n | 23. The computer-implemented method of claim 22, wherein the stopping location is within a current lane of travel of the autonomous vehicle.\n | 24. The computer-implemented method of claim 22, wherein the stopping location is outside of the current lane of travel of the autonomous vehicle.\n | 25. The computer-implemented method of claim 21, wherein determining, by the one or more computing devices, the one or more actions to be performed by the one or more systems onboard the autonomous vehicle comprises: determining, by the one or more computing devices, a level of severity of the fault based at least in part on one or more characteristics of the fault; and determining, by the one or more computing devices, the one or more actions based at least in part on the level of severity of the fault.\n | 26. The computer-implemented method of claim 25, wherein the rate of deceleration of the autonomous vehicle is based at least in part on the level of severity associated with the fault.\n | 27.
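Claims 22 through 26 tie the stopping plan (stopping location, deceleration rate) to fault severity and traveling conditions. A hypothetical sketch of that mapping follows; the deceleration values, the 0-2 severity scale, and the location labels are invented for illustration and are not from the patent.

```python
# Hypothetical stopping-plan selection (values and names are assumptions):
# higher fault severity -> harder braking and a more immediate stop location.

def stop_plan(severity: int, lane_clear_ahead: bool) -> dict:
    """Return a stopping plan for a fault of the given severity (0-2)."""
    rate = {0: 1.0, 1: 2.5, 2: 4.0}[severity]   # deceleration in m/s^2, illustrative
    # A critical fault stops in the current lane; otherwise prefer leaving it
    # when traveling conditions allow.
    if severity == 2 or not lane_clear_ahead:
        location = "current_lane"
    else:
        location = "outside_current_lane"
    return {"decel_rate_mps2": rate, "stop_location": location}
```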
The computer-implemented method of claim 21, wherein at least one of the actions comprises taking the vehicle out of service, and wherein the method further comprises: performing, by the one or more computing devices, the one or more actions such that the autonomous vehicle stops service and is unavailable to provide the vehicle service.\n | 28. The computer-implemented method of claim 21, wherein the one or more actions comprise notifying one or more users of the fault, and wherein the method further comprises: providing, by the one or more computing devices, data indicating the existence of the fault for display via one or more display devices.\n | 29. The computer-implemented method of claim 21, wherein the one or more actions comprise requesting that a different vehicle provide the vehicle service to the one or more users, and wherein the method further comprises: providing, by the one or more computing devices, data indicating the request that the different vehicle provide the vehicle service to the one or more users to one or more remote computing devices associated with a service provider.\n | 30. The computer-implemented method of claim 21, wherein the one or more parameters comprise data indicating a user input associated with the fault.\n | 31. A computing system for stopping the motion of a vehicle, comprising: one or more processors onboard an autonomous vehicle; and one or more memory devices onboard the autonomous vehicle,
the one or more memory devices storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations, the operations comprising: obtaining data indicating one or more parameters associated with the autonomous vehicle, wherein the autonomous vehicle is configured to provide a vehicle service to one or more users of the vehicle service; determining, based at least in part on the one or more parameters associated with the autonomous vehicle, that a fault associated with the autonomous vehicle exists; determining, based at least in part on the fault, one or more actions to be performed by the autonomous vehicle, wherein at least one of the actions comprises stopping the motion of the autonomous vehicle; and providing one or more control command signals to one or more systems onboard the autonomous vehicle to facilitate stopping the motion of the autonomous vehicle in response to the existence of the fault.\n | 32. The computing system of claim 31, wherein determining the one or more actions to be performed by the autonomous vehicle comprises determining a level of severity associated with the fault, and determining the one or more actions based at least in part on the level of severity associated with the fault.\n | 33. The computing system of claim 31, wherein the operations further comprise: providing data indicating a request for maintenance of the vehicle to one or more remote computing devices that are remote from the autonomous vehicle.\n | 34.
The computing system of claim 31, wherein the one or more parameters comprise data indicating a user input associated with the fault, and wherein the operations further comprise: providing, to one or more remote computing devices that are remote from the autonomous vehicle, data indicating a request for confirmation of the existence of the fault.\n | 35. The computing system of claim 31, wherein the operations further comprise: providing, via one or more display devices, data indicating one or more characteristics of the fault for display.\n | 36. An autonomous vehicle, comprising: one or more systems onboard the autonomous vehicle; one or more processors onboard the autonomous vehicle; and one or more memory devices onboard the autonomous vehicle, the one or more memory devices storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations, the operations comprising: obtaining data indicating one or more parameters associated with the autonomous vehicle, wherein at least a portion of the data indicating the one or more parameters is provided via the one or more systems onboard the autonomous vehicle, and wherein the autonomous vehicle is configured to provide a vehicle service to one or more users of the vehicle service; determining, based at least in part on the one or more parameters associated with the autonomous vehicle, that a fault associated with the autonomous vehicle exists; determining, based at least in part on the fault, one or more actions to be performed by the autonomous vehicle; and providing one or more control command signals to the one or more systems onboard the autonomous vehicle to perform the one or more actions to facilitate stopping the motion of the autonomous vehicle in
response to the existence of the fault.\n | 37. The autonomous vehicle of claim 36, wherein the one or more onboard systems are configured to determine a stopping location of the vehicle based at least in part on the fault.\n | 38. The autonomous vehicle of claim 36, wherein the one or more parameters comprise a user input indicating a request by the user to stop the vehicle.\n | 39. The autonomous vehicle of claim 38, further comprising one or more display devices, and wherein the operations further comprise: providing, via the one or more display devices, data indicating the request to stop the vehicle for display.\n | 40. The autonomous vehicle of claim 39, further comprising one or more audio output devices, and wherein the operations further comprise: providing, to one or more remote computing devices that are remote from the autonomous vehicle, data indicating a request for a human operator to communicate with a current user of the autonomous vehicle via at least one of the display devices and the audio output devices.", "output": "The method involves obtaining data (604,608) that indicates parameters associated with the autonomous vehicle. The existence of a fault associated with the autonomous vehicle is determined. A service request associated with the vehicle service is rejected. The level of severity of the fault is determined by the computing devices based on the characteristics of the fault. An operational state of the autonomous vehicle is determined based on the level of severity of the fault. The autonomous vehicle is controlled to travel to and arrive at a maintenance location. INDEPENDENT CLAIMS are included for the following: a computing system for taking a vehicle out-of-service; and an autonomous vehicle having processors and a memory device storing instructions to obtain parameter data.
Method for taking a vehicle, particularly an autonomous vehicle (claimed), out of service. Potential latency issues that would arise from having a remote computing device process vehicle fault diagnosis requests are avoided. A computationally efficient approach to addressing vehicle faults is provided, saving computational resources. The computational response time for addressing determined faults is reduced. The drawing shows a schematic view of a user interface. 600User interface602Display device604,608Data606User interface element", "instruction": "Please summarize the input" }, { "input": "Systems and Methods for Mitigating Vehicle Pose Error Across an Aggregated Feature MapSystems and methods for improved vehicle-to-vehicle communications are provided. A system can obtain sensor data depicting its surrounding environment and input the sensor data (or processed sensor data) to a machine-learned model to perceive its surrounding environment based on its location within the environment. The machine-learned model can generate an intermediate environmental representation that encodes features within the surrounding environment. The system can receive a number of different intermediate environmental representations and corresponding locations from various other systems, aggregate the representations based on the corresponding locations, and perceive its surrounding environment based on the aggregated representations. The system can determine relative poses between each of the systems and an absolute pose for each system based on the representations. Each representation can be aggregated based on the relative or absolute poses of each system and weighted according to an estimated accuracy of the location corresponding to the representation.What is claimed is: \n | 1.
A computer-implemented method, the method comprising: \nobtaining, by a computing system comprising one or more computing devices onboard an autonomous vehicle, sensor data associated with an environment of a first autonomous vehicle; \nobtaining, by the computing system, estimated location data indicative of a first estimated pose of the first autonomous vehicle; \ndetermining, by the computing system, a first intermediate environmental representation of at least a first portion of the environment of the first autonomous vehicle based, at least in part, on the sensor data; \nobtaining, by the computing system, a first message from a second autonomous vehicle, wherein the first message comprises a second intermediate environmental representation of at least a second portion of the environment of the first autonomous vehicle and second estimated location data indicative of a second estimated pose of the second autonomous vehicle; \ndetermining, by the computing system, a first relative pose between the first autonomous vehicle and the second autonomous vehicle based, at least in part, on the first intermediate environmental representation and the second intermediate environmental representation; \ngenerating, by the computing system, an updated intermediate environmental representation based, at least in part, on the first intermediate environmental representation, the second intermediate environmental representation, and the first relative pose; and \ngenerating, by the computing system, an autonomy output for the first autonomous vehicle based, at least in part, on the updated intermediate environmental representation. \n | 2.
The computer-implemented method of claim 1, wherein the first intermediate environmental representation is a first feature map encoded with a first plurality of encoded features representative of the first portion of the environment; and \nwherein the second intermediate environmental representation is a second feature map encoded with a second plurality of encoded features representative of the second portion of the environment. \n | 3. The computer-implemented method of claim 2, wherein determining the first relative pose between the first autonomous vehicle and the second autonomous vehicle based, at least in part, on the first intermediate environmental representation and the second intermediate environmental representation, comprises: \ngenerating, by the computing system, an intermediate environmental representation pair by concatenating the first plurality of encoded features and the second plurality of encoded features along a features dimension; \ninputting, by the computing system, the intermediate environmental representation pair to a machine-learned relative pose regression model configured to output the first relative pose based, at least in part, on the intermediate environmental representation pair. \n | 4.
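The concatenate-and-regress step of claim 3 can be sketched minimally with numpy, with random arrays standing in for the learned feature maps and the regression model's weights. All shapes, the pooling step, and the linear stand-in are assumptions for illustration, not the patent's actual machine-learned relative pose regression model.

```python
# Minimal sketch of claim 3: concatenate two intermediate feature maps along
# the features (channel) dimension, then regress a relative pose from the pair.
import numpy as np

rng = np.random.default_rng(0)

C, H, W = 8, 4, 4
own_features = rng.standard_normal((C, H, W))    # ego vehicle's feature map
recv_features = rng.standard_normal((C, H, W))   # feature map from the sender

# Concatenate along the features dimension -> shape (2C, H, W).
pair = np.concatenate([own_features, recv_features], axis=0)

# Stand-in regressor: global average pool, then a random "linear layer"
# producing a relative pose (dx, dy, dtheta). A trained model would replace this.
pooled = pair.mean(axis=(1, 2))                  # shape (2C,)
weights = rng.standard_normal((3, 2 * C))
relative_pose = weights @ pooled                 # shape (3,)
```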
The computer-implemented method of claim 1, wherein the method further comprises: \nobtaining, by the computing system, a second message from a third autonomous vehicle, wherein the second message comprises a third intermediate environmental representation of at least a third portion of the environment of the first autonomous vehicle and third estimated location data indicative of a third estimated pose of the third autonomous vehicle; \ndetermining, by the computing system, a second relative pose between the first autonomous vehicle and the third autonomous vehicle based, at least in part, on the first intermediate environmental representation and the third intermediate environmental representation; and \ngenerating, by the computing system, the updated intermediate environmental representation based, at least in part, on the third intermediate environmental representation and the second relative pose between the first autonomous vehicle and the third autonomous vehicle. \n | 5. The computer-implemented method of claim 4, wherein the first relative pose is indicative of a first displacement between the first autonomous vehicle and the second autonomous vehicle, wherein the second relative pose is indicative of a second displacement between the first autonomous vehicle and the third autonomous vehicle, and wherein the method further comprises: \ndetermining, by the computing system, a third relative pose between the second autonomous vehicle and the third autonomous vehicle based, at least in part, on the second intermediate environmental representation and the third intermediate environmental representation, the third relative pose indicative of a third displacement between the second autonomous vehicle and the third autonomous vehicle; and \ngenerating, by the computing system, the updated intermediate environmental representation based, at least in part, on the third relative pose. \n | 6.
The computer-implemented method of claim 5, wherein the first estimated pose is indicative of one or more first estimated spatial coordinates and a first estimated orientation for the first autonomous vehicle, the second estimated pose is indicative of one or more second estimated spatial coordinates and a second estimated orientation for the second autonomous vehicle, and the third estimated pose is indicative of one or more third estimated spatial coordinates and a third estimated orientation for the third autonomous vehicle.\n | 7. The computer-implemented method of claim 5, further comprising: \ndetermining, by the computing system, a first absolute pose for the first autonomous vehicle based, at least in part, on the first relative pose, the second relative pose, and the third relative pose; \ndetermining, by the computing system, a second absolute pose for the second autonomous vehicle based, at least in part, on the first relative pose, the second relative pose, and the third relative pose; \ndetermining, by the computing system, a third absolute pose for the third autonomous vehicle based, at least in part, on the first relative pose, the second relative pose, and the third relative pose; and \ngenerating, by the computing system, the updated intermediate environmental representation based, at least in part, on the first absolute pose, the second absolute pose, and the third absolute pose. \n | 8. The computer-implemented method of claim 7, wherein the first absolute pose is indicative of one or more first updated spatial coordinates and a first updated orientation for the first autonomous vehicle, the second absolute pose is indicative of one or more second updated spatial coordinates and a second updated orientation for the second autonomous vehicle, and the third absolute pose is indicative of one or more third updated spatial coordinates and a third updated orientation for the third autonomous vehicle.\n | 9. 
The computer-implemented method of claim 8, wherein generating the updated intermediate environmental representation comprises: \ngenerating, by the computing system using a machine-learned aggregation model, a second transformed intermediate environmental representation by transforming the second intermediate environmental representation based, at least in part, on the one or more second updated spatial coordinates and the second updated orientation for the second autonomous vehicle; \ngenerating, by the computing system using the machine-learned aggregation model, a third transformed intermediate environmental representation by transforming the third intermediate environmental representation based, at least in part, on the one or more third updated spatial coordinates and the third updated orientation for the third autonomous vehicle; and \ngenerating, by the computing system using the machine-learned aggregation model, the updated intermediate environmental representation based, at least in part, on the first intermediate environmental representation, the second transformed intermediate environmental representation, and the third transformed intermediate environmental representation. \n | 10. The computer-implemented method of claim 9, wherein generating the updated intermediate environmental representation further comprises: \nassigning, by the computing system using the machine-learned aggregation model, a second weight to the second transformed intermediate environmental representation; \nassigning, by the computing system using the machine-learned aggregation model, a third weight to the third transformed intermediate environmental representation; and \ngenerating, by the computing system using the machine-learned aggregation model, the updated intermediate environmental representation based, at least in part, on the second weight and the third weight. \n | 11.
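The weighting and fusion steps of claims 9 and 10 amount to combining the (already transformed) received feature maps with the ego map using per-sender weights. A hedged numpy sketch, assuming the maps have been warped into the ego frame and that the learned attention scores are given as plain floats; the softmax-and-sum form is an assumption, not the patent's actual machine-learned aggregation model:

```python
# Sketch of weighted aggregation of transformed intermediate representations.
import numpy as np

def aggregate(ego: np.ndarray, received: list[np.ndarray],
              scores: list[float]) -> np.ndarray:
    """Fuse the ego feature map with received maps; `scores` stand in for the
    learned attention scores reflecting each sender's pose accuracy."""
    w = np.exp(scores) / np.exp(scores).sum()    # softmax over senders
    fused = ego.copy()
    for weight, feat in zip(w, received):
        fused += weight * feat                   # maps assumed already transformed
    return fused
```

With two senders given equal scores, each received map contributes half its features on top of the ego map.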
A computing system comprising: \none or more processors; and \none or more non-transitory computer-readable media that collectively store instructions that, when executed by the one or more processors, cause the system to perform operations, the operations comprising: \nobtaining sensor data associated with an environment of a first autonomous vehicle; \nobtaining estimated location data indicative of a first estimated pose of the first autonomous vehicle; \ndetermining, via a first portion of a machine-learned detection and prediction model, a first intermediate environmental representation of at least a first portion of the environment of the first autonomous vehicle based, at least in part, on the sensor data; \nobtaining a first message from a second autonomous vehicle, wherein the first message comprises a second intermediate environmental representation of at least a second portion of the environment of the first autonomous vehicle and second estimated location data indicative of a second estimated pose of the second autonomous vehicle; \ndetermining, via a machine-learned regression model, a first relative pose between the first autonomous vehicle and the second autonomous vehicle based, at least in part, on the first intermediate environmental representation and the second intermediate environmental representation; and \ngenerating, via a second portion of the machine-learned detection and prediction model, an autonomy output for the first autonomous vehicle based, at least in part, on the first intermediate environmental representation, the second intermediate environmental representation, and the first relative pose. \n | 12.
The computing system of claim 11, wherein the operations further comprise: \ngenerating, via the second portion of the machine-learned detection and prediction model, an updated intermediate environmental representation based, at least in part, on the first intermediate environmental representation, the second intermediate environmental representation, and the first relative pose; and \ngenerating, via the second portion of the machine-learned detection and prediction model, the autonomy output for the first autonomous vehicle based, at least in part, on the updated intermediate environmental representation. \n | 13. The computing system of claim 12, wherein generating the updated intermediate environmental representation comprises: \nobtaining a second message from a third autonomous vehicle, wherein the second message comprises a third intermediate environmental representation of at least a third portion of the environment of the first autonomous vehicle and third estimated location data indicative of a third estimated pose of the third autonomous vehicle; \ndetermining, via the machine-learned regression model, a second relative pose between the first autonomous vehicle and the third autonomous vehicle based, at least in part, on the first intermediate environmental representation and the third intermediate environmental representation; and \ngenerating, via the second portion of the machine-learned detection and prediction model, the updated intermediate environmental representation based, at least in part, on the third intermediate environmental representation and the second relative pose. \n | 14. The computing system of claim 13, wherein the second portion of the machine-learned detection and prediction model comprises a machine-learned attention model configured to weigh a plurality of intermediate environmental representations.\n | 15. 
The computing system of claim 14, wherein generating the updated intermediate environmental representation comprises: \ndetermining, via the machine-learned attention model, a first weight for the second intermediate environmental representation; \ndetermining, via the machine-learned attention model, a second weight for the third intermediate environmental representation; and \ngenerating, via the second portion of the machine-learned detection and prediction model, the updated intermediate environmental representation based, at least in part, on the first weight and the second weight. \n | 16. The computing system of claim 15, wherein the first weight is indicative of a first predicted accuracy of the second estimated pose, and wherein the second weight is indicative of a second predicted accuracy of the third estimated pose.\n | 17. The computing system of claim 11, wherein the machine-learned detection and prediction model and the machine-learned regression model are trained end-to-end via backpropagation.\n | 18. 
An autonomous vehicle comprising: \none or more sensors; \none or more processors; and \none or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the one or more processors to perform operations, the operations comprising: \nobtaining, via the one or more sensors, sensor data associated with a surrounding environment of the autonomous vehicle; \ndetermining a first intermediate environmental representation of at least a first portion of the surrounding environment of the autonomous vehicle based, at least in part, on the sensor data; \nobtaining a plurality of messages from a plurality of respective autonomous vehicles, wherein each respective message of the plurality of messages comprises a respective intermediate environmental representation of at least another portion of the surrounding environment associated with a respective autonomous vehicle of the plurality of respective autonomous vehicles; \ndetermining a plurality of relative poses based, at least in part, on the first intermediate environmental representation and the respective intermediate environmental representation, the plurality of relative poses comprising a respective relative pose between the autonomous vehicle and each of the plurality of respective autonomous vehicles; \ndetermining a plurality of absolute poses based, at least in part, on the plurality of relative poses, the plurality of absolute poses comprising a respective absolute pose for the autonomous vehicle and each of the plurality of respective autonomous vehicles; \ngenerating an updated intermediate environmental representation based, at least in part, on the first intermediate environmental representation, the respective intermediate environmental representation, and at least one of the plurality of absolute poses; and \ngenerating an autonomy output for the autonomous vehicle based, at least in part, on the updated intermediate 
environmental representation. \n | 19. The autonomous vehicle of claim 18, wherein the sensor data comprises three-dimensional data representative of the surrounding environment of the autonomous vehicle.\n | 20. The autonomous vehicle of claim 18, wherein the autonomy output comprises one or more bounding boxes indicative of one or more objects within the surrounding environment of the autonomous vehicle.", "output": "The method involves obtaining (702) sensor data associated with an environment of an autonomous vehicle. Estimated location data indicative of an estimated pose of the autonomous vehicle is obtained (704). An intermediate environmental representation of a portion of the environment of the vehicle is determined (706) based on the sensor data. A message is obtained (708) from another autonomous vehicle, where the message comprises another intermediate representation. A relative pose between the autonomous vehicles is determined (712). An updated intermediate representation is generated (716). An autonomy output is generated (718) for the first autonomous vehicle based on the updated representation by a computing system. The intermediate representations are feature maps encoded with features representative of the portions of the environment. INDEPENDENT CLAIMS are included for: (1) a computing system comprising multiple processors; (2) an autonomous vehicle comprising multiple sensors. Method for mitigating vehicle pose errors across an aggregated feature map used for performing autonomous vehicle operations by a computing system. Uses include but are not limited to a laptop computer, a tablet computer, an ultrabook, a smartphone, a personal digital assistant (PDA), and a wearable device. 
The autonomous vehicle technology improves the ability of an autonomous vehicle to effectively provide vehicle services to others and support the various members of the community in which the autonomous vehicle is operating, including persons with reduced mobility and/or persons that are underserved by other transportation options. The drawing shows a flow chart of the method. 702 Obtaining sensor data associated with an environment of a first autonomous vehicle 704 Obtaining estimated location data indicative of an estimated pose of the autonomous vehicle 706 Determining intermediate environmental representation of a portion of the environment of the vehicle 708 Obtaining message from another autonomous vehicle 712 Determining relative pose between the autonomous vehicles 716 Generating updated intermediate representation 718 Generating autonomy output for first autonomous vehicle
The method includes determining, by the computing system, a first time-corrected intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation. The first time-corrected intermediate environmental representation corresponds to a second time associated with the second autonomous vehicle.What is claimed is: \n | 1. A computer-implemented method for vehicle-to-vehicle communications, the method comprising: \nobtaining from a first autonomous vehicle, by a computing system comprising one or more computing devices onboard a second autonomous vehicle, a first compressed intermediate environmental representation, \nwherein the first compressed intermediate environmental representation is indicative of at least a portion of an environment of the second autonomous vehicle and is based at least in part on sensor data acquired by the first autonomous vehicle at a first time; \ngenerating, by the computing system, a first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation; and \ndetermining, by the computing system, a first time-corrected intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation, wherein the first time-corrected intermediate environmental representation is indicative of a time difference between the first time and a second time at which the second autonomous vehicle obtains sensor data of the environment to account for the time difference. \n | 2. 
The computer-implemented method of claim 1, wherein determining the first time-corrected intermediate environmental representation comprises: \ngenerating, by the computing system, the first time-corrected intermediate environmental representation based at least in part on a machine-learned time correction model, \nwherein the machine-learned time correction model is configured to adjust the first decompressed intermediate environmental representation to account for the time difference between the first time and the second time. \n | 3. The computer-implemented method of claim 2, wherein the machine-learned time correction model comprises a neural network.\n | 4. The computer-implemented method of claim 1, wherein the first time is associated with a sensor timestamp of the first autonomous vehicle.\n | 5. The computer-implemented method of claim 1, wherein the second time is indicative of a time at which the second autonomous vehicle obtains the sensor data through one or more sensors of the second autonomous vehicle.\n | 6. The computer-implemented method of claim 1, further comprising: \ndetermining, by the computing system, an updated intermediate environmental representation based at least in part on the first time-corrected intermediate environmental representation. \n | 7. The computer-implemented method of claim 6, further comprising: \ndetermining, by the computing system, an autonomy output based at least in part on the updated intermediate environmental representation. \n | 8. The computer-implemented method of claim 7, further comprising: \ngenerating, by the computing system, a motion plan for the second autonomous vehicle based at least in part on the autonomy output. \n | 9. The computer-implemented method of claim 8, further comprising: \ninitiating, by the computing system, a motion control of the second autonomous vehicle based at least in part on the motion plan. \n | 10. 
A computing system comprising: \na machine-learned time correction model configured to compensate for time differences between a plurality of times; \none or more processors; and \none or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the computing system to perform operations, the operations comprising: \nobtaining a first compressed intermediate environmental representation from a first autonomous vehicle, wherein the first compressed intermediate environmental representation is based at least in part on sensor data acquired by the first autonomous vehicle at a first time; \ngenerating a first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation; \ndetermining, using the machine-learned time correction model, a first time-corrected intermediate environmental representation indicative of a time difference between the first time and a second time based at least in part on the first decompressed intermediate environmental representation, wherein the first time-corrected intermediate environmental representation is adjusted based at least in part on the time difference associated with a second autonomous vehicle; and \ngenerating an updated intermediate environmental representation based at least in part on the first time-corrected intermediate environmental representation. \n | 11. The computing system of claim 10, wherein the first time is associated with a sensor timestamp of the first autonomous vehicle.\n | 12. The computing system of claim 10, wherein the first time and the second time are based on global positioning system data.\n | 13. 
The computing system of claim 10, wherein generating the updated intermediate environmental representation based at least in part on the first time-corrected intermediate environmental representation comprises: \naggregating, using a machine-learned aggregation model, the first time-corrected intermediate environmental representation and a second intermediate environmental representation generated by the second autonomous vehicle. \n | 14. The computing system of claim 13, wherein the machine-learned aggregation model is a graph neural network comprising a plurality of nodes.\n | 15. The computing system of claim 14, wherein the machine-learned aggregation model is configured to initialize a node state of at least one node based at least in part on the time difference between the first time and the second time.\n | 16. The computing system of claim 10, wherein the operations further comprise: \ngenerating a motion plan based at least in part on the updated intermediate environmental representation; and \ninitiating a motion control of the second autonomous vehicle based at least in part on the motion plan. \n | 17. 
An autonomous vehicle comprising: \none or more processors; and \none or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the one or more processors to perform operations, the operations comprising: \nobtaining a first compressed intermediate environmental representation from another autonomous vehicle, wherein the first compressed intermediate environmental representation is based at least in part on sensor data acquired by the other autonomous vehicle at a first time; \ngenerating a first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation; \ndetermining a first time-corrected intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation and one or more machine-learned models, wherein the first time-corrected intermediate environmental representation is indicative of a time difference between the first time and a second time and, wherein the first time-corrected intermediate environmental representation is adjusted based at least in part on the time difference associated with the autonomous vehicle; and \nperforming one or more autonomy operations of the autonomous vehicle based at least in part on the first time-corrected intermediate environmental representation. \n | 18. The autonomous vehicle of claim 17, wherein performing the one or more autonomy operations of the autonomous vehicle based at least in part on the first time-corrected intermediate environmental representation comprises: \ngenerating an autonomy output based at least in part on the first time-corrected intermediate environmental representation. \n | 19. 
The autonomous vehicle of claim 18, wherein generating the autonomy output based at least in part on the first time-corrected intermediate environmental representation comprises: \ngenerating an updated intermediate environmental representation based at least in part on the first time-corrected intermediate environmental representation; and \ngenerating the autonomy output based at least in part on the updated intermediate environmental representation. \n | 20. The autonomous vehicle of claim 18, wherein the autonomy output is indicative of perception data and prediction data associated with the autonomous vehicle.", "output": "The computer-based method involves obtaining, from a first autonomous vehicle, a first compressed intermediate environmental representation by a computing system comprising one or more computing devices onboard a second autonomous vehicle. The first compressed intermediate environmental representation is indicative of a portion of an environment of the second autonomous vehicle and is based on sensor data acquired by the first autonomous vehicle at a first time. A first decompressed intermediate environmental representation is generated by decompressing the first compressed intermediate environmental representation. A first time-corrected intermediate environmental representation is determined based on the first decompressed intermediate environmental representation by the computing system. The first time-corrected intermediate environmental representation corresponds to a second time associated with the second autonomous vehicle. INDEPENDENT CLAIMS are included for the following: 1. a computing system for vehicle-to-vehicle communication 2. an autonomous vehicle for vehicle-to-vehicle communication Computer-based method for vehicle-to-vehicle communication. The autonomous vehicle technology helps to improve the safety of passengers of an autonomous vehicle, the safety of the surroundings of the autonomous vehicle, and the experience of the rider and operator of the autonomous vehicle.
The compressed intermediate environmental representation reduces bandwidth requirements without sacrificing performance. The joint perception or prediction system is configured to perform the functions of the perception system and the prediction system in a coordinated manner for improved speed, efficiency and on-board computational resource cost.", "instruction": "Please summarize the input" }, { "input": "Systems and methods for vehicle-to-vehicle communications for improved autonomous vehicle operationsSystems and methods for vehicle-to-vehicle communications are provided. An example computer-implemented method includes obtaining, by a computing system onboard a first autonomous vehicle, sensor data associated with an environment of the first autonomous vehicle. The method includes determining, by the computing system, an intermediate environmental representation of at least a portion of the environment of the first autonomous vehicle based at least in part on the sensor data. The method includes generating, by the computing system, a compressed intermediate environmental representation by compressing the intermediate environmental representation of at least the portion of the environment of the first autonomous vehicle. The method includes communicating, by the computing system, the compressed intermediate environmental representation to a second autonomous vehicle.What is claimed is: \n | 1. 
A computer-implemented method for vehicle-to-vehicle communications, the method comprising: \nobtaining, by a first autonomous vehicle, sensor data associated with an environment of the first autonomous vehicle; \ndetermining an intermediate environmental representation of at least a portion of the environment of the first autonomous vehicle based at least in part on the sensor data; \ngenerating a compressed intermediate environmental representation by compressing the intermediate environmental representation of at least the portion of the environment of the first autonomous vehicle; \nselecting a second autonomous vehicle to which to communicate the compressed intermediate environmental representation from among a plurality of autonomous vehicles based at least in part on an ability of the second autonomous vehicle to decompress the compressed intermediate environmental representation, wherein selecting the second autonomous vehicle comprises: \ncommunicating with the second autonomous vehicle as the second autonomous vehicle enters a communication range of the first autonomous vehicle; \nexchanging, with the second autonomous vehicle, data indicating that the second autonomous vehicle is able to decompress the compressed intermediate environmental representation; and \nselecting the second autonomous vehicle based on the data indicating that the second autonomous vehicle is able to decompress the compressed intermediate environmental representation; and \ncommunicating the compressed intermediate environmental representation to the second autonomous vehicle. \n | 2. 
The computer-implemented method of claim 1, wherein the sensor data comprises three-dimensional point cloud data, and wherein determining the intermediate environmental representation comprises: \ngenerating voxelized sensor data by voxelizing the three-dimensional point cloud data; \ninputting the voxelized sensor data into a machine-learned model, the machine-learned model configured to apply one or more convolutional layers to the voxelized sensor data; and \nobtaining the intermediate environmental representation as an output of the machine-learned model. \n | 3. The computer-implemented method of claim 1, wherein the intermediate environmental representation comprises a feature map describing at least the portion of the environment of the first autonomous vehicle.\n | 4. The computer-implemented method of claim 1, further comprising: \nselecting the second autonomous vehicle to which to communicate the compressed intermediate environmental representation from among a plurality of autonomous vehicles. \n | 5. The computer-implemented method of claim 4, wherein selecting the second autonomous vehicle to which to communicate the compressed intermediate environmental representation from among the plurality of autonomous vehicles comprises: \nselecting, by the computing system, the second autonomous vehicle based at least in part on the communication range of the first autonomous vehicle. \n | 6. The computer-implemented method of claim 1, wherein the sensor data comprises LIDAR point cloud data.\n | 7. The computer-implemented method of claim 1, wherein the sensor data comprises a first type of sensor data and a second type of sensor data, wherein the first type of sensor data is associated with a first sensor modality, and the second type of sensor data is associated with a second sensor modality.\n | 8. 
The computer-implemented method of claim 1, wherein the sensor data comprises a first set of sensor data acquired by the first autonomous vehicle and a second set of sensor data acquired by another autonomous vehicle.\n | 9. A computing system comprising: \none or more processors; and \none or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the computing system to perform operations, the operations comprising: \nobtaining sensor data associated with an environment of a first autonomous vehicle; \ndetermining an intermediate environmental representation of at least a portion of the environment of the first autonomous vehicle based at least in part on the sensor data and a machine-learned model; \ngenerating a compressed intermediate environmental representation by compressing the intermediate environmental representation of at least the portion of the environment of the first autonomous vehicle; \nselecting a second autonomous vehicle to which to communicate the compressed intermediate environmental representation from among a plurality of autonomous vehicles based at least in part on an ability of the second autonomous vehicle to decompress the compressed intermediate environmental representation, wherein selecting the second autonomous vehicle comprises: \ncommunicating with the second autonomous vehicle as the second autonomous vehicle enters a communication range of the first autonomous vehicle; \nexchanging, with the second autonomous vehicle, data indicating that the second autonomous vehicle is able to decompress the compressed intermediate environmental representation; and \nselecting the second autonomous vehicle based on the data indicating that the second autonomous vehicle is able to decompress the compressed intermediate environmental representation; and \ncommunicating the compressed intermediate environmental representation to the second autonomous 
vehicle. \n | 10. The computing system of claim 9, wherein the sensor data comprises three-dimensional LIDAR point cloud data.\n | 11. The computing system of claim 9, wherein determining the intermediate environmental representation comprises: \ngenerating voxelized sensor data based at least in part on the sensor data; and \ngenerating the intermediate environmental representation based at least in part on the voxelized sensor data and the machine-learned model. \n | 12. The computing system of claim 9, wherein the second autonomous vehicle is configured to decompress the compressed intermediate environmental representation and utilize the intermediate environmental representation for one or more autonomous operations of the second autonomous vehicle.\n | 13. The computing system of claim 9, wherein the operations further comprise: \nobtaining, from another autonomous vehicle, a second intermediate environmental representation of at least the portion of the environment of the first autonomous vehicle. \n | 14. 
An autonomous vehicle comprising: \none or more sensors; \none or more processors; and \none or more tangible, non-transitory, computer readable media that collectively store instructions that when executed by the one or more processors cause the one or more processors to perform operations, the operations comprising: \nobtaining, via the one or more sensors, sensor data associated with an environment of the autonomous vehicle; \ndetermining a first intermediate environmental representation of at least a portion of the environment of the autonomous vehicle based at least in part on the sensor data; \ngenerating a first compressed intermediate environmental representation by compressing the first intermediate environmental representation of at least the portion of the environment of the autonomous vehicle; \ndetermining a recipient to which to communicate the first compressed intermediate environmental representation from among a plurality of potential recipients based at least in part on an ability of the recipient to decompress the compressed intermediate environmental representation, wherein determining the recipient comprises: \nexchanging, with the recipient, data indicating that the recipient is able to decompress the compressed intermediate environmental representation; and \nselecting the recipient based on the data indicating that the recipient is able to decompress the compressed intermediate environmental representation; \ncommunicating the first compressed intermediate environmental representation to the recipient. \n | 15. 
The autonomous vehicle of claim 14, wherein determining the first intermediate environmental representation based at least in part on the sensor data comprises: \ngenerating voxelized sensor data by voxelizing three-dimensional point cloud data of the sensor data; \ninputting the voxelized sensor data into a machine-learned model; and \nreceiving the first intermediate environmental representation as an output of the machine-learned model. \n | 16. The autonomous vehicle of claim 15, wherein the machine-learned model is configured to apply one or more convolutional layers to the voxelized sensor data.\n | 17. The autonomous vehicle of claim 14, wherein the operations further comprise: \nobtaining a second compressed intermediate environmental representation from another autonomous vehicle; \ngenerating a decompressed intermediate environmental representation by decompressing the second compressed intermediate environmental representation; \ndetermining, using one or more machine-learned models, an updated intermediate environmental representation based at least in part on the decompressed intermediate environmental representation and the first intermediate environmental representation generated by the autonomous vehicle; and \ngenerating an autonomy output for the autonomous vehicle based at least in part on the updated intermediate environmental representation. \n | 18. The autonomous vehicle of claim 17, wherein the operations further comprise compensating for a time delay between the first intermediate environmental representation and the decompressed intermediate environmental representation.", "output": "The method (700) involves obtaining (702) sensor data associated with an environment of a first autonomous vehicle by a computing system comprising multiple computing devices onboard the first autonomous vehicle. 
An intermediate environmental representation of a portion of the environment of the first autonomous vehicle is determined (704) based on the sensor data by the computing system. A compressed intermediate environmental representation is generated (706) by compressing the intermediate environmental representation of at least the portion of the environment of the first autonomous vehicle by the computing system. The compressed intermediate environmental representation is communicated (710) to a second autonomous vehicle. INDEPENDENT CLAIMS are also included for the following: (a) a computing system for performing autonomous vehicle operations in vehicle-to-vehicle communication; (b) an autonomous vehicle for vehicle-to-vehicle communications. Method for performing autonomous vehicle operations in vehicle-to-vehicle communications. Method ensures more accurate estimates of the object's position, size, and shape, as well as the predicted future trajectory of the object, and improves the ability of the autonomous vehicle to safely plan its motion through its environment. 
The drawing shows a flow chart of a method for performing autonomous vehicle operations in vehicle-to-vehicle communications. 700 Method 702 Obtaining sensor data associated with an environment of a first autonomous vehicle by a computing system comprising multiple computing devices onboard the first autonomous vehicle 704 Determining an intermediate environmental representation of a portion of the environment of the first autonomous vehicle based on the sensor data by the computing system 706 Generating the compressed intermediate environmental representation by compressing the intermediate environmental representation of at least the portion of the environment of the first autonomous vehicle by the computing system 710 Communicating the compressed intermediate environmental representation to a second autonomous vehicle ", "instruction": "Please summarize the input" }, { "input": "Systems and methods for vehicle-to-vehicle communications for improved autonomous vehicle operationsSystems and methods for vehicle-to-vehicle communications are provided. An example computer-implemented method includes obtaining from a first autonomous vehicle, by a second autonomous vehicle, a first compressed intermediate environmental representation. The first compressed intermediate environmental representation is indicative of at least a portion of an environment of the second autonomous vehicle. The method includes generating a first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation. The method includes determining, using one or more machine-learned models, an updated intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation and a second intermediate environmental representation generated by the second autonomous vehicle. 
The method includes generating an autonomy output for the second autonomous vehicle based at least in part on the updated intermediate environmental representation.What is claimed is: \n | 1. A computer-implemented method for vehicle-to-vehicle communications, the method comprising: \nobtaining from a first autonomous vehicle, by a computing system comprising one or more computing devices onboard a second autonomous vehicle, a first compressed intermediate environmental representation output by one or more intermediate layers of a machine-learned perception model of the first autonomous vehicle, wherein the first compressed intermediate environmental representation is indicative of at least a portion of an environment of the second autonomous vehicle; \ngenerating, by the computing system, a first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation; \ndetermining, by the computing system using one or more machine-learned models onboard the second autonomous vehicle, an updated intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation and a second intermediate environmental representation output by one or more intermediate layers of a machine-learned perception model of the second autonomous vehicle, \nwherein the one or more machine-learned models used to determine the updated intermediate environmental representation comprise a machine-learned aggregation model configured to aggregate a plurality of intermediate environmental representations, \nwherein the machine-learned aggregation model comprises a graph neural network, and wherein the graph neural network comprises a plurality of nodes, each respective node of the graph neural network corresponding to a respective autonomous vehicle of a plurality of autonomous vehicles within the environment of the second autonomous vehicle, wherein the plurality of 
autonomous vehicles comprises the first autonomous vehicle, \nwherein at least one node of the machine-learned aggregation model is configured to be updated based on a change to the plurality of autonomous vehicles that are within the environment of the second autonomous vehicle; and \ngenerating, by the computing system, an autonomy output for the second autonomous vehicle based at least in part on the updated intermediate environmental representation. \n | 2. The computer-implemented method of claim 1, wherein the machine-learned aggregation model is configured based on a number of transmitter vehicles within the environment of the second autonomous vehicle.\n | 3. The computer-implemented method of claim 1, wherein each respective autonomous vehicle of the plurality of autonomous vehicles is associated with a respective set of spatial coordinates, and wherein the machine-learned aggregation model is configured to transform the first decompressed intermediate environmental representation based at least in part on a set of spatial coordinates associated with the first autonomous vehicle.\n | 4. 
The computer-implemented method of claim 1, wherein determining, onboard the second autonomous vehicle, the updated intermediate environmental representation comprises: \ndetermining, by a time delay correction model, a first time-corrected intermediate environmental representation, wherein the first time-corrected intermediate environmental representation comprises a first time associated with a sensor timestamp of the first autonomous vehicle; \nobtaining, by the time delay correction model, a second time indicative of a time at which the second autonomous vehicle intends to perceive the environment; \ngenerating an updated time-corrected intermediate environmental representation, \nwherein the updated time-corrected intermediate environmental representation accounts for a time delay associated with the first time-corrected intermediate environmental representation obtained from the first autonomous vehicle and the second time at which the second autonomous vehicle intends to perceive the environment; and \ndetermining, by the computing system, the updated intermediate environmental representation based at least in part on the updated time-corrected intermediate environmental representation. \n | 5. The computer-implemented method of claim 4, wherein determining the updated intermediate environmental representation further comprises: \ndetermining, by the computing system, the updated intermediate environmental representation based at least in part on the first time-corrected intermediate environmental representation, the second intermediate environmental representation generated by the second autonomous vehicle, and the machine-learned aggregation model. \n | 6. 
The computer-implemented method of claim 5, wherein the machine-learned aggregation model is configured to aggregate the first time-corrected intermediate environmental representation and the second intermediate environmental representation and provide the updated intermediate environmental representation as an output of the machine-learned aggregation model.\n | 7. The computer-implemented method of claim 1, wherein generating the autonomy output for the second autonomous vehicle based at least in part on the updated intermediate environmental representation comprises: \ninputting, by the computing system, the updated intermediate environmental representation into the machine-learned perception model of the second autonomous vehicle; and \nobtaining, by the computing system, the autonomy output as an output of the machine-learned perception model of the second autonomous vehicle. \n | 8. The computer-implemented method of claim 1, wherein the autonomy output is indicative of a bounding shape associated with an object within the environment of the second autonomous vehicle and one or more predicted future locations of the object.\n | 9. The computer-implemented method of claim 1, wherein the object is occluded from a field of view of one or more sensors of the second autonomous vehicle.\n | 10. The computer-implemented method of claim 1, further comprising: \ngenerating, by the computing system, a motion plan for the second autonomous vehicle based at least in part on the autonomy output; and \ninitiating, by the computing system, a motion control of the second autonomous vehicle based at least in part on the motion plan. \n | 11. 
An autonomous vehicle computing system comprising: \na machine-learned aggregation model configured to aggregate a plurality of intermediate environmental representations from a first autonomous vehicle and a second autonomous vehicle, the autonomous vehicle computing system being onboard the second autonomous vehicle; \na machine-learned perception model configured to generate autonomy outputs; \none or more processors; and \none or more tangible, non-transitory, computer readable media that store instructions that are executable by the one or more processors to cause the computing system to perform operations, the operations comprising: \nobtaining a first compressed intermediate environmental representation output by one or more intermediate layers of a machine-learned perception model of the first autonomous vehicle, wherein the first compressed intermediate environmental representation is indicative of at least a portion of an environment of the second autonomous vehicle; \ngenerating a first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation; \nobtaining a second intermediate environmental representation output by one or more intermediate layers of a machine-learned perception model of the second autonomous vehicle; \ndetermining, using the machine-learned aggregation model, an updated intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation and the second intermediate environmental representation, \nwherein the machine-learned aggregation model comprises a graph neural network, the graph neural network comprising a plurality of nodes each corresponding to a respective autonomous vehicle of a plurality of autonomous vehicles within the environment of the second autonomous vehicle, wherein the plurality of autonomous vehicles comprises the first autonomous vehicle, \nwherein at least one node of the 
machine-learned aggregation model is configured to be updated based on a change to the plurality of autonomous vehicles that are within the environment of the second autonomous vehicle; and \ngenerating an autonomy output for the second autonomous vehicle based at least in part on the updated intermediate environmental representation. \n | 12. The autonomous vehicle computing system of claim 11, wherein determining the updated intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation and the second intermediate environmental representation comprises: \ninputting the first decompressed intermediate environmental representation and the second intermediate environmental representation into the machine-learned aggregation model, \nwherein the machine-learned aggregation model is configured to aggregate the first decompressed intermediate environmental representation and the second intermediate environmental representation to generate the updated intermediate environmental representation; and \nobtaining the updated intermediate environmental representation as an output of the machine-learned aggregation model. \n | 13. The autonomous vehicle computing system of claim 11, wherein the machine-learned aggregation model is configured to initialize a node state of at least one node of the graph neural network and to update the node state of the at least one node based at least in part on a spatial transformation.\n | 14. The autonomous vehicle computing system of claim 11, further comprising a time delay correction model configured to adjust the first decompressed intermediate environmental representation to account for a time delay.\n | 15. The autonomous vehicle computing system of claim 14, wherein the first decompressed intermediate environmental representation is adjusted to account for the time delay using the time delay correction model.\n | 16. 
The autonomous vehicle computing system of claim 11, wherein obtaining the second intermediate environmental representation generated by the second autonomous vehicle comprises: obtaining sensor data via one or more sensors of the second autonomous vehicle; and determining the second intermediate environmental representation based at least in part on the sensor data obtained via the one or more sensors of the second autonomous vehicle.\n | 17. An autonomous vehicle comprising: \none or more processors; and \none or more tangible, non-transitory, computer readable media that store instructions that are executable by the one or more processors to cause the one or more processors to perform operations, the operations comprising: \nobtaining a first compressed intermediate environmental representation output by one or more intermediate layers of a machine-learned perception model of another autonomous vehicle, wherein the first compressed intermediate environmental representation is indicative of at least a portion of an environment of the autonomous vehicle; \ngenerating a first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation; \ngenerating a second intermediate environmental representation output by one or more intermediate layers of a machine-learned perception model of the autonomous vehicle; \ndetermining, using a machine-learned aggregation model onboard the autonomous vehicle, an updated intermediate environmental representation based at least in part on the first decompressed intermediate environmental representation and the second intermediate environmental representation, \nwherein the machine-learned aggregation model comprises a graph neural network, and wherein the graph neural network comprises a plurality of nodes, each respective node of the graph neural network corresponding to a respective autonomous vehicle of a plurality of autonomous vehicles within the 
environment of the autonomous vehicle, wherein the plurality of autonomous vehicles comprises the other autonomous vehicle, \nwherein at least one node of the machine-learned aggregation model is configured to be updated based on a change to the plurality of autonomous vehicles that are within the environment of the autonomous vehicle; and \ngenerating an autonomy output for the autonomous vehicle based at least in part on the updated intermediate environmental representation, wherein the autonomy output is indicative of an object within the environment of the autonomous vehicle and one or more predicted future locations of the object; and \ngenerating a motion plan for the autonomous vehicle based at least in part on the autonomy output. \n | 18. The autonomous vehicle of claim 17, wherein the object is occluded from a field of view of one or more sensors of the autonomous vehicle, and wherein the motion plan comprises a trajectory for the autonomous vehicle.", "output": "The method involves obtaining a first compressed intermediate environmental representation from a first autonomous vehicle by a computing system. A first decompressed intermediate environmental representation by decompressing the first compressed intermediate environmental representation is generated by the computing system. An updated intermediate environmental representation is determined based on the first decompressed intermediate environmental representation and a second intermediate environmental representation generated by a second autonomous vehicle through the computing system using one or more machine-learned models. An autonomy output is generated for the second autonomous vehicle based on the updated intermediate environmental representation by the computing system. An INDEPENDENT CLAIM is included for a system for performing operations in an autonomous vehicle by utilizing machine-learned models for vehicle-to-vehicle communications. 
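Claims 1, 11, and 17 above all recite a graph neural network whose nodes correspond to the autonomous vehicles in the ego vehicle's environment, with at least one node updated when that set of vehicles changes. A toy sketch of that node bookkeeping, with a plain elementwise mean as an assumed stand-in for the learned message passing (the class and method names are invented for illustration):

```python
class AggregationGraph:
    """One node per nearby vehicle; membership tracks the ego's environment."""

    def __init__(self):
        self.nodes = {}  # vehicle id -> intermediate environmental representation

    def update_membership(self, vehicle_id, ier=None):
        """Add or refresh a node, or drop it when a vehicle leaves range."""
        if ier is None:
            self.nodes.pop(vehicle_id, None)
        else:
            self.nodes[vehicle_id] = ier

    def aggregate(self):
        """One message-passing round: elementwise mean over all current nodes."""
        reps = list(self.nodes.values())
        return [sum(col) / len(reps) for col in zip(*reps)]

g = AggregationGraph()
g.update_membership("ego", [1.0, 3.0])
g.update_membership("lead", [3.0, 1.0])
print(g.aggregate())         # [2.0, 2.0]
g.update_membership("lead")  # the lead vehicle leaves the ego's environment
print(g.aggregate())         # [1.0, 3.0]
```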
Method for performing operations in an autonomous vehicle (claimed) by utilizing machine-learned models for vehicle-to-vehicle communications. The method enables estimating an object's position, size, and shape, as well as a predicted future trajectory of the object accurately and improving the ability of the autonomous vehicle to safely plan its motion through its environment. The method enables allowing an ecosystem of autonomous vehicles or systems within a geographic area to provide inter-vehicle or system communications that improve the vehicles' or systems' autonomous operations, thus reducing communication bandwidth and potential information loss, and allowing for an effective and efficient sharing of vehicle information when intermediate environmental representations can be compressed without losing information. The method enables improving safety of passengers of an autonomous vehicle, safety of the surroundings of the autonomous vehicle, and experience of the rider or operator of the autonomous vehicle. The drawing shows a block diagram of a system for an autonomous vehicle.104 Operations Computing System 106Remote Computing Device 108 Communication Networks 112Vehicle Computing System 118Positioning System 
A method comprising: \nprocessing a signal received from a sensor associated with an autonomous vehicle, the processing outputting a prediction indicating a likelihood that an emergency vehicle is present in an environment proximate to the autonomous vehicle; \nreceiving, at the autonomous vehicle, a vehicle-to-vehicle communication message from another vehicle that contains a notification of an approaching emergency vehicle; \ngenerating a determination, based on the output prediction and the vehicle-to-vehicle communication message, that indicates whether an emergency vehicle is proximate to the autonomous vehicle; \nin response to the determination indicating the emergency vehicle is proximate, causing the autonomous vehicle to perform an action. \n | 2. The method of claim 1, further comprising: \nreceiving information from a remote computing device that indicates a path of an emergency vehicle; \nwherein the determination indicating whether the emergency vehicle is proximate to the autonomous vehicle is further generated based on the information received from the remote computing device. \n | 3. The method of claim 1, wherein the action comprises autonomously navigating out of a path of the emergency vehicle.\n | 4. The method of claim 1, wherein the action comprises transmitting a vehicle-to-vehicle message indicating the emergency vehicle's presence.\n | 5. The method of claim 1, wherein the action comprises uploading a notification of the emergency vehicle's presence to a remote computing device.\n | 6. The method of claim 1, wherein processing the signal received from the sensor further comprises predicting a direction from which the emergency vehicle is approaching the autonomous vehicle.\n | 7. The method of claim 6, wherein the action is selected based on the predicted direction.\n | 8. 
A non-transitory computer readable storage medium storing executable instructions, the instructions when executed by one or more processors causing the one or more processors to: \nobtain computer vision data captured by sensors associated with one or more vehicles during operation of the one or more vehicles; \ntrain a model using the obtained computer vision data, the model when trained configured to receive real-time computer vision data indicative of an environment of an autonomous vehicle and output a classification of a behavior of traffic in the environment as either indicative of a presence of an emergency vehicle or not indicative of a presence of an emergency vehicle; \nreceiving classifications output by the trained model during operation of a target vehicle and predictions indicating a likelihood that an emergency vehicle is proximate to the target vehicle, the predictions generated based on processing of a signal received from a sensor in the target vehicle; and \nretraining the model based on a difference between the classifications and the predictions. \n | 9. The non-transitory computer readable storage medium of claim 8, wherein the instructions further cause the one or more processors to: \ntrain a second model using the obtained computer vision data, the second model when trained configured to receive the real-time computer vision data indicative of the environment of the target vehicle and output a classification of vehicle lights detected in the environment as either indicative of a presence of an emergency vehicle or not indicative of a presence of an emergency vehicle. \n | 10. The non-transitory computer readable storage medium of claim 9, wherein the second model when trained further outputs a classification of a type of the emergency vehicle.\n | 11. 
The non-transitory computer readable storage medium of claim 8, wherein the instructions further cause the one or more processors to: \ntrain a third model using the obtained computer vision data, the third model when trained configured to receive the real-time computer vision data indicative of the environment of the target vehicle and output a classification of a direction an emergency vehicle is traveling relative to the target vehicle. \n | 12. An autonomous vehicle, comprising: \na plurality of sensors configured to measure parameters of an environment around the autonomous vehicle; \none or more processors; and \na non-transitory computer readable storage medium storing executable instructions, the instructions when executed by the one or more processors causing the one or more processors to: \nperform a plurality of types of analysis of signals output by the plurality of sensors, each type of analysis generating a prediction that indicates a likelihood of a proximity of an emergency vehicle to the autonomous vehicle based on one or more of the signals; \nprocess the generated predictions to generate a determination that indicates whether an emergency vehicle is proximate to the autonomous vehicle; and \nin response to the determination indicating the emergency vehicle is proximate, cause the autonomous vehicle to perform an action. \n | 13. The autonomous vehicle of claim 12, wherein the instructions further cause the one or more processors to: \nreceive information from a remote computing device that indicates a path of an emergency vehicle; \nwherein processing the generated predictions further comprises using the received information to generate the determination indicating whether the emergency vehicle is proximate to the autonomous vehicle. \n | 14. 
The autonomous vehicle of claim 12, wherein the plurality of types of analysis comprise a siren sound detection analysis, and wherein performing the siren sound detection analysis comprises: \nextracting one or more features from a sound signal received from a microphone in the autonomous vehicle; \ncomparing the extracted features to expected features of an emergency vehicle siren; and \nin response to identifying a match between the extracted features and the expected features of the emergency vehicle siren, outputting a prediction that the emergency vehicle is proximate to the autonomous vehicle. \n | 15. The autonomous vehicle of claim 12, wherein the plurality of types of analysis comprise an emergency vehicle light detection analysis, and wherein performing the emergency vehicle light detection analysis comprises: \nextracting lighting features from a perception data signal received from a computer vision system in the autonomous vehicle; \ncomparing the extracted lighting features to expected features of emergency vehicle lights; and \nin response to identifying a match between the extracted lighting features and the expected features of emergency vehicle lights, outputting a prediction that the emergency vehicle is proximate to the autonomous vehicle. \n | 16. The autonomous vehicle of claim 12, wherein the plurality of types of analysis comprise a traffic behavior analysis, and wherein performing the traffic behavior analysis comprises: \nproviding perception data received from a computer vision system in the autonomous vehicle to a machine learning model, the machine learning model configured to output a classification of traffic behaviors captured in the perception data as indicative of or not indicative of an approaching emergency vehicle. \n | 17. 
The autonomous vehicle of claim 12, wherein the plurality of types of analysis comprise processing a vehicle-to-vehicle message received from another vehicle to determine if the vehicle-to-vehicle message contains an indication that an emergency vehicle is approaching the autonomous vehicle.\n | 18. The autonomous vehicle of claim 12, wherein processing the generated predictions to generate the determination comprises: \ndetermining the emergency vehicle is proximate to the autonomous vehicle if a specified number of the types of analysis generated a prediction indicating an emergency vehicle is likely to be proximate to the autonomous vehicle. \n | 19. The autonomous vehicle of claim 12, wherein processing the generated predictions to generate the determination comprises: \ndetermining the emergency vehicle is proximate to the autonomous vehicle if: \na first type of analysis generated a first prediction indicating a likely proximity of the emergency vehicle; and \na second type of analysis generated a second prediction indicating a likely proximity of the emergency vehicle. \n | 20. The autonomous vehicle of claim 12, wherein processing the generated predictions to generate the determination comprises: \napplying a weighting function to the predictions generated by each of the plurality of types of analysis.", "output": "The method involves processing a signal received from a sensor (110) associated with an autonomous vehicle (100), where the processing outputs a prediction indicating a likelihood that an emergency vehicle is present in an environment proximate to the autonomous vehicle. A vehicle-to-vehicle communication message is received from another vehicle that contains a notification of an approaching emergency vehicle. 
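Claims 18 to 20 above describe combining the per-analysis predictions either by counting how many analyses fired or by applying a weighting function. A hedged sketch of the weighting-function variant, with the detector names, weights, and threshold all invented for illustration rather than taken from the patent:

```python
# Assumed per-detector weights; the patent does not specify values.
WEIGHTS = {"siren": 0.4, "lights": 0.3, "traffic": 0.2, "v2v": 0.1}

def fuse(predictions, threshold=0.5):
    """Weighted combination of per-detector likelihoods (per claim 20);
    returns True when an emergency vehicle is deemed proximate."""
    score = sum(WEIGHTS[name] * p for name, p in predictions.items())
    return score >= threshold

# Strong siren and light evidence crosses the threshold...
print(fuse({"siren": 0.9, "lights": 0.8, "traffic": 0.1, "v2v": 0.0}))  # True
# ...weak evidence from every detector does not.
print(fuse({"siren": 0.1, "lights": 0.2, "traffic": 0.3, "v2v": 0.0}))  # False
```

The vote-counting variant of claim 18 would instead threshold the number of detectors whose individual likelihood exceeds some cutoff.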
The autonomous vehicle is caused to perform an action in response to the determination indicating the emergency vehicle is proximate, where the determination is generated based on the output prediction and the vehicle-to-vehicle communication message and indicates whether the emergency vehicle is proximate to the autonomous vehicle. INDEPENDENT CLAIMS are also included for: a computer readable storage medium comprising a set of instructions for performing intelligent detection of an emergency vehicle; and an autonomous vehicle. Computer-implemented method for performing intelligent detection of an emergency vehicle, e.g. a police car, fire lorry or ambulance, by an autonomous vehicle on a public road. The method enables providing accurate and timely detection of emergency vehicles to enable the autonomous vehicles to observe applicable traffic laws. The drawing shows a block diagram of components of the autonomous vehicle.100Autonomous vehicle110Sensor120Sensor analysis module122Siren sound detection module124Emergency vehicle light detection module", "instruction": "Please summarize the input" }, { "input": "Direct autonomous vehicle to autonomous vehicle communicationsA system comprises a lead autonomous vehicle (AV), a control device associated with the lead AV, and a following AV. The control device receives a command to navigate the lead AV to avoid an unexpected road condition. The control device receives sensor data from a sensor of the lead AV, comprising location coordinates of objects ahead of the lead AV. The control device accesses environmental data associated with a portion of a road between the lead AV and following AV. The environmental data comprises location coordinates of objects between the lead AV and following AV. The control device determines whether an object in the sensor data or environmental data impedes performing the command by the following AV. 
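The impedance check the abstract describes might be sketched as a simple geometric test over the shared object coordinates; the waypoint path model, the coordinates, and the clearance value here are illustrative assumptions, not the patent's actual method:

```python
def object_impedes(path, objects, clearance=2.0):
    """Return True if any object's location coordinates fall within
    `clearance` (in the same units as the coordinates) of a waypoint
    on the following AV's commanded path."""
    return any(
        abs(wx - ox) <= clearance and abs(wy - oy) <= clearance
        for (wx, wy) in path
        for (ox, oy) in objects
    )

# Hypothetical lane-change waypoints for the following AV.
lane_change = [(0.0, 0.0), (10.0, 3.5), (20.0, 3.5)]
print(object_impedes(lane_change, [(10.5, 3.0)]))   # True: update the command
print(object_impedes(lane_change, [(50.0, -3.5)]))  # False: command stands
```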
The control device updates the command, if the control device determines that an object impedes performing the command by the following AV, and communicates the updated command to the following AV.What is claimed is: \n | 1. A system, comprising: \na lead autonomous vehicle comprising at least one vehicle sensor, wherein the lead autonomous vehicle is configured to travel along a road; \na following autonomous vehicle, different from the lead autonomous vehicle and communicatively coupled with the lead autonomous vehicle, wherein the following autonomous vehicle is traveling along the road behind the lead autonomous vehicle; \na first control device associated with the lead autonomous vehicle and comprising a first processor configured to: \nreceive a command to navigate the lead autonomous vehicle to avoid an unexpected road condition ahead of the lead autonomous vehicle; \nreceive, from the at least one vehicle sensor, sensor data comprising location coordinates of a first plurality of objects ahead of the lead autonomous vehicle; \naccess a first set of environmental data associated with a portion of the road between the lead autonomous vehicle and the following autonomous vehicle, wherein the first set of environmental data comprises location coordinates of a second plurality of objects between the lead autonomous vehicle and the following autonomous vehicle; \ndetermine whether at least one object from the first and second plurality of objects impedes performing the command by the following autonomous vehicle; \nin response to determining that the at least one object impedes performing the command by the following autonomous vehicle, update the command for the following autonomous vehicle, based at least in part upon the sensor data and the first set of environmental data, such that the updated command comprises one or more navigation instructions to avoid the at least one object while performing the command; and \ncommunicate the updated command to the 
following autonomous vehicle. \n | 2. The system of claim 1, wherein the unexpected road condition comprises at least one of an unexpected weather condition and an unexpected traffic congestion.\n | 3. The system of claim 1, wherein the first set of environmental data is associated with a time window during which the following autonomous vehicle is traveling along the road.\n | 4. The system of claim 1, further comprising a second control device associated with the following autonomous vehicle and comprising a second processor configured to: \nreceive, from the lead autonomous vehicle, the updated command; and \nnavigate the following autonomous vehicle according to the updated command. \n | 5. The system of claim 4, wherein the following autonomous vehicle comprises a first following autonomous vehicle and the system further comprising a second following autonomous vehicle, communicatively coupled with the first following autonomous vehicle, the second following autonomous vehicle is traveling along the road behind the first following autonomous vehicle, wherein: \nthe second processor is further configured to: \naccess a second set of environmental data associated with a portion of the road between the first following autonomous vehicle and the second following autonomous vehicle, wherein the second set of environmental data comprises location coordinates of a third plurality of objects between the first following autonomous vehicle and the second following autonomous vehicle; \ndetermine whether at least one object from the third plurality of objects impedes performing the updated command; \nin response to determining that the at least one object impedes performing the updated command, generate a second updated command by updating the updated command based at least in part upon the aggregate of the first and second sets of environmental data, such that the second updated command comprises one or more navigation instructions to avoid the at least one object from 
the third plurality of objects while performing the updated command; and \ncommunicate the second updated command to the second following autonomous vehicle. \n | 6. The system of claim 1, wherein: \nthe lead autonomous vehicle and the following autonomous vehicle are within a Vehicle-to-Vehicle communication range, and \nthe Vehicle-to-Vehicle communication range corresponds to a threshold distance of a Vehicle-to-Vehicle module implemented in the lead autonomous vehicle and the following autonomous vehicle to establish a communication path between the lead autonomous vehicle and the following autonomous vehicle. \n | 7. The system of claim 5, wherein the first processor is further configured to: \nin response to the lead autonomous vehicle and the second following autonomous vehicle being within a Vehicle-to-Vehicle communication range, generate the second updated command for the second following autonomous vehicle, based at least in part upon the aggregate of the first and second sets of environmental data; and \ncommunicate the second updated command to the second following autonomous vehicle. \n | 8. 
A method comprising: \nreceiving a command to navigate a lead autonomous vehicle to avoid an unexpected road condition ahead of the lead autonomous vehicle, wherein: \nthe lead autonomous vehicle comprises at least one vehicle sensor; and \nthe lead autonomous vehicle is configured to travel along a road; \nreceiving, from the at least one vehicle sensor, sensor data comprising location coordinates of a first plurality of objects ahead of the lead autonomous vehicle; \naccessing a first set of environmental data associated with a portion of the road between the lead autonomous vehicle and a following autonomous vehicle, wherein: \nthe following autonomous vehicle is communicatively coupled with the lead autonomous vehicle, and is traveling along the road behind the lead autonomous vehicle; and \nthe first set of environmental data comprises location coordinates of a second plurality of objects between the lead autonomous vehicle and the following autonomous vehicle; \ndetermining whether at least one object from the first and second plurality of objects impedes performing the command by the following autonomous vehicle; \nin response to determining that the at least one object impedes performing the command by the following autonomous vehicle, updating the command for the following autonomous vehicle, based at least in part upon the sensor data and the first set of environmental data, such that the updated command comprises one or more navigation instructions to avoid the at least one object while performing the command; and \ncommunicating the updated command to the following autonomous vehicle. \n | 9. 
The method of claim 8, further comprising: \ngenerating a second updated command by updating the command for navigating the lead autonomous vehicle, based at least in part upon the sensor data, such that the second updated command comprises one or more navigation instructions to avoid the first plurality of objects while performing the command; and \nnavigating the lead autonomous vehicle according to the second updated command. \n | 10. The method of claim 8, further comprising: \naccessing environmental data associated with a portion of the road ahead of the lead autonomous vehicle, wherein the environmental data is associated with a time window during which the lead autonomous vehicle is traveling along the road; \ncomparing the environmental data with map data that comprises expected road conditions ahead of the lead autonomous vehicle; \nbased at least in part upon comparing the environmental data with the map data, determining whether the environmental data comprises an unexpected road condition that is not included in the map data; and \nin response to determining that the environmental data comprises the unexpected road condition that is not included in the map data: \ndetermining a location coordinate of the unexpected road condition; and \ncommunicating the command to the lead autonomous vehicle to maneuver to avoid the unexpected road condition. \n | 11. 
The method of claim 8, further comprising: \ncomparing the sensor data with map data, wherein the map data comprises location coordinates of expected objects on the road ahead of the lead autonomous vehicle; \nbased at least in part upon comparing the sensor data with the map data, determining whether the sensor data indicates an unexpected object that is not in the map data; \nin response to determining that the sensor data indicates the unexpected object that is not in the map data: \ndetermining a location coordinate of the unexpected object; and \ndetermining a proposed navigation instruction for the lead autonomous vehicle to avoid the unexpected object. \n | 12. The method of claim 11, further comprising, in response to determining that the sensor data indicates the unexpected object that is not in the map data, performing the proposed navigation instruction.\n | 13. The method of claim 11, further comprising, in response to determining that the sensor data indicates the unexpected object that is not in the map data: \ncommunicating the proposed navigation instruction to an operation server; \ndetermining whether a confirmation is received from the operation server to perform the proposed navigation instruction; and \nin response to receiving the confirmation from the operation server, performing the proposed navigation instruction. \n | 14. The method of claim 11, further comprising, in response to determining that the sensor data indicates the unexpected object that is not in the map data: \ncommunicating the sensor data to an operation server; and \nrequesting the operation server to provide instructions to avoid the unexpected object based at least in part upon the sensor data. \n | 15. 
The method of claim 8, wherein the command is related to at least one of: \na transition, by the lead autonomous vehicle, from autonomous driving to manual driving; \navoiding, by the lead autonomous vehicle, obstacles on the road ahead of the lead autonomous vehicle; \navoiding, by the lead autonomous vehicle, one or more certain lanes on which one or more obstacles are detected; \navoiding, by the lead autonomous vehicle, one or more certain routes on which the unexpected road condition is detected; \ntaking, by the lead autonomous vehicle, a particular re-route; and \ndriving, by the lead autonomous vehicle, slower or faster than the speed indicated in a driving instruction associated with the lead autonomous vehicle. \n | 16. A non-transitory computer-readable medium storing instructions that when executed by one or more processors cause the one or more processors to: \nreceive a command to navigate a lead autonomous vehicle to avoid an unexpected road condition ahead of the lead autonomous vehicle, wherein: \nthe lead autonomous vehicle comprises at least one vehicle sensor; and \nthe lead autonomous vehicle is configured to travel along a road; \nreceive, from the at least one vehicle sensor, sensor data comprising location coordinates of a first plurality of objects ahead of the lead autonomous vehicle; \naccess a first set of environmental data associated with a portion of the road between the lead autonomous vehicle and a following autonomous vehicle, wherein: \nthe following autonomous vehicle is communicatively coupled with the lead autonomous vehicle, and is traveling along the road behind the lead autonomous vehicle; and \nthe first set of environmental data comprises location coordinates of a second plurality of objects between the lead autonomous vehicle and the following autonomous vehicle; \ndetermine whether at least one object from the first and second plurality of objects impedes performing the command by the following autonomous vehicle; \nin 
response to determining that the at least one object impedes performing the command by the following autonomous vehicle, update the command for the following autonomous vehicle, based at least in part upon the sensor data and the first set of environmental data, such that the updated command comprises one or more navigation instructions to avoid the at least one object while performing the command; and \ncommunicate the updated command to the following autonomous vehicle. \n | 17. The non-transitory computer-readable medium of claim 16, wherein: \nthe command comprises a broad command directed to the lead autonomous vehicle and one or more following autonomous vehicles that are behind the lead autonomous vehicle on the road; and \nthe broad command comprises one or more navigation instructions to avoid a particular unexpected road condition ahead of the lead autonomous vehicle. \n | 18. The non-transitory computer-readable medium of claim 16, wherein: \nthe command comprises a specific command directed to the lead autonomous vehicle; and \nthe specific command comprises one or more navigation instructions to avoid a particular unexpected road condition ahead of the lead autonomous vehicle. \n | 19. The non-transitory computer-readable medium of claim 16, wherein: \nthe command comprises a configuration command; and \nthe configuration command comprises at least one of changing a direction of the at least one vehicle sensor and changing a data sampling frequency of the at least one vehicle sensor. \n | 20. The non-transitory computer-readable medium of claim 16, wherein the at least one vehicle sensor comprises at least one of a camera, Light Detection and Ranging (LiDAR) sensor, motion sensor, and infrared sensor.", "output": "The system has a lead autonomous vehicle (AV) having a vehicle sensor. A following autonomous vehicle is communicatively coupled with the lead autonomous vehicle. A control device is associated with the vehicle. 
A processor receives a command to navigate the vehicle to avoid an unexpected road condition ahead of the vehicle. The processor receives, from the sensor, sensor data comprising location coordinates of a set of objects. The control device updates the command for the following vehicle based upon the sensor data and environmental data, such that the updated command comprises navigation instructions to avoid the object while performing the command, and communicates the updated command to the following vehicle. INDEPENDENT CLAIMS are included for: (1) a method for directing autonomous vehicle to autonomous vehicle communications; (2) a computer program with instructions for directing autonomous vehicle to autonomous vehicle communications. System for directing autonomous vehicle (AV) to autonomous vehicle communications. The system minimizes data transmission overhead on the network and improves tracking of the AVs. The drawing shows a block diagram of the system. 
102 Road; 110 Network; 120 Operation server; 130 Command; 850a Control device
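The command-update step recited above (check whether any object, from either the lead vehicle's sensor data or the environmental data for the corridor between the two vehicles, impedes the command, then append avoidance instructions) can be sketched roughly as follows. `Command`, `impeding_objects`, and the `lane`/`xy` fields are illustrative assumptions, not terminology from the patent.

```python
# Hedged sketch of the claimed command-update logic; all names are assumptions.
from dataclasses import dataclass, field

@dataclass
class Command:
    instructions: list = field(default_factory=list)

def impeding_objects(route_lane, lead_objects, corridor_objects):
    """Objects from either set whose reported lane coincides with the commanded lane."""
    return [o for o in lead_objects + corridor_objects if o["lane"] == route_lane]

def update_command(command, route_lane, lead_objects, corridor_objects):
    """Append avoidance instructions only when some object impedes the command."""
    blockers = impeding_objects(route_lane, lead_objects, corridor_objects)
    if not blockers:
        return command  # nothing impedes the following vehicle; command unchanged
    updated = Command(list(command.instructions))
    for obj in blockers:
        updated.instructions.append(f"avoid object at {obj['xy']}")
    return updated
```

A caller would then communicate `updated` to the following vehicle; the original command object is left untouched so it can still be executed as issued when no object impedes it.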
An autonomous vehicle (AV) route planning method, comprising: \nreceiving data indicating a risk on each of road segments of a road network; \ngenerating a risk aware map of the road network having a dynamic risk layer determined by the received data on each of the road segments of the road network; \ngenerating a set of feasible routes between an origin and destination, \nwherein each route in the set of feasible routes has an overall risk below a predetermined risk level; \nselecting an optimal route between the origin and destination from the set of feasible routes, \nwherein the optimal route prioritizes one or more desired parameters; and \ntransmitting the optimal route to the AV. \n | 2. The method of claim 1, wherein the overall risk is a weighted aggregate risk on each of the road segments comprising each route in the set of feasible routes.\n | 3. The method of claim 1, wherein the optimal route is transmitted to the AV through vehicle-to-vehicle or vehicle-to-infrastructure communication systems.\n | 4. The method of claim 1, wherein the one or more desired parameters are selected by a third party.\n | 5. The method of claim 1, wherein the data indicating the risk is received from vehicle-to-vehicle or vehicle-to-infrastructure communication systems.\n | 6. The method of claim 1, wherein the dynamic risk layer includes a level of severity of the risk on each of the road segments, wherein the level of severity is selected from a matrix of risk severities.\n | 7. 
An autonomous vehicle (AV) route planning system, comprising: \na receiver configured to receive data indicating a risk on each of road segments of a road network; \na mapper configured to generate a risk aware map of the road network having a dynamic risk layer determined by the received data on each of the road segments of the road network; \nan optimizer configured to determine a set of feasible routes between an origin and destination and to select an optimal route from the set of feasible routes, \nwherein each route in the set of feasible routes has an overall risk below a predetermined risk level and the optimal route prioritizes one or more desired parameters; and \na transmitter configured to transmit the optimal route to the AV. \n | 8. The system of claim 7, wherein the optimal route is transmitted to the AV through vehicle-to-vehicle or vehicle-to-infrastructure communication systems.\n | 9. The system of claim 7, wherein the optimal route is transmitted as an undirected graph representing the optimal route.\n | 10. The system of claim 7, wherein the one or more desired parameters are selected by a third party.\n | 11. The system of claim 7, wherein the one or more desired parameters cause the optimizer to minimize a fuel consumption of the AV.\n | 12. The system of claim 7, wherein the one or more desired parameters cause the optimizer to minimize travel time of the AV.\n | 13. The system of claim 7, wherein the dynamic risk layer includes a level of severity of the risk on each of the road segments, wherein the level of severity is selected from a matrix of risk severities.\n | 14. The system of claim 7, wherein the data indicating the risk is received from vehicle-to-vehicle or vehicle-to-infrastructure communication systems.\n | 15. 
An autonomous vehicle (AV) comprising: \na vehicle sensor subsystem, \nwherein the vehicle sensor subsystem senses the current environmental conditions surrounding a road segment; \na storage device for adding the current environmental conditions to a database of historical environmental conditions previously sensed of the road segment; \na transmitter configured to transmit the database of historical environmental conditions to an oversight system; and \na receiver configured to receive, from the oversight system, an optimal routing instruction specifying a path between an origin and destination, \nwherein the optimal routing instruction is selected from a set of routing instructions that have an overall risk for the path between the origin and destination below a predetermined risk level. \n | 16. The AV of claim 15, wherein the sensing of the current environmental conditions further comprises detecting an accident, adverse weather, or work construction.\n | 17. The AV of claim 15, wherein the transmitter transmits the database of historical environmental conditions through a roadside infrastructure unit or a vehicle-to-vehicle communication system.\n | 18. The AV of claim 15, wherein the receiver receives the optimal route through a roadside infrastructure unit or a vehicle-to-vehicle communication system.\n | 19. The AV of claim 15, wherein the optimal routing instruction minimizes a fuel consumption of the AV.\n | 20. The AV of claim 15, wherein the optimal routing instruction minimizes travel time of the AV.", "output": "The vehicle (105) has a vehicle sensor subsystem (144). The vehicle sensor subsystem senses the current environmental conditions surrounding a road segment. A storage device adds the current environmental conditions to a database of historical environmental conditions previously sensed of the road segment. A transmitter is configured to transmit the database of historical environmental conditions to an oversight system. 
A receiver is configured to receive an optimal routing instruction specifying a path between an origin and destination from the oversight system. The optimal routing instruction is selected from a set of routing instructions that have an overall risk for the path between the origin and destination below a predetermined risk level. INDEPENDENT CLAIMS are included for the following: an autonomous vehicle route planning method; and an autonomous vehicle route planning system. Autonomous vehicle, e.g. autonomous truck, used in an autonomous vehicle (AV) route planning system (claimed). Uses include but are not limited to semi tractor-trailer, 18 wheeler, lorry, class 8 vehicle, passenger vehicle, transport van, cargo van, recreational vehicle, golf cart and transport cart. The optimal routing instruction minimizes a fuel consumption of the AV. Transmitting the optimal route to the AV enables a vehicle control subsystem of the AV to control operation of the AV based on the optimal route. The drawing shows a block diagram of the autonomous vehicle route planning system. 
105 Autonomous vehicle; 140 Vehicle subsystems; 142 Vehicle drive subsystem; 144 Vehicle sensor subsystem; 160 Vehicle subsystem interface
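The feasibility-then-optimality selection in claims 1 and 2 (generate routes whose aggregate segment risk stays below a threshold, then pick the one that best serves a desired parameter such as travel time) could be modeled as a rough sketch like the following; the road-network representation and the names `plan_route`, `risk`, and `time` are illustrative assumptions, not the patent's terminology.

```python
# Hedged sketch of risk-aware route planning over an assumed edge-attribute graph.
def simple_paths(graph, node, dest, path=None):
    """Enumerate cycle-free routes from node to dest."""
    path = path or [node]
    if node == dest:
        yield path
        return
    for nxt in graph.get(node, {}):
        if nxt not in path:
            yield from simple_paths(graph, nxt, dest, path + [nxt])

def plan_route(graph, origin, dest, max_risk):
    """Return the feasible route (aggregate risk <= max_risk) with least travel time."""
    feasible = []
    for path in simple_paths(graph, origin, dest):
        edges = list(zip(path, path[1:]))
        risk = sum(graph[a][b]["risk"] for a, b in edges)  # aggregate risk over segments
        time = sum(graph[a][b]["time"] for a, b in edges)
        if risk <= max_risk:  # claim 1: every feasible route stays below the risk level
            feasible.append((time, path))
    return min(feasible)[1] if feasible else None  # prioritize the travel-time parameter
```

Swapping the `time` term for fuel consumption would model claims 11 and 19 instead; per-segment weights on `risk` would give the weighted aggregate of claim 2.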
While the AV is operating in the autonomous mode, the control device triggers the notification device to notify the vehicle that the AV is operating in the autonomous mode, where notifying that the AV is operating in the autonomous mode comprises presenting a visual notification and/or communicating a data message to other vehicles.|1. A system, comprising: \na control device associated with an autonomous vehicle configured to travel along a road, the control device comprising at least one processor configured to perform operations comprising: \noperating the autonomous vehicle in an autonomous mode; \ndefining a threshold region around the autonomous vehicle; \nreceiving sensor data from at least one vehicle sensor located on the autonomous vehicle; \ndetecting, based on the received sensor data, a presence of another vehicle; \ndetermining a distance between the autonomous vehicle and the other vehicle; \ndetermining, based on the distance between the autonomous vehicle and the other vehicle, that the other vehicle is within the threshold region; and \nnotifying the other vehicle that the autonomous vehicle is operating in the autonomous mode by at least presenting, by a notification device located on the autonomous vehicle, a visual notification. \n | 2. The system of claim 1, wherein notifying the other vehicle that the autonomous vehicle is operating in the autonomous mode further comprises communicating a data message to the other vehicle, \nwherein the data message indicates the autonomous vehicle is operating in the autonomous mode, and \nwherein the data message comprises an identifier associated with the autonomous vehicle. \n | 3. The system of claim 1, wherein the notification device comprises a flashing light source, and \nwherein presenting the visual notification comprises powering on the flashing light source. \n | 4. 
The system of claim 1, wherein the notification device comprises a display board including a two dimensional array of light emitting diodes, and \nwherein presenting the visual notification comprises displaying, on the display board, text indicating the autonomous vehicle is operating in the autonomous mode. \n | 5. The system of claim 1, wherein the notification device comprises a device configured to display an image indicating the autonomous vehicle is operating in the autonomous mode, and \nwherein presenting the visual notification comprises presenting the image. \n | 6. The system of claim 1, wherein one or more notification devices are located on one or more sides of the autonomous vehicle.\n | 7. The system of claim 1, wherein one or more notification devices are located on one or more rear view windows of the autonomous vehicle.\n | 8. A method, comprising: \noperating an autonomous vehicle in an autonomous mode; \ndefining a threshold region around the autonomous vehicle; \nreceiving sensor data from at least one vehicle sensor located on the autonomous vehicle; \ndetecting, based on the received sensor data, a presence of another vehicle; \ndetermining a distance between the autonomous vehicle and the other vehicle; \ndetermining, based on the distance between the autonomous vehicle and the other vehicle, that the other vehicle is within the threshold region; and \nnotifying the other vehicle that the autonomous vehicle is operating in the autonomous mode by at least presenting, by a notification device located on the autonomous vehicle, a visual notification. \n | 9. 
The method of claim 8, further comprising: \ndetecting, based on the received sensor data, a presence of another autonomous vehicle; \ndetermining a second distance between the autonomous vehicle and the other autonomous vehicle; \ndetermining that the second distance is less than a vehicle to vehicle communication range; and \ntransmitting, in response to determining the second distance is less than the vehicle to vehicle communication range, a message from the autonomous vehicle to the other autonomous vehicle. \n | 10. The method of claim 8, wherein the at least one vehicle sensor comprises at least one of a camera, a light detection and ranging sensor, or an infrared sensor.\n | 11. The method of claim 9, wherein detecting the presence of the other autonomous vehicle further comprises determining that there is no driver in the other autonomous vehicle.\n | 12. The method of claim 9, wherein detecting the presence of the other autonomous vehicle further comprises determining that a model of the other autonomous vehicle matches one of a plurality of autonomous vehicle models.\n | 13. The method of claim 9, wherein the message transmitted by the autonomous vehicle comprises a type of the autonomous vehicle, and \nwherein the type of the autonomous vehicle comprises a truck. \n | 14. The method of claim 9, wherein the message transmitted by the autonomous vehicle comprises a software version currently installed in a control device of the autonomous vehicle.\n | 15. The method of claim 9, wherein the message transmitted by the autonomous vehicle comprises a next navigation maneuver of the autonomous vehicle, and \nwherein the next navigation maneuver comprises at least one of changing to a particular lane at a particular time, taking a particular exit at a particular time, and continuing on the current lane for a particular time period. \n | 16. 
The method of claim 9, further comprising: \nin response to detecting the presence of the other autonomous vehicle, increasing a distance between the autonomous vehicle and the other autonomous vehicle. \n | 17. The method of claim 8, further comprising: \noperating the autonomous vehicle in a nonautonomous mode; and \nnotifying the other vehicle that the autonomous vehicle is operating in the nonautonomous mode. \n | 18. The method of claim 8, wherein presenting the visual notification comprises displaying, on a display board, text indicating the autonomous vehicle is operating in the autonomous mode, and \nwherein the display board comprises a two dimensional array of light emitting diodes. \n | 19. The method of claim 8, wherein presenting the visual notification further comprises triggering the notification device to power on a flashing light source.\n | 20. A non-transitory computer-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: \noperating an autonomous vehicle in an autonomous mode; \ndefining a threshold region around the autonomous vehicle; \nreceiving sensor data from at least one vehicle sensor located on the autonomous vehicle; \ndetecting, based on the received sensor data, a presence of another vehicle; \ndetermining a distance between the autonomous vehicle and the other vehicle; \ndetermining, based on the distance between the autonomous vehicle and the other vehicle, that the other vehicle is within the threshold region; and \nnotifying the other vehicle that the autonomous vehicle is operating in the autonomous mode by at least presenting, by a notification device located on the autonomous vehicle, a visual notification.", "output": "The system (100) has a control device (750) that is associated with an autonomous vehicle (702) configured to travel along a road. 
The control device comprises a processor (122) configured to operate the autonomous vehicle in an autonomous mode and to define a threshold region around the autonomous vehicle. The processor receives sensor data (130) from a vehicle sensor (746) located on the autonomous vehicle, detects a presence of another vehicle based on the received sensor data, and determines a distance between the autonomous vehicle and the other vehicle. The processor determines, based on the distance, that the other vehicle is within the threshold region, and notifies the other vehicle that the autonomous vehicle is operating in the autonomous mode by presenting a visual notification through a notification device located on the autonomous vehicle. INDEPENDENT CLAIMS are included for the following: a method for communicating a driving mode of an autonomous vehicle; and a non-transitory computer-readable medium storing instructions for communicating a driving mode of an autonomous vehicle. System for communicating a driving mode of an autonomous vehicle (AV), e.g. a truck. Notifying surrounding vehicles that the AV is operating in the autonomous mode allows drivers and other AVs to anticipate its behavior, so that the AV and the vehicles surrounding it travel more safely. The drawing shows a schematic diagram of the system. 
100 System; 122 Processor; 130 Sensor data; 702 Autonomous vehicle; 746 Vehicle sensor; 750 Control device
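The threshold-region test at the heart of claim 1 (notify only while in autonomous mode and only for vehicles whose distance falls inside the region) reduces to a distance check; a minimal sketch follows, where the planar coordinates and function names are assumptions for illustration only.

```python
import math

def within_threshold(av_xy, other_xy, threshold_m):
    """True if the detected vehicle lies inside the threshold region around the AV."""
    return math.hypot(other_xy[0] - av_xy[0], other_xy[1] - av_xy[1]) <= threshold_m

def notifications(av_xy, detected_xy, threshold_m, autonomous_mode):
    """Emit the visual notification text for each vehicle inside the region."""
    if not autonomous_mode:
        return []  # claim 17: a different notification applies in non-autonomous mode
    return ["AUTONOMOUS MODE"
            for v in detected_xy if within_threshold(av_xy, v, threshold_m)]
```

In the claimed system the notification would instead drive a display board or flashing light source (claims 3-5); the list returned here simply stands in for those triggers.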
An automatic driving assistance system, comprising a roadside assistance system and a vehicle-mounted assistance system, the roadside assistance system comprising a plurality of roadside sensors, a roadside device and a roadside Vehicle to Everything (V2X) communication device, and the vehicle-mounted assistance system comprising a vehicle-mounted V2X communication device, wherein: \neach of the plurality of roadside sensors is configured to collect environment information of an ambient environment and transmit the environment information to the roadside device; \nthe roadside device is configured to: \ngenerate a plurality of messages from the environment information collected by the plurality of roadside sensors, the plurality of messages including a plurality of pieces of sensed information, each piece of sensed information being generated from the environment information collected by a respective one of the plurality of roadside sensors, \ncombine the plurality of messages generated from the environment information collected by the plurality of roadside sensors into a single message at least by removing redundant sensed information based on a confidence level associated with each piece of sensed information, the single message comprising combined sensed information, and \ntransmit the combined sensed information to the roadside V2X communication device; \nthe roadside V2X communication device is configured to transmit the combined sensed information to the vehicle-mounted V2X communication device; and \nthe vehicle-mounted V2X communication device is configured to transmit the combined sensed information to a vehicle-mounted automatic driving system. \n | 2. 
The automatic driving assistance system of claim 1, wherein the roadside V2X communication device is configured to encapsulate the combined sensed information into a V2X communication message and transmit the V2X communication message to an air interface; and \nwherein the vehicle-mounted V2X communication device is configured to receive the V2X communication message from the air interface, parse the V2X communication message to obtain the combined sensed information, and transmit the combined sensed information to the vehicle-mounted automatic driving system. \n | 3. The automatic driving assistance system of claim 2, wherein the roadside device comprises a roadside communication interface, and the vehicle-mounted automatic driving system comprises a vehicle-mounted communication interface, and wherein: \nthe roadside communication interface is configured to transmit the combined sensed information to the roadside V2X communication device; \nthe vehicle-mounted V2X communication device transmitting the combined sensed information to the vehicle-mounted automatic driving system comprises transmitting the combined sensed information to the vehicle-mounted communication interface; and \nthe vehicle-mounted communication interface is configured to transmit the combined sensed information to respective functional modules of the vehicle-mounted automatic driving system. \n | 4. 
The automatic driving assistance system of claim 3, wherein: \nthe roadside communication interface is configured to encapsulate the combined sensed information into a Transmission Control Protocol (TCP)/User Datagram Protocol (UDP) message and transmit the TCP/UDP message to the roadside V2X communication device; \nthe roadside V2X communication device encapsulating the combined sensed information into the V2X communication message comprises: parsing the TCP/UDP message received from the roadside communication interface to obtain the combined sensed information and encapsulating the combined sensed information into the V2X communication message; \nthe vehicle-mounted V2X communication device transmitting the combined sensed information to the vehicle-mounted communication interface comprises: encapsulating the combined sensed information into a TCP/UDP message and transmitting the TCP/UDP message to the vehicle-mounted communication interface; and \nthe vehicle-mounted communication interface is configured to parse the TCP/UDP message received from the vehicle-mounted V2X communication device to obtain the combined sensed information. \n | 5. 
The automatic driving assistance system of claim 4, wherein the roadside communication interface encapsulating the combined sensed information into the TCP/UDP message comprises: \nserializing the combined sensed information in accordance with a predetermined serialization mechanism to obtain serial binary data; and \nencapsulating the serial binary data as payload data in the TCP/UDP message; and \n wherein the vehicle-mounted communication interface parsing the TCP/UDP message received from the vehicle-mounted V2X communication device to obtain the combined sensed information comprises: \nremoving TCP/Internet Protocol (IP) protocol stack format information; \nextracting the payload data from the TCP/UDP message; and \ndeserializing the payload data in accordance with a predetermined deserialization mechanism to obtain the combined sensed information. \n | 6. The automatic driving assistance system of claim 3, wherein the roadside communication interface is configured to transmit the combined sensed information to the roadside V2X communication device via a Universal Serial Bus (USB) interface or a serial interface; and \nthe vehicle-mounted V2X communication device transmitting the combined sensed information to the vehicle-mounted communication interface comprises: transmitting the combined sensed information to the vehicle-mounted communication interface via a USB interface or a serial interface. \n | 7. The automatic driving assistance system of claim 3, wherein each of the roadside device and the automatic driving system is based on a framework comprising open-source resources and configured to support data sharing; \nthe roadside device further comprises driving nodes each corresponding to one of the plurality of roadside sensors; \neach of the driving nodes treats the environment information collected by its corresponding roadside sensor as a message and posts the message in form of a topic. \n | 8. 
The automatic driving assistance system of claim 1, wherein each of the plurality of roadside sensors comprises one of: a camera, a laser radar, a millimeter wave radar, a positioning sensor, an illumination sensor, a temperature sensor, or a humidity sensor.\n | 9. A roadside assistance system, comprising a plurality of roadside sensors, a roadside device, and a roadside Vehicle to Everything (V2X) communication device, wherein: \neach of the plurality of roadside sensors is configured to collect environment information of an ambient environment and transmit the environment information to the roadside device; \nthe roadside device is configured to: \ngenerate a plurality of messages from the environment information collected by the plurality of roadside sensors, the plurality of messages including a plurality of pieces of sensed information, each piece of sensed information being generated from the environment information collected by a respective one of the plurality of roadside sensors, \ncombine the plurality of messages generated from the environment information collected by the plurality of roadside sensors into a single message at least by removing redundant sensed information based on a confidence level associated with each piece of sensed information, the single message comprising combined sensed information, and \ntransmit the combined sensed information to the roadside V2X communication device; and \nthe roadside V2X communication device is configured to transmit the combined sensed information to a vehicle-mounted V2X communication device. \n | 10. The roadside assistance system of claim 9, wherein the roadside V2X communication device is configured to encapsulate the combined sensed information into a V2X communication message and transmit the V2X communication message to an air interface.\n | 11. 
The roadside assistance system of claim 10, wherein the roadside device comprises a roadside communication interface, and wherein the roadside communication interface is configured to transmit the combined sensed information to the roadside V2X communication device.\n | 12. The roadside assistance system of claim 11, wherein the roadside communication interface is configured to encapsulate the combined sensed information into a Transmission Control Protocol (TCP)/User Datagram Protocol (UDP) message and transmit the TCP/UDP message to the roadside V2X communication device; and \nwherein the roadside V2X communication device encapsulating the combined sensed information into the V2X communication message comprises: \nparsing the TCP/UDP message received from the roadside communication interface to obtain the combined sensed information; and \nencapsulating the combined sensed information into the V2X communication message. \n | 13. The roadside assistance system of claim 12, wherein the roadside communication interface encapsulating the combined sensed information into the TCP/UDP message comprises: \nserializing the combined sensed information in accordance with a predetermined serialization mechanism to obtain serial binary data; and \nencapsulating the serial binary data as payload data in the TCP/UDP message. \n | 14. The roadside assistance system of claim 11, wherein the roadside communication interface is configured to transmit the combined sensed information to the roadside V2X communication device via a Universal Serial Bus (USB) interface or a serial interface.\n | 15. 
The roadside assistance system of claim 11, wherein: \nthe roadside device is based on a framework comprising open-source resources and configured to support data sharing; \nthe roadside device further comprises driving nodes each corresponding to one of the plurality of roadside sensors; and \neach of the driving nodes treats the environment information collected by its corresponding roadside sensor as a message and posts the message in form of a topic. \n | 16. The roadside assistance system of claim 9, wherein each of the plurality of roadside sensors comprises one of: a camera, a laser radar, a millimeter wave radar, a positioning sensor, an illumination sensor, a temperature sensor, or a humidity sensor.\n | 17. A vehicle-mounted assistance system, comprising a vehicle-mounted Vehicle to Everything (V2X) communication device connected to a vehicle-mounted automatic driving system, wherein: \nthe vehicle-mounted V2X communication device is configured to receive combined sensed information from a roadside V2X communication device and transmit the combined sensed information to the vehicle-mounted automatic driving system, \nthe roadside V2X communication device is coupled to a plurality of roadside sensors, wherein each of the plurality of roadside sensors is configured to collect environment information of an ambient environment and transmit the environment information to a roadside device, and \nthe roadside V2X communication device is coupled to the roadside device that is configured to: \ngenerate a plurality of messages from the environment information collected by the plurality of roadside sensors, the plurality of messages including a plurality of pieces of sensed information, each piece of sensed information being generated from the environment information collected by a respective one of the plurality of roadside sensors, and \ncombine the plurality of messages generated from the environment information collected by the plurality of roadside sensors into a
single message at least by removing redundant sensed information based on a confidence level associated with each piece of sensed information, the single message comprising combined sensed information. \n | 18. The vehicle-mounted assistance system of claim 17, wherein the vehicle-mounted V2X communication device is configured to receive the V2X communication message transmitted from the roadside V2X communication device from an air interface, parse the V2X communication message to obtain the combined sensed information and transmit the combined sensed information to the vehicle-mounted automatic driving system.\n | 19. The vehicle-mounted assistance system of claim 18, wherein the vehicle-mounted automatic driving system comprises a vehicle-mounted communication interface, and wherein: \nthe vehicle-mounted V2X communication device transmitting the combined sensed information to the vehicle-mounted automatic driving system comprises transmitting the combined sensed information to the vehicle-mounted communication interface; and \nthe vehicle-mounted communication interface is configured to transmit the combined sensed information to respective functional modules of the vehicle-mounted automatic driving system. \n | 20. The vehicle-mounted assistance system of claim 19, wherein the vehicle-mounted V2X communication device transmitting the combined sensed information to the vehicle-mounted communication interface comprises: encapsulating the combined sensed information into a Transmission Control Protocol (TCP)/User Datagram Protocol (UDP) message and transmitting the TCP/UDP message to the vehicle-mounted communication interface; and \nwherein the vehicle-mounted communication interface is configured to parse the TCP/UDP message received from the vehicle-mounted V2X communication device to obtain the combined sensed information. \n | 21. 
The vehicle-mounted assistance system of claim 20, wherein the vehicle-mounted communication interface parsing the TCP/UDP message received from the vehicle-mounted V2X communication device to obtain the combined sensed information comprises: \nremoving TCP/Internet Protocol (IP) protocol stack format information and extracting the payload data from the TCP/UDP message; and \ndeserializing the payload data in accordance with a predetermined deserialization mechanism to obtain the combined sensed information. \n | 22. The vehicle-mounted assistance system of claim 19, wherein the vehicle-mounted V2X communication device transmitting the combined sensed information to the vehicle-mounted communication interface comprises transmitting the combined sensed information to the vehicle-mounted communication interface via a Universal Serial Bus (USB) interface or a serial interface.\n | 23. The vehicle-mounted assistance system of claim 19, wherein the automatic driving system is based on a framework comprising open-source resources and configured to support data sharing; and \nwherein the vehicle-mounted communication interface treats the combined sensed information as a message and posts the message in form of a topic.", "output": "The system has a road side assistance system connected with road side sensors, a road side unit, a road side terminal and a vehicle-V2X communicating device. A vehicle terminal is connected with an auxiliary system that is provided with the vehicle-V2X communicating device. The road side sensors collect environment information of the surrounding environment and transmit the environment information to a road side device. The road side device combines the sensed information and transmits it to the vehicle-mounted V2X communicating device through the road side V2X communicating device. The vehicle-mounted V2X communicating device transmits the received sensing information to the automatic driving system.
INDEPENDENT CLAIMS are also included for the following: a road side auxiliary system; a vehicle-mounted auxiliary system. Automatic driving assistance system. The drawing shows a block diagram of an automatic driving assistance system. '(Drawing includes non-English language text)'", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes a cellular transceiver to communicate with a predetermined target; one or more antennas coupled to the 5G or 6G transceiver each electrically or mechanically steerable to the predetermined target; a processor to control a directionality of the one or more antennas in communication with the predetermined target; and an edge processing module coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. A system to perform edge processing for a predetermined target, comprising: \none or more cellular transceivers with one or more antennas that are electrically or mechanically steerable to the predetermined target; \na processor to control communication with the predetermined target; and \none or more edge processing modules coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target; and \na container to house the transceiver, processor running a virtual radio access network, and one or more edge processing modules, the container including a heat spreader coupled to the transceiver. \n | 2. The system of claim 1, wherein the container is deployed without a construction permit.\n | 3. The system of claim 1, wherein the processor is coupled to fiber optics cable to communicate with a cloud-based radio access network (RAN) or a remote RAN.\n | 4. The system of claim 1, comprising an antenna mast, wherein the antenna mast is inside the container or external to the container.\n | 5. 
The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 6. The system of claim 1, wherein the edge processing module and the antenna comprise one unit.\n | 7. The system of claim 1, comprising a cryogenic cooling system to cool the container.\n | 8. The system of claim 1, wherein the cellular transceiver comprises a 5G or 6G transceiver.\n | 9. The system of claim 1, wherein the processor coordinates beam sweeping by the one or more antennas with radio nodes or user equipment (UE) devices based upon service level agreement, performance requirement, traffic distribution data, networking requirements or prior beam sweeping history.\n | 10. The system of claim 9, wherein the beam sweeping is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices having a service agreement with a cellular provider.\n | 11. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, or a data plane to optimize 5G or 6G parameters.\n | 12. The system of claim 1, comprising one or more cameras and sensors to capture security information.\n | 13. The system of claim 1, wherein the container includes edge sensors including LIDAR and RADAR.\n | 14. The system of claim 1, comprising a camera for individual identity identification.\n | 15. The system of claim 1, wherein the edge processing module streams data to the predetermined target to minimize loading the target.\n | 16. The system of claim 1, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 17. 
The system of claim 1, comprising an edge learning machine in the housing to provide local edge processing for Internet-of-Things (IOT) sensors with reduced off-chip memory access.\n | 18. The system of claim 17, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. The system of claim 1, comprising a cloud trained neural network whose network parameters are down-sampled and filter count reduced before transferring to the edge neural network.", "output": "System comprises one or more cellular transceivers with one or more antennas that are electrically or mechanically steerable to predetermined target. The processor to control communication with the predetermined target. The one or more edge processing modules coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target. The container to house the transceiver, processor running a virtual radio access network, and one or more edge processing modules, the container including a heat spreader coupled to the transceiver. System that is used for performing edge processing for predetermined target (claimed). The system includes a cellular transceiver to communicate with a predetermined target, one or more antennas coupled to the 5G or 6G transceiver each electrically or mechanically steerable to the predetermined target, a processor to control a directionality of the one or more antennas in communication with the predetermined target, and an edge processing module coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target. 
The drawing shows a schematic view of an active antenna system.", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes a housing with one or more edge processors to handle processing on behalf of a mobile target or to provide local data to the mobile target or to provide artificial intelligence for the mobile target; one or more antennas coupled to the housing; and a processor to control a directionality of the antennas in communication with the mobile target using 5G or 6G protocols.What is claimed is: \n | 1. A system, comprising: \na mobile target to receive road or traffic parameters from one or more traffic sensors, road sensors or cameras; \na housing with one or more edge processors coupled to the one or more traffic sensors, road sensors or cameras to handle processing on behalf of the mobile target with a predetermined latency to provide augmented or virtual reality rendering data to the mobile target or to provide artificial intelligence for the mobile target, wherein the mobile target optimizes performance and power consumption by offloading augmented or virtual processing to the one or more edge processors and applying a received edge processing result within the predetermined latency to augment processing by the mobile target; one or more antennas coupled to the housing; and \na processor to communicate with the mobile target using 5G protocols. \n | 2. The system of claim 1, wherein the processor calibrates a radio link between a transceiver in the housing and a client device.\n | 3. The system of claim 1, wherein local data comprises images and wherein the processor process images from one or more mobile target cameras for location identification, ridesharing pick-up, or delivery.\n | 4. The system of claim 1, wherein local data comprises images and wherein the one or more edge processors detect real time hazard detection or road signs.\n | 5. 
The system of claim 1, wherein the processor moves actuators coupled to the antennas.\n | 6. The system of claim 1, wherein local data comprises weather or location data.\n | 7. The system of claim 1, wherein the one or more edge processors handle video content, healthcare, robotics, autonomous vehicle, augmented reality, virtual reality, extended reality, factory automation, gaming, asset tracking, or surveillance.\n | 8. The system of claim 1, wherein the mobile target receives high definition local road map data from the edge processors.\n | 9. The system of claim 1, wherein the local data comprises data affecting road conditions, and wherein the one or more edge processors provide traffic, transit, search, routing, telematics, weather, tracking, positioning, high-definition map, or geoenrichment data.\n | 10. The system of claim 1, wherein the one or more edge processors comprise one or more learning machines or neural networks.\n | 11. The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 12. The system of claim 1, wherein the one or more edge processors perform predictive analytics, consumer targeting, fraud detection, or demand forecast.\n | 13. The system of claim 1, comprising a camera and a processor for individual identity identification.\n | 14. The system of claim 1, wherein the one or more edge processors applies artificial intelligence to location data.\n | 15. The system of claim 1, wherein the one or more edge processors analyze sound, scent, or chemical data from sensors in the housing.\n | 16. The system of claim 1, comprising an edge learning machine in the housing to provide local edge processing for one or more Internet-of-Things (IOT) sensors.\n | 17. The system of claim 1, comprising an edge learning machine that uses pre-trained models and modifies the pre-trained models for a selected task.\n | 18. 
The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 19. A system for real-time resource allocation in a wireless network, comprising: \na mobile target to receive parameters from one or more sensors; \na housing with one or more edge processors coupled to one or more sensors to handle processing on behalf of a target with a predetermined latency or to provide artificial intelligence for a target; \ndetermining resource allocation in response to numbers of users and use cases on the wireless network; \napplying artificial intelligence (AI) to allocate resources based on real-time demand and network conditions for beam management, spectrum allocation, and scheduling function to handle resource allocation demands and resource utilization with AI to process data with a predetermined latency; \none or more antennas coupled to the housing; and \na processor to communicate with the mobile target using 5G protocols. \n | 20. 
A method in a wireless network, comprising: \nreceiving traffic parameters from one or more sensors or cameras; \nwith one or more edge processors coupled to the one or more sensors or cameras, processing on behalf of a target with a predetermined latency or providing artificial intelligence operation for the target; \ndetermining resource allocation in response to numbers of users and use cases on the wireless network; and \napplying AI to a physical layer (PHY) to perform digital predistortion, channel estimation, and channel resource optimization; \nadjusting transceiver parameters for optimizing resource allocation with the applied AI at the PHY.", "output": "System comprises a mobile target to receive road or traffic parameters from traffic sensors, road sensors, or cameras, a housing with edge processors coupled to the traffic sensors, road sensors, or cameras to handle processing on behalf of the mobile target with a predetermined latency to provide reality rendering data to the mobile target or to provide artificial intelligence for the mobile target. The mobile target optimizes performance and power consumption by offloading processing to the edge processors and applying a received edge processing result within the predetermined latency to augment processing by the mobile target. Antennas (11) are coupled to the housing. A processor is provided to communicate with the mobile target using fifth-generation or sixth-generation protocols. System, preferably cellular system e.g. fifth generation (5G) cellular system, long term evolution system and worldwide interoperability for microwave access system. 
The liquid lens antenna system comprises a liquid lens that is provided with a moveable surface, where liquid is added or removed to adjust the curvature of the moveable surface and an antenna is mounted on the moveable surface to change a direction of the antenna toward a predetermined target, thus improving the signal-to-noise ratio of the wireless communication system. The drawing shows a schematic view of the manhole cover with a small cell and steerable antennas. 11 Antennas; 22, 24 Latches ", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes a housing with one or more edge processors to handle processing on behalf of a mobile target or to provide local data to the mobile target or to provide artificial intelligence for the mobile target; one or more antennas coupled to the housing; and a processor to control a directionality of the antennas in communication with the mobile target using 5G or 6G protocols.What is claimed is: \n | 1. A system, comprising: \na housing with one or more edge processors to handle processing on behalf of a mobile target with a predetermined latency to provide reality rendering data to the mobile target or to provide artificial intelligence for the mobile target, wherein the mobile target optimizes performance and power consumption by offloading processing to the one or more edge processors and applying a received edge processing result within the predetermined latency to augment processing by the mobile target; \none or more sensors to capture local data; \none or more antennas coupled to the housing; and \na processor to communicate with the mobile target using 5G or 6G protocols. \n | 2. The system of claim 1, wherein the processor calibrates a radio link between a transceiver in the housing and a client device.\n | 3. 
The system of claim 1, wherein the local data comprises images and wherein the processor process images from one or more mobile target cameras for location identification, ridesharing pick-up, or delivery.\n | 4. The system of claim 1, wherein the local data comprises images and wherein the one or more edge processors detect real time hazard detection or road signs.\n | 5. The system of claim 1, wherein the processor moves actuators coupled to the antennas.\n | 6. The system of claim 1, wherein the local data comprises weather or location data.\n | 7. The system of claim 1, wherein the one or more edge processors handle video content, healthcare, robotics, autonomous vehicle, augmented reality, virtual reality, extended reality, factory automation, gaming, asset tracking, or surveillance.\n | 8. The system of claim 1, wherein the mobile target comprises plant or manufacturing equipment.\n | 9. The system of claim 1, wherein the local data comprises data affecting road conditions, and wherein the one or more edge processors provide traffic, transit, search, routing, telematics, weather, tracking, positioning, high-definition map, or geo-enrichment data.\n | 10. The system of claim 1, wherein processor focuses 5G signals to the target with iterative changes in electrical or mechanical orientation of the one or more antennas.\n | 11. The system of claim 1, wherein the edge processors comprise one or more learning machines or neural networks.\n | 12. The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 13. The system of claim 1, wherein the one or more edge processors perform predictive analytics, consumer targeting, fraud detection, or demand forecast.\n | 14. The system of claim 1, comprising a camera and a processor for individual identity identification.\n | 15. The system of claim 1, wherein the one or more edge processors applies artificial intelligence to location data.\n | 16. 
The system of claim 1, wherein the one or more edge processors analyze sound, scent, or chemical data from sensors in the housing.\n | 17. The system of claim 1, comprising an edge learning machine in the housing to provide local edge processing for one or more Internet-of-Things (IOT) sensors.\n | 18. The system of claim 1, comprising an edge learning machine that uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 1, comprising a cellular device for a person crossing a street near the city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. The system of claim 1, comprising a cloud trained neural network whose network parameters are reduced before transfer to the edge neural network.\n | 21. A processing method, comprising: \nwirelessly receiving at a pole, tower, or container a remote processing request from a mobile device with a predetermined latency; \ncapturing local data with sensors coupled to the pole, tower, or container; \noffloading processing of the remote processing request with the local data to the one or more edge processors; \ncompleting the remote processing request by receiving a result from one or more edge processors at the pole, tower, or container within the predetermined latency and applying the result to augment processing by the mobile device to provide reality rendering data to the mobile target or to augment artificial intelligence processing for the mobile target while optimizing performance and power consumption by the mobile device.", "output": "The cellular system includes a housing with one or more edge processors to handle processing on behalf of a mobile target or to provide local data to the mobile target or to provide artificial intelligence for the mobile target. One or more antennas (11) are coupled to the housing. 
A processor controls a directionality of the antennas in communication with the mobile target using 5G or 6G protocols. Cellular system, e.g., manhole cover or security camera. The receive and transmit digital beam former (DBF) coefficients are adjusted to help maintain an improved or maximum signal quality, to help reduce or minimize in-band interference and to help maximize receive power level. By integrating the remote radio head functionality into the antenna, the esthetics of the site can be improved and wind load reduced, resulting in lower leasing and installation costs. The drawing shows views of a manhole cover with a small cell and steerable antennas. 4 Manhole cover body; 11 Antennas; 22 Latches; 26C Latching control surface; 30 Bands
The system of claim 1, comprising a parser that receives classical specification and data and determines if a portion of such specification runs on a quantum computer, and if so maps classical specification to quantum algorithm and the modified code is provided to an execution unit that selects one or more quantum computers, one or more classical processor, one or more graphical processing units (GPUs), or one or more neuromorphic processors.\n | 4. The system of claim 1, wherein the processor calibrates a connection by analyzing RSSI and TSSI and moves the antennas until predetermined cellular parameters are reached.\n | 5. The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 6. The system of claim 1, wherein the edge processing module and the antenna are enclosed in a housing or shipping container, or the edge processing module is in a separate shipping container adjacent the antenna.\n | 7. The system of claim 1, wherein the transceiver comprises a 5G or 6G cellular transceiver.\n | 8. The system of claim 1, wherein the edge processing module communicates at a plurality of AI selected frequency with AI selected frequency hopping to use a full frequency allocation.\n | 9. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \na beam sweeping module controlling the antenna in accordance with one of: artificial intelligence frequency hopping selection to maximize bandwidth allocation, a service level agreement, a performance requirement, a traffic distribution data, a networking requirement or prior beam sweeping history. \n | 10. 
The system of claim 9, wherein the beam sweeping is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices performing similar functions.\n | 11. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, and a data plane to optimize 5G parameters.\n | 12. The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 13. The system of claim 1, comprising edge sensors including LIDAR and RADAR.\n | 14. The system of claim 1, comprising a camera for individual identity identification.\n | 15. The system of claim 1, wherein the edge processing module streams data to the predetermined target to minimize loading the target.\n | 16. The system of claim 1, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 17. The system of claim 1, comprising an edge learning machine in a housing or shipping container to provide local edge processing for Internet-of-Things (IOT) sensors.\n | 18. The system of claim 17, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. 
The system of claim 1, comprising a cloud trained neural network whose network parameters are down-sampled and filter count reduced before transferring to the edge neural network.", "output": "System has a transceiver (124) to communicate with a predetermined target, and a set of antennas coupled to the transceiver, where each antenna (11) is electrically or mechanically steerable to the predetermined target. An artificial intelligence (AI) software allocates slices of frequencies to different wireless devices to optimize frequency utilization across available bandwidth. An edge processing module, e.g. a neural network (102), is coupled to the transceiver and the antennas to provide low-latency computation for the target. A parser receives classical specification and data and determines if a portion of the specification runs on a quantum computer. System for providing local edge processing for Internet-of-Things (IOT) sensors with reduced off-chip memory access. The system allows a fleet of drones to operate and navigate as a flock of birds to provide real-time adjustment in coverage as needed. The drones provide their own power and autonomous navigation and self-assemble and scatter as needed to avoid physical and wireless communication obstacles. Preferred Components: The edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD). The transceiver comprises a fifth-generation (5G) or sixth-generation (6G) cellular transceiver. The edge processing module communicates at a plurality of artificial intelligence (AI) selected frequencies with AI selected frequency hopping to use a full frequency allocation. The edge sensors include light detection and ranging (LIDAR) and radio detection and ranging (RADAR). 
The drawing shows an exemplary fourth generation-fifth generation network.11Antenna 102Network 106Communications tower 110Technician 124Transceiver ", "instruction": "Please summarize the input" }, { "input": "Cellular communicationA system includes a cellular transceiver to communicate with a predetermined target; one or more antennas coupled to the 5G or 6G transceiver each electrically or mechanically steerable to the predetermined target; a processor to control a directionality of the one or more antennas in communication with the predetermined target; and an edge processing module coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. A system to perform edge processing for a predetermined target, comprising: \none or more cellular transceivers with one or more antennas that are electrically or mechanically steerable to the predetermined target; \na processor to control communication with the predetermined target; and \none or more edge processing modules coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target; and \na container to house the transceiver, processor running a virtual radio access network, and one or more edge processing modules, the container moveable to a location requiring increased edge processing. \n | 2. The system of claim 1, wherein the container fits requirement to be deployed without a construction permit.\n | 3. The system of claim 1, wherein the processor is coupled to fiber optics cable to communicate with a cloud-based radio access network (RAN) or a remote RAN.\n | 4. The system of claim 1, comprising an antenna mast, wherein the antenna mast is inside the container or external to the container.\n | 5. 
The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 6. The system of claim 1, wherein the edge processing module and the antenna comprise one unit.\n | 7. The system of claim 1, comprising a cryogenic cooling system to cool the container.\n | 8. The system of claim 1, wherein the cellular transceiver comprises a 5G or 6G transceiver.\n | 9. The system of claim 1, wherein the processor coordinates beam sweeping by the one or more antennas with radio nodes or user equipment (UE) devices based upon service level agreement, performance requirement, traffic distribution data, networking requirements or prior beam sweeping history.\n | 10. The system of claim 9, wherein the beam sweeping is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices having a service agreement with a cellular provider.\n | 11. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, or a data plane to optimize 5G or 6G parameters.\n | 12. The system of claim 1, comprising one or more cameras and sensors to capture security information.\n | 13. The system of claim 1, wherein the container includes edge sensors including LIDAR and RADAR.\n | 14. The system of claim 1, comprising a camera for individual identity identification.\n | 15. The system of claim 1, wherein the edge processing module streams data to the predetermined target to minimize loading the target.\n | 16. The system of claim 1, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 17. 
The system of claim 1, comprising an edge learning machine in the housing to provide local edge processing for Internet-of-Things (IOT) sensors with reduced off-chip memory access.\n | 18. The system of claim 17, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. The system of claim 1, comprising a cloud trained neural network whose network parameters are reduced before transferring to an edge neural network in the container.", "output": "System (1) comprises one or more cellular transceivers with one or more antennas that are electrically or mechanically steerable to the predetermined target, a processor to control communication with the predetermined target; and one or more edge processing modules coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target, and a container to house the transceiver, processor running a virtual radio access network, and one or more edge processing modules, the container including a heat spreader coupled to the transceiver. System for performing edge processing for predetermined target. The 5G or 6G transceiver can be part of a portable computer, laptop computer, tablet computer, brief case, or any utensil/appliance that can be away from the body to reduce RF energy on the human body, but still linked to the display and mike/speaker to act as a UI for the user. 
The drawing shows an exemplary city light small cell environment with crime/pollution sniffing capabilities.1System 10Computing unit 11Light post 18Manhole cover support surface 19User interface ", "instruction": "Please summarize the input" }, { "input": "Computing systemA system includes a transceiver to communicate with a predetermined target; one or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and an edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; \nan edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target; and \na parser that receives classical specification and data and determines if a portion of such specification runs on a quantum computer, and if so maps classical specification to quantum algorithm and selects code execution from one or more quantum computers, one or more classical processor, one or more graphical processing units (GPUs), or one or more neuromorphic processors. \n | 2. The system of claim 1, wherein the transceiver comprises a 5G or 6G cellular transceiver.\n | 3. 
A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \na beam sweeping module controlling the antenna in accordance with one of: a service level agreement, a performance requirement, a traffic distribution data, a networking requirement or prior beam sweeping history; and \nan edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD). \n | 4. The system of claim 3, wherein the beam sweeping is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices performing similar functions.\n | 5. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \nan edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target; and \na neural network or a learning machine coupled to a control plane, a management \nplane, and a data plane to optimize 5G parameters. \n | 6. The system of claim 5, comprising a quantum computer coupled to the edge processing module.\n | 7. 
A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \nan edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD). \n | 8. The system of claim 7, wherein the processor calibrates a connection by analyzing RSSI and TSSI and moves the antennas until predetermined cellular parameters are reached.\n | 9. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \nan edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target; and \na cloud trained neural network whose network parameters are down-sampled or filter count reduced before transferring to the edge neural network. \n | 10. The system of claim 9, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 11. The system of claim 9, wherein the edge processing module and the antenna are enclosed in a housing or shipping container, or the edge processing module is in a separate shipping container adjacent the antenna.\n | 12. 
The system of claim 9, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 13. The system of claim 9, comprising one or more cameras and sensors in the housing to capture security information.\n | 14. The system of claim 9, comprising edge sensors including LIDAR and RADAR.\n | 15. The system of claim 9, comprising a camera for individual identity identification.\n | 16. The system of claim 9, wherein the edge processing module streams data to the predetermined target to minimize loading the target.\n | 17. The system of claim 9, comprising an edge learning machine in a housing or shipping container to provide local edge processing for Internet-of-Things (IoT) sensors.\n | 18. The system of claim 17, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 9 comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. The system of claim 9, wherein the edge processing module comprises a learning machine, a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).", "output": "The system has a transceiver to communicate with a predetermined target, and antennas (11) coupled to the transceiver, where each antenna is electrically or mechanically steerable to the predetermined target. An edge processing module e.g. neural network, is coupled to transceiver and the antennas to provide low-latency computation for the target. 
A parser receives classical specification and data, and determines if a portion of the specification runs on a quantum computer. A beam sweeping module controls the antenna in accordance with one of a service level agreement, a performance requirement, a traffic distribution data, a networking requirement or prior beam sweeping history. System for providing local edge processing for internet-of-things (IOT) sensors with reduced off-chip memory access and low latency. Uses include but are not limited to light poles, top of buildings, street lights, autonomous vehicles and virtual reality devices. The system allows a fleet of drones to operate and navigate as a flock of birds to provide real-time adjustment in coverage as needed. The system uses smart antenna techniques to support higher data rate and coverage in an effective manner. The flock-of-birds antenna has power and autonomous navigation and can self-assemble and scatter as needed to avoid physical and wireless communication obstacles. The drawing shows a schematic view of an exemplary man-hole cover with a small cell and steerable antennas.11Antenna 30Bands ", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes a housing with one or more edge processors to handle processing on behalf of a mobile target or to provide local data to the mobile target or to provide artificial intelligence for the mobile target; one or more antennas coupled to the housing; and a processor to control a directionality of the antennas in communication with the mobile target using 5G or 6G protocols.What is claimed is: \n | 1. 
A system to communicate a local data with a mobile target, the local data including road or traffic parameters from one or more remote traffic sensors, road sensors or cameras, the system comprising: \na fixed position housing with one or more edge processors wirelessly coupled to the one or more traffic sensors, road sensors or cameras to handle processing on behalf of the mobile target with a predetermined latency to provide virtual, augment or extended reality (AR/VR/XR) processing to the mobile target or to provide artificial intelligence (AI) processing for the mobile target, wherein the mobile target offloads the AR/VR/XR or AI processing as applied to the local data to the one or more edge processors and wherein the mobile target applies a received edge processing result within the predetermined latency to augment processing by the mobile target; and \none or more antennas coupled to the housing to communicate with the mobile target using 5G or 6G protocols. \n | 2. The system of claim 1, wherein the processor calibrates a radio link between a transceiver in the housing and a client device.\n | 3. The system of claim 1, wherein the local data comprises images and wherein the processor processes images from one or more mobile target cameras for location identification, ridesharing pick-up, or delivery.\n | 4. The system of claim 1, wherein the local data comprises images and wherein the one or more edge processors detect real time hazard detection or road signs.\n | 5. The system of claim 1, wherein the processor moves actuators coupled to the antennas.\n | 6. The system of claim 1, wherein the local data comprises weather or location data.\n | 7. The system of claim 1, wherein the one or more edge processors handle video content, healthcare, robotics, autonomous vehicle, augmented reality, virtual reality, extended reality, factory automation, gaming, asset tracking, or surveillance.\n | 8. 
The system of claim 1, wherein the mobile target receives high definition local road map data from the edge processors.\n | 9. The system of claim 1, wherein the local data comprises data affecting road conditions, and wherein the one or more edge processors provide traffic, transit, search, routing, telematics, weather, tracking, positioning, high-definition map, or geo-enrichment data.\n | 10. The system of claim 1, wherein the processor focuses 5G or 6G signals to the target with iterative changes in electrical or mechanical orientation of the one or more antennas.\n | 11. The system of claim 1, wherein the edge processors comprise one or more learning machines or neural networks.\n | 12. The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 13. The system of claim 1, wherein the one or more edge processors perform predictive analytics, consumer targeting, fraud detection, or demand forecast.\n | 14. The system of claim 1, comprising a camera and a processor for individual identity identification.\n | 15. The system of claim 1, wherein the one or more edge processors applies artificial intelligence to location data.\n | 16. The system of claim 1, wherein the one or more edge processors analyze sound, scent, or chemical data from sensors in the housing.\n | 17. The system of claim 1, comprising an edge learning machine in the housing to provide local edge processing for one or more Internet-of-Things (IoT) sensors.\n | 18. The system of claim 1, comprising an edge learning machine that uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 1, comprising a cellular device for a person crossing a street near the city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. 
The system of claim 1, comprising a cloud trained neural network whose network parameters are reduced before transfer to the edge neural network.", "output": "The system has a mobile target which receives the road or traffic parameters from traffic sensors, road sensors or cameras. A housing is provided with edge processors which are coupled to the sensors to handle processing on behalf of the target with a predetermined latency to provide reality rendering data to the target or to provide artificial intelligence for the target. The target optimizes performance and power consumption by offloading processing to the processors and applying a received edge processing result within the latency to augment processing by the target. Antennas (11) are coupled to a housing. A processor communicates with the target using Fifth-generation (5G) or Sixth-generation (6G) protocols. Cellular system e.g. Fifth-generation (5G) cellular system. The liquid lens antenna system comprises a liquid lens that is provided with a moveable surface, where liquid is added or removed to adjust the curvature of the movable surface and an antenna is mounted on the moveable surfaces to change a direction of the antenna to a predetermined target, and thus enables to improve the signal-to-noise ratio of the wireless communication system. The drawing shows the perspective view and top view of the man-hole cover with a small cell and steerable antennas.4Manhole cover portion 10APeripheral edge portion 11Antenna 22Latch 32Digital beam former network ", "instruction": "Please summarize the input" }, { "input": "Wireless Communication SystemA system includes a transceiver to communicate with a predetermined target; one or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and an edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. 
A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \nan AI processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target based on AI determination. \n | 2. The system of claim 1, comprising a quantum computer coupled to the edge processing module.\n | 3. The system of claim 1, comprising a parser that receives classical specification and data and determines if a portion of such specification runs on a quantum computer, and if so maps classical specification to quantum algorithm and the modified code is provided to an execution unit that selects one or more quantum computers, one or more classical processor, one or more graphical processing units (GPUs), or one or more neuromorphic processors.\n | 4. The system of claim 1, wherein the processor calibrates a connection by analyzing RSSI and TSSI and moves the antennas until predetermined cellular parameters are reached.\n | 5. The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 6. The system of claim 1, wherein the edge processing module and the antenna are enclosed in a housing or shipping container, or the edge processing module is in a separate shipping container adjacent the antenna.\n | 7. The system of claim 1, wherein the transceiver comprises a 5G or 6G cellular transceiver.\n | 9. 
A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \na beam sweeping module controlling the antenna in accordance with one of: a service level agreement, a performance requirement, a traffic distribution data, a networking requirement or prior beam sweeping history. \n | 10. The system of claim 9, wherein the beam sweeping is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices performing similar functions.\n | 11. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, and a data plane to optimize 5G parameters.\n | 12. The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 13. The system of claim 1, comprising edge sensors including LIDAR and RADAR.\n | 14. The system of claim 1, comprising a camera for individual identity identification.\n | 15. The system of claim 1, wherein the edge processing module streams data to the predetermined target to minimize loading the target.\n | 16. The system of claim 1, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 17. The system of claim 1, comprising an edge learning machine in a housing or shipping container to provide local edge processing for Internet-of-Things (IoT) sensors.\n | 18. The system of claim 17, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. 
The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. The system of claim 1, comprising a cloud trained neural network whose network parameters are down-sampled and filter count reduced before transferring to the edge neural network.", "output": "The system has a transceiver to communicate with a predetermined target. Antennas are coupled to the transceiver, where each antenna is electrically or mechanically steerable to the predetermined target, and a beam sweeping module controls the antenna in accordance with one of a service level agreement, a performance requirement, a traffic distribution data, a networking requirement or prior beam sweeping history. An artificial intelligence (AI) processing module provides low-latency computation for the target based on AI determination. A quantum computer is coupled to an edge processing module. A parser receives classical specification and data and determines if a portion of the specification runs on the computer. System for providing local edge processing for Internet-of-Things (IoT) sensors with reduced off-chip memory access. Uses include but are not limited to a pole, a building, or a light. The focusing of the 5G signals to the target client/device can be automatically done using processor with iterative changes in the orientation of the antenna by changing the curvature or shape of the surface until predetermined criteria is achieved such as the best transmission speed, TSSI RSSI, and signal-to-noise ratio (SNR). The fleet of drones can operate and navigate as a flock of birds to provide real time adjustment in coverage as needed. The flocks of birds antenna has power and autonomous navigation and can self-assemble and scatter as needed to avoid physical and wireless communication obstacles. 
The drawing shows an exemplary 5G network architecture.", "instruction": "Please summarize the input" }, { "input": "Computing systemA system includes a transceiver to communicate with a predetermined target; one or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and an edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; \nan edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target; and \na quantum computer coupled to the edge processing module. \n | 2. The system of claim 1, comprising a parser that receives classical specification and data and determines if a portion of the classical specification runs on the quantum computer, and if so maps the classical specification to a quantum algorithm which is provided to an execution unit that selects one or more quantum processors, one or more classical processors, one or more graphical processing units (GPUs), or one or more neuromorphic processors.\n | 3. The system of claim 1, wherein a processor calibrates a connection by analyzing RSSI and TSSI and moves the antennas until predetermined cellular parameters are reached.\n | 4. The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a quantum processor, a statistical engine, or a programmable logic device (PLD).\n | 5. 
The system of claim 1, wherein the edge processing module and the one or more antennas are enclosed in a housing or shipping container, or the edge processing module is in a separate shipping container adjacent the one or more antennas.\n | 6. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; \nan edge processing module with a learning machine or neural network; and \na beam sweeping module controlling the one or more antennas in accordance with one of: a service level agreement, a performance requirement, a traffic distribution data, a networking requirement or prior beam sweeping history. \n | 7. The system of claim 6, wherein the beam sweeping module is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices performing similar functions.\n | 8. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; \nan edge processing module including a learning machine or a neural network coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target. \n | 9. The system of claim 8, wherein the transceiver comprises a 5G or 6G cellular transceiver, and wherein the predetermined target comprises a plurality of antennas coupled to a case to receive signals from the transceiver.\n | 10. The system of claim 8, comprising one or more cameras and sensors in the housing to capture security information.\n | 11. The system of claim 8, comprising edge sensors including LIDAR and RADAR.\n | 12. The system of claim 8, comprising a camera for individual identity identification.\n | 13. 
The system of claim 8, wherein the edge processing module streams data to the predetermined target to minimize loading the predetermined target.\n | 14. The system of claim 8, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 15. The system of claim 8, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 16. The system of claim 8, comprising a cloud trained neural network whose network parameters are down-sampled and filter count reduced before transferring to the edge processing module neural network.\n | 17. The system of claim 8, comprising at least a sensor, a camera, or a microphone in communication with the edge processing module.\n | 18. The system of claim 8, wherein the edge processing module offloads processing for a vehicle, a drone, a reality display, a virtual reality display, an augmented reality display, an extended reality display, a game device, a healthcare device, or a manufacturing device.\n | 19. The system of claim 8, wherein the edge processing module protects privacy or secures data communication with the transceiver.\n | 20. 
A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \na module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target, the module including an edge learning machine in a housing or shipping container to provide local edge processing for Internet-of-Things (IOT) sensors. \n | 21. The system of claim 20, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.", "output": "The system comprises a transceiver that is provided to communicate with a predetermined target. A number of antennas (11) are coupled to the transceiver, which is electrically or mechanically steerable to the predetermined target. An edge processing module is coupled to the transceiver and antennas to provide low-latency computation for the predetermined target. A quantum computer is coupled to the edge processing module. A parser that receives classical specification and data and determines when a portion of the classical specification runs on the quantum computer. The edge processing module and antennas are enclosed in a housing. System used to select network resources, such as communication links, network devices, core network, and a data center. The esthetics of the site improves, wind load reduces, and lower installation costs by integrating the remote radio head functionality into the antenna. The drawing shows a schematic view of a network architecture. 
11 Antennas; 102 Network; 104 Base station; 106 Communications tower; 108 Administrator computing device", "instruction": "Please summarize the input" }, { "input": "COMPUTING SYSTEMA system includes a transceiver to communicate with a predetermined target; one or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and an edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \nan edge processing module coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target. \n | 2. The system of claim 1, comprising a quantum computer coupled to the edge processing module.\n | 3. The system of claim 1, comprising a parser that receives classical specification and data and determines if a portion of such specification runs on a quantum computer, and if so maps classical specification to quantum algorithm and the modified code is provided to an execution unit that selects one or more quantum computers, one or more classical processor, one or more graphical processing units (GPUs), or one or more neuromorphic processors.\n | 4. The system of claim 1, wherein the processor calibrates a connection by analyzing RSSI and TSSI and moves the antennas until predetermined cellular parameters are reached.\n | 5. The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 6. 
The system of claim 1, wherein the edge processing module and the antenna are enclosed in a housing or shipping container, or the edge processing module is in a separate shipping container adjacent the antenna.\n | 7. The system of claim 1, wherein the transceiver comprises a 5G or 6G cellular transceiver.\n | 9. A system, comprising: \na transceiver to communicate with a predetermined target; \none or more antennas coupled to the transceiver each electrically or mechanically steerable to the predetermined target; and \na beam sweeping module controlling the antenna in accordance with one of: a service level agreement, a performance requirement, a traffic distribution data, a networking requirement or prior beam sweeping history. \n | 10. The system of claim 9, wherein the beam sweeping is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices performing similar functions.\n | 11. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, and a data plane to optimize 5G parameters.\n | 12. The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 13. The system of claim 1, comprising edge sensors including LIDAR and RADAR.\n | 14. The system of claim 1, comprising a camera for individual identity identification.\n | 15. The system of claim 1, wherein the edge processing module streams data to the predetermined target to minimize loading the target.\n | 16. The system of claim 1, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a quantum computer, a statistical engine, or a programmable logic device (PLD).\n | 17. 
The system of claim 1, comprising an edge learning machine in a housing or shipping container to provide local edge processing for Internet-of-Things (IOT) sensors.\n | 18. The system of claim 17, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. The system of claim 1, comprising a cloud trained neural network whose network parameters are down-sampled and filter count reduced before transferring to the edge neural network.", "output": "The computing system has a transceiver to communicate with a predetermined target. One or more antennas are coupled to the transceiver and each electrically or mechanically steerable to the predetermined target. An edge processing module is coupled to the transceiver and one or more antennas to provide low-latency computation for the predetermined target. Computing system. The system provides potentially unlimited ability to process components and tasks at least because processing of the components and tasks may be divided up across the network in an intelligent and meaningful way that facilitates efficient use of resources and/or an improved user experience. By intelligently and dynamically determining how to assign processing of components and tasks based on predefined factors, the edge processing may adjust how components and tasks are performed on demand and in a manner that may facilitate efficient use of resources and/or an improved user experience. Edge computing ensures high quality of experience with highly contextualized service experience and efficient utilization of radio and network resources. The drawing shows different views of a manhole cover with a small cell and steerable antennas. 
4 Manhole cover body; 11 Antennas; 20 Latch mechanism; 26C Latching control surface; 30 Bands", "instruction": "Please summarize the input" }, { "input": "CELLULAR SYSTEMA system includes a cellular transceiver to communicate with a predetermined target; one or more antennas coupled to the 5G or 6G transceiver each electrically or mechanically steerable to the predetermined target; a processor to control a directionality of the one or more antennas in communication with the predetermined target; and an edge processing module coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. A system to perform edge processing for a predetermined target, comprising: \none or more cellular transceivers with one or more antennas that are electrically or mechanically steerable to the predetermined target; \na processor to control communication with the predetermined target; and \none or more edge processing modules coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target; and \na container to house the transceiver, processor, and one or more edge processing modules, wherein the container is stackable laterally or on top of each other, and wherein the container conforms to a shipping standard. \n | 2. The system of claim 1, wherein the container is deployed without a construction permit.\n | 3. The system of claim 1, wherein the processor is coupled to fiber optics cable to communicate with a cloud-based radio access network (RAN) or a remote RAN.\n | 4. The system of claim 1, comprising an antenna mast, wherein the antenna mast is inside the container or external to the container.\n | 5. The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 6. 
The system of claim 1, wherein the edge processing module and the antenna comprise one unit.\n | 7. The system of claim 1, comprising a cryogenic cooling system to cool the container.\n | 8. The system of claim 1, wherein the cellular transceiver comprises a 5G or 6G transceiver.\n | 9. The system of claim 1, wherein the processor coordinates beam sweeping by the one or more antennas with radio nodes or user equipment (UE) devices based upon service level agreement, performance requirement, traffic distribution data, networking requirements or prior beam sweeping history.\n | 10. The system of claim 9, wherein the beam sweeping is directed at a group of autonomous vehicles, a group of virtual reality devices, or a group of devices having a service agreement with a cellular provider.\n | 11. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, or a data plane to optimize 5G or 6G parameters.\n | 12. The system of claim 1, comprising one or more cameras and sensors to capture security information.\n | 13. The system of claim 1, wherein the container includes edge sensors including LIDAR and RADAR.\n | 14. The system of claim 1, comprising a camera for individual identity identification.\n | 15. The system of claim 1, wherein the edge processing module streams data to the predetermined target to minimize loading the target.\n | 16. The system of claim 1, wherein the edge processing module shares workload with a core processing module located at a head-end and a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 17. The system of claim 1, comprising an edge learning machine in the housing to provide local edge processing for Internet-of-Things (IOT) sensors with reduced off-chip memory access.\n | 18. 
The system of claim 17, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 19. The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 20. The system of claim 1, comprising a cloud trained neural network whose network parameters are down-sampled and filter count reduced before transferring to the edge neural network.", "output": "The edge processing system has one or more cellular transceivers with one or more antennas (11) that are electrically or mechanically steerable to a predetermined target. A processor controls communication with the predetermined target. One or more edge processing modules coupled to the processor and one or more antennas to provide low-latency computation for the predetermined target. A container houses the transceiver, the processor and one or more edge processing modules. The container is stackable laterally or on top of each other and conforms to a shipping standard. Edge processing system for predetermined target. Enables improvements in network capacity and coverage. Improves esthetics of site and reduces wind load resulting in lower leasing and installation costs by integrating remote radio head functionality into antenna. Provides potentially unlimited ability to process components and tasks because processing of components and tasks may be divided up across network in intelligent and meaningful way that facilitates efficient use of resources and/or improved user experience. Ensures high quality of experience with highly contextualized service experience and efficient utilization of radio and network resources. Reduces size of neural networks for edge learning while maintaining accuracy to get high performance at low parameter count. 
The drawing shows the perspective view of a manhole cover with a small cell and steerable antennas. 4 Manhole cover body; 6 First side; 8 Second side; 11 Antennas; 22 Latches", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes a cellular transceiver to communicate with a predetermined target; one or more antennas coupled to the 5G transceiver each electrically or mechanically steerable to the predetermined target; a processor to control a directionality of the one or more antennas in communication with the predetermined target; and an edge processing module coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. A system to service mobile client devices with processing loads including machine learning processing load, extended reality processing load, or autonomous driving processing load, comprising: \na cellular transceiver to offload execution of a compute-intensive code object upon request from a client device with a processing load request; \none or more antennas coupled to the cellular transceiver each electrically or mechanically steerable to the client device; \na processor coupled to the one or more antennas in communication with the client device based on signals received from the client device, wherein the client device requests the processor to process at least a portion of a client processing load; and \nan edge processing module coupled to the processor and mechanically coupled to the one or more antennas to process the compute-intensive code object on behalf of the client device and to wirelessly provide low-latency access for the client device, \nwherein the edge processing module shares processing loads with a core processing module located at a 5G head-end and a cloud module located at a cloud data center such that processing loads requiring intermediate low latency access are transferred to the core processing module at the 5G 
head-end, while non-time sensitive large processing loads are offloaded to the cloud module at the cloud data center, and further wherein the edge processing module having a neural network with down sampled neural network parameters associated with processing loads from the client device. \n | 2. The system of claim 1, wherein the processor calibrates a radio link between a transceiver in a housing and the client device.\n | 3. The system of claim 1, wherein the processor is coupled to fiber optics cable to communicate with a cloud-based radio access network (RAN) or a remote RAN.\n | 4. The system of claim 1, wherein the processor calibrates a connection by analyzing RSSI (Received Signal Strength Indicator) and TSSI (Transmit Signal Strength Indicator) and moves the antennas until predetermined cellular parameters are reached.\n | 5. The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 6. The system of claim 1, wherein the edge processing module and the antenna comprise one unit.\n | 7. The system of claim 6, comprising a pole, a building, or a light.\n | 8. The system of claim 1, wherein the cellular transceiver comprises a 5G transceiver.\n | 9. The system of claim 1, wherein the processor coordinates beam sweeping by the one or more antennas with radio nodes or user equipment (UE) devices based upon service level agreement, performance requirement, traffic distribution data, networking requirements or prior beam sweeping history.\n | 10. The system of claim 9, wherein the beam sweeping is directed at a group of autonomous vehicles or a group of virtual reality devices.\n | 11. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, and a data plane to optimize 5G parameters.\n | 12. 
The system of claim 1, comprising one or more cameras and sensors in a housing to capture security information.\n | 13. The system of claim 1, comprising edge sensors including LIDAR (Light Detection And Ranging) and RADAR (Radio Detection And Ranging).\n | 14. The system of claim 1, comprising a camera for individual identity identification.\n | 15. The system of claim 1, wherein the edge processing module streams data to the client device to minimize loading the client device.\n | 16. The system of claim 1, comprising an edge learning machine in a housing to provide local edge processing for Internet-of-Things (IOT) sensors.\n | 17. The system of claim 16, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 18. The system of claim 1, a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 19. The system of claim 1, comprising a cloud trained neural network whose network parameters are filter count reduced before transferring to the edge neural network.", "output": "The system (1) has a cellular transceiver to offload execution of a compute-intensive code object. Antennas (11) are coupled to the cellular transceiver. A processor is coupled to the antennas in communication with the client device. An edge processing module is coupled to the processor and is mechanically coupled to the antennas. The edge processing module shares processing loads with a core processing module located at a 5G head-end. A cloud module is located at a cloud data center such that processing loads requiring intermediate low latency access are transferred to the core processing module at the 5G head-end, while non-time sensitive large processing loads are offloaded to the cloud module at the cloud data center. 
The edge processing module has a neural network with down sampled neural network parameters associated with processing loads from the client device. System for servicing mobile client devices with processing loads including machine learning processing load, extended reality processing load, or autonomous driving processing load. The receive and transmit digital beam former (DBF) coefficients are adjusted to help maintain an improved or maximum signal quality, reduce or minimize in-band interference and maximize receive power level. The active antenna system (AAS) provides site footprint reduction, distribution of radio functions within the antenna results in built-in redundancy and improved thermal performance, and distributed transceivers support a host of advanced electronic beam-tilt features that enable improvements in network capacity and coverage. The esthetics of the site is improved and wind load is reduced, resulting in lower leasing and installation costs, by integrating the remote radio head functionality into the antenna. The edge computing ensures high quality of experience with highly contextualized service experience and efficient utilization of radio and network resources. The local neural network performs late down-sampling and filter count reduction so as to get high performance at a low parameter count and reduces size of the neural networks for edge learning while maintaining accuracy. The drawing shows a schematic diagram of the city light small cell environment with crime/pollution sniffing capabilities. 1 System for servicing mobile client devices with processing loads; 10 Computing unit; 11 Antenna; 15 Road; 18 Database", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes one or more antennas; and a processor to control a directionality of the antennas in communication with a predetermined target using 5G protocols.What is claimed is: \n | 1. 
A method of communicating data with a user equipment (UE), comprising: \nreceiving a signal from the UE coupled to the one or more steerable antenna beams from one or more antennas at a communication station; \ndetermining a location direction of the UE using said signal; \ngenerating digital beam forming coefficients to transmit from one of said steerable antenna beams in said location direction of the UE; \ntransmitting data from said communication station to said UE within said one transmit steerable antenna beam; \ntracking said location direction of said UE as said communication station and said UE move relative to other UEs; \nadjusting said beam forming coefficients associated with one transmit steerable antenna beam in response to the tracking step to maintain said one transmit steerable antenna beam in the location direction of said UE; \nfurther adjusting said beam forming coefficients associated with one transmit steerable antenna beam to improve a signal quality of communication signal received at said communication station. \n | 2. The method of claim 1, comprising mounting one or more antennas on a pole, a tower, or a shipping container and directionally aiming the antennas from the pole, tower, or shipping container.\n | 3. The method of claim 1, comprising remapping the beams to avoid obstructions or issues that affect 5G/6G transmissions.\n | 4. The method of claim 1, comprising changing the beams according to load, usage, time of day, or other factors.\n | 5. The method of claim 1, comprising mounting one or more antennas on a pole, a tower, or a shipping container and manually calibrating a wireless connection by examining an RSSI or a TSSI and scanning antenna actuators or moveable lens until predetermined RSSI or TSSI level or cellular parameter is reached.\n | 6. The method of claim 1, comprising scanning and optimizing radio communication with a target client or device by moving actuators coupled to a surface.\n | 7. 
The method of claim 1, comprising using an array of actuators or antennas, wherein each antenna is independently steerable to optimize 5G transmission.\n | 8. The method of claim 1, comprising using a Fresnel lens to improve SNR.\n | 9. The method of claim 1, comprising focusing 5G signals to the target client or device with iterative changes in an orientation of the antenna by changing a housing location based on one or more predetermined criteria.\n | 10. The method of claim 1, comprising using an array antenna onboard a cell tower with a digital beam former (DBF), said array antenna having a plurality of actuators moving the RF radiating elements for providing steerable antenna beams within an antenna footprint region, said DBF providing for each radiating element, beam forming coefficients for controlling characteristics of said steerable antenna beams.\n | 11. The method of claim 1, comprising requesting a portion of a network for a group of devices, checking for available resources to satisfy the request and assigning a network slice deployment layout satisfying the requested portion of the network including antenna level layout, and managing resources at the antenna level as part of the requested portion of the network to provide communication for the group.\n | 12. The method of claim 1, comprising a high power active antenna array mounted on a cell tower, a balloon, or a drone; and a plurality of medium power active antenna arrays wirelessly coupled to the high power active antenna, wherein the medium power antenna array relays data transmission between the high power active antenna array and the UE to reduce RF exposure on biologics.\n | 13. 
The method of claim 1, comprising a high power active antenna array mounted on a cell tower, balloon, or a drone; and a plurality of medium power active antenna arrays wirelessly coupled to the high power active antenna, wherein the medium power antenna array relays data transmission between the high power active antenna array and the UE to reduce RF exposure on a person.\n | 14. A system, comprising: \na mobile housing including at least a pole, a tower, or a shipping container to receive a signal from one or more predetermined targets; \none or more antennas to focus on one or more predetermined targets by generating coefficients to transmit data from one or more steerable antenna beams in the location direction of the predetermined target; \none or more millimeter wave transceivers coupled to the one or more antennas; \na processor to control the one or more transceivers and one or more antennas in communication with the predetermined target using 5G protocols upon determining a location direction of the predetermined target; and \nan edge learning machine that uses pre-trained models and modifies the pre-trained models for a selected task. \n | 15. The system of claim 14, wherein the processor calibrates a radio link between a transceiver and a client device.\n | 16. The system of claim 14, wherein the processor is coupled to fiber optics cable to communicate with a cloud-based radio access network (RAN) or a remote RAN.\n | 17. The system of claim 14, comprising a camera for individual identity identification.\n | 18. The system of claim 14, wherein the processor analyzes walking gaits and facial features for identity identification.\n | 19. The system of claim 14, comprising an edge processor to provide local edge processing for Internet-of-Things (IOT) sensors.\n | 20. 
The system of claim 14, comprising network slice deployment layout descriptors corresponding to a network slice deployment layout with network slice life cycle management, configuration, performance management, monetary cost associated with the network slice deployment layout, or quality-of-service values associated with the network slice deployment layout.\n | 21. The system of claim 14, comprising a cellular device for a person crossing a street near the city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 22. The system of claim 14, comprising a neural network whose network parameters are reduced before transfer to an edge neural network.\n | 23. The system of claim 14, wherein the processor requests a portion of a network for a group of devices, checks for available resources to satisfy the request and assigns a network slice deployment layout satisfying the requested portion of the network including antenna level layout, and manages resources at the antenna level as part of the requested portion of the network to provide communication for the group.\n | 24. A communication method, comprising: \nproviding a beam antenna and a millimeter-wave transceiver coupled to the beam antenna secured to a shipping container or a cell tower; \nscanning the beam antenna in the direction of an autonomous vehicle; and \ncommunicating using the millimeter-wave transceiver and the beam antenna with the vehicle pursuant to a defined communication protocol by: \nreceiving at the steerable antenna a receive radio signal from the vehicle; \ndetermining beamforming coefficients to transmit a transmit radio signal to the vehicle; \ntransmitting data from said steerable antenna in the direction of the vehicle moving relative to the antenna; and \napplying pre-trained learning machine models and modifying the pre-trained models for a selected task requested by the vehicle. \n | 25. 
The method of claim 24, comprising requesting a portion of a network for a group of devices, checking for available resources to satisfy the request and assigning a network slice deployment layout satisfying the requested portion of the network including antenna level layout, and managing resources at the antenna level as part of the requested portion of the network to provide communication for the group.\n | 26. The method of claim 24, comprising a high power active antenna array mounted on a cell tower, balloon, or a drone; and a plurality of medium power active antenna arrays wirelessly coupled to the high power active antenna, wherein the medium power antenna array relays data transmission between the high power active antenna array and the UE to reduce RF exposure on a person.", "output": "The method involves receiving a signal from the UE coupled to the steerable antenna beams from antennas at a communication station. The location direction of the UE is determined using signal. The digital beam forming coefficients to transmit from one of steerable antenna beams are generated in location direction of the UE. The data is transmitted from communication station to UE within one transmit steerable antenna beam. The location direction of UE is tracked as communication station and UE move relative to other UEs. The beam forming coefficients associated with one transmit steerable antenna beam are adjusted in response to the tracking step to maintain one transmit steerable antenna beam in the location direction of UE. The beam forming coefficients associated with one transmit steerable antenna beam are adjusted to improve a signal quality of communication signal received at communication station. INDEPENDENT CLAIMS are included for the following: a system for communicating data with UE; and a communication method. Method for communicating data with user equipment (UE) e.g. mobile phone in cellular system. 
The receive and transmit digital beam former (DBF) coefficients are adjusted to help maintain an improved or maximum signal quality, to help reduce or minimize in-band interference and to help maximize receive power level. The aesthetics of the site can be improved and wind load reduced, resulting in lower leasing and installation costs. The neural network control of the multiple input multiple output (MIMO) system facilitates testing of MIMO base stations, reduces power consumption during MIMO communications, allows for flexibility in capacity, allows for flexibility in MIMO signal estimation, allows routing around defective processing elements or antennas, etc. The local neural network performs late down-sampling and filter count reduction to get high performance at a low parameter count, reducing the size of the neural networks for edge learning while maintaining accuracy. The drawing shows a schematic diagram of exemplary city light small cell environment with crime or pollution sniffing capabilities. 10 Computing unit; 11 Street device; 15 Road; 18 Database; 19 User interface", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes a housing with one or more edge processors to handle processing on behalf of a mobile target or to provide local data to the mobile target or to provide artificial intelligence for the mobile target; one or more antennas coupled to the housing; and a processor to control a directionality of the antennas in communication with the mobile target using 5G or 6G protocols.What is claimed is: \n | 1. 
A system, comprising: \na housing with one or more edge processors to handle processing on behalf of a mobile target or to provide local data to the mobile target or to provide artificial intelligence for the mobile target; \none or more antennas coupled to the housing; and \na processor to control a directionality of the antennas in communication with the mobile target using 5G or 6G protocols; and \na cloud trained neural network whose network parameters are reduced before transferring to the edge neural network. \n | 2. The system of claim 1, wherein the processor calibrates a radio link between a transceiver in the housing and a client device.\n | 3. The system of claim 1, wherein the processor processes images from one or more mobile target cameras for location identification, ridesharing pick-up, or delivery.\n | 4. The system of claim 1, wherein the one or more edge processors detect real-time hazards or road signs.\n | 5. The system of claim 1, wherein the processor moves actuators coupled to the antennas.\n | 6. The system of claim 1, wherein the one or more edge processors handle local data, weather or location data.\n | 7. The system of claim 1, wherein the one or more edge processors handle video content, healthcare, robotics, autonomous vehicle, augmented reality, virtual reality, extended reality, factory automation, gaming, asset tracking, or surveillance.\n | 8. The system of claim 1, wherein the mobile target comprises plant or manufacturing equipment.\n | 9. The system of claim 1, wherein the one or more edge processors provide traffic, transit, search, routing, telematics, weather, tracking, positioning, high definition map, or geo-enrichment data.\n | 10. The system of claim 1, wherein the processor focuses 5G signals to the target with iterative changes in electrical or mechanical orientation of the one or more antennas.\n | 11. The system of claim 1, wherein neural networks comprise parameters trained with remote processors.\n | 12. 
The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 13. The system of claim 1, wherein the one or more edge processors perform predictive analytics, consumer targeting, fraud detection, or demand forecast.\n | 14. The system of claim 1, comprising a camera and a processor for individual identity identification.\n | 15. The system of claim 1, wherein the one or more edge processors apply artificial intelligence to location data.\n | 16. The system of claim 1, comprising an edge learning machine in the housing to provide local edge processing for one or more Internet-of-Things (IOT) sensors.\n | 17. The system of claim 16, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 18. The system of claim 1, comprising a cellular device for a person crossing a street near the city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.", "output": "The system has a housing with one or more edge processors that handle processing on behalf of a mobile target, provide local data to the mobile target, or provide artificial intelligence for the mobile target. One or more antennas (11) are coupled to the housing and a processor controls a directionality of the antennas in communication with the mobile target using fifth generation (5G) or 6G protocols. A cloud-trained neural network has its network parameters reduced before transfer to the edge neural network. Cellular system. The beamforming architecture can control the resulting interference pattern, in order to realize a steerable main lobe that provides high beam gain in a particular direction. 
The neural network control of the multiple-input multiple-output (MIMO) system can facilitate testing of MIMO base stations, reduce power consumption during MIMO communication, allow for flexibility in capacity, allow for flexibility in MIMO signal estimation, allow routing around defective processing elements or antennas, etc. The receive and transmit digital beam former (DBF) coefficients are adjusted to help maintain an improved or maximum signal quality, to help reduce or minimize in-band interference and to help maximize receive power level. The drawing shows a schematic view of 5G network architecture. 11 Antenna 102 Digital beamformer 104 Base station 108 Computing device 128 Spoke", "instruction": "Please summarize the input" }, { "input": "Cellular systemA system includes a cellular transceiver to communicate with a predetermined target; one or more antennas coupled to the 5G transceiver each electrically or mechanically steerable to the predetermined target; a processor to control a directionality of the one or more antennas in communication with the predetermined target; and an edge processing module coupled to the processor and the one or more antennas to provide low-latency computation for the predetermined target.What is claimed is: \n | 1. 
A system, comprising: \na 5G cellular transceiver to communicate with a predetermined target; \none or more antennas coupled to the 5G cellular transceiver each electrically or mechanically steerable to the predetermined target; \na processor to control a directionality of the one or more antennas in communication with the predetermined target; and \nan edge processing module coupled to the processor and the one or more antennas to provide low-latency computation on a request or data generated by the predetermined target, wherein the edge processing module shares workload with a core processing module located at a head-end or a cloud module located at a cloud data center, each processing module having increased latency and each having a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD). \n | 2. The system of claim 1, wherein the processor calibrates a radio link between a transceiver in the housing and a client device.\n | 3. The system of claim 1, wherein the processor is coupled to fiber optics cable to communicate with a cloud-based radio access network (RAN) or a remote RAN.\n | 4. The system of claim 1, wherein the edge processing module comprises at least a processor, a graphical processing unit (GPU), a neural network, a statistical engine, or a programmable logic device (PLD).\n | 5. The system of claim 1, wherein the edge processing module and the antenna comprise one unit.\n | 6. The system of claim 5, wherein the unit comprises a pole, a building, or a light.\n | 7. The system of claim 1, comprising a neural network coupled to a control plane, a management plane, or a data plane to optimize 5G parameters.\n | 8. The system of claim 1, comprising one or more cameras and sensors in the housing to capture security information.\n | 9. The system of claim 1, comprising edge sensors including LIDAR and RADAR.\n | 10. 
The system of claim 1, comprising a camera for automated identity identification.\n | 11. The system of claim 1, wherein the edge processing module streams data processed at the head end or the cloud data center to the predetermined target to minimize a processing load for the target.\n | 12. The system of claim 1, comprising a wearable device to render data processed by the head end or the cloud data center and wirelessly sent to the wearable device to minimize processing at the wearable device.\n | 13. The system of claim 1, comprising an edge learning machine in a housing to provide local edge processing for Internet-of-Things (IOT) sensors.\n | 14. The system of claim 13, wherein the edge learning machine uses pre-trained models and modifies the pre-trained models for a selected task.\n | 15. The system of claim 1, comprising a cellular device for a person crossing a street near a city light or street light, the cellular device emitting a person to vehicle (P2V) or a vehicle to person (V2P) safety message.\n | 16. A system, comprising: \na 5G cellular transceiver to communicate with a predetermined target; \none or more antennas coupled to the 5G cellular transceiver each electrically or mechanically steerable to the predetermined target; \na processor to control a directionality of the one or more antennas in communication with the predetermined target; and \nan edge processing module coupled to the processor and the one or more antennas to provide low-latency computation on a request or data generated by the predetermined target, wherein the processor calibrates a connection by analyzing RSSI and TSSI and the one or more antennas are moved until predetermined cellular parameters are reached. \n | 17. 
A system, comprising: \na 5G cellular transceiver to communicate with a predetermined target; \none or more antennas coupled to the 5G cellular transceiver each electrically or mechanically steerable to the predetermined target; \na processor to control a directionality of the one or more antennas in communication with the predetermined target; and \nan edge processing module coupled to the processor and the one or more antennas to provide low-latency computation on a request or data generated by the predetermined target, wherein the edge processing module stores video content close to users. \n | 18. A system, comprising: \na 5G cellular transceiver to communicate with a predetermined target; \none or more antennas coupled to the 5G cellular transceiver each electrically or mechanically steerable to the predetermined target; \na processor to control a directionality of the one or more antennas in communication with the predetermined target; and \nan edge processing module coupled to the processor and the one or more antennas to provide low-latency computation on a request or data generated by the predetermined target, wherein the processor coordinates beam sweeping by the one or more antennas with radio nodes or user equipment (UE) devices based upon service level agreement, performance requirement, traffic distribution data, networking requirements or prior beam sweeping history. \n | 19. The system of claim 18, wherein the beam sweeping is directed at a group of autonomous vehicles or a group of virtual reality devices.\n | 20. 
A system, comprising: \na 5G cellular transceiver to communicate with a predetermined target; \none or more antennas coupled to the 5G cellular transceiver; \na processor to control one or more antennas in communication with the predetermined target; \nan edge processing module coupled to the processor and the one or more antennas to provide low-latency computation based on data generated by or a request from the predetermined target; and \na cloud trained edge neural network whose network parameters are down-sampled and filter count reduced before transferring to the edge neural network.", "output": "The system (1) has a fifth generation (5G) cellular transceiver that is provided to communicate with a predetermined target. Antennas are coupled to the 5G cellular transceiver and electrically or mechanically steerable to the predetermined target. A processor is provided to control a directionality of the antennas in communication with the predetermined target. An edge processing module is coupled to the processor and the antennas to provide a low-latency computation on a request or data generated by the predetermined target. The edge processing module shares a workload with a core processing module located at a head-end or a cloud module located at a cloud data center. Each processing module has increased latency and includes a processor, a GPU, a neural network, a statistical engine or a PLD. Cellular system for optimizing data flow in 5G network. The receive and transmit digital beam former (DBF) coefficients are adjusted to maintain an improved or maximum signal quality, to reduce or minimize in-band interference and to maximize receive power level. 
The benefits of AAS include site footprint reduction, built-in redundancy and improved thermal performance from distributing radio functions within the antenna, and distributed transceivers that can support a host of advanced electronic beam-tilt features enabling improvements in network capacity and coverage. The aesthetics of the site are improved and the wind load is reduced by integrating the remote radio head functionality into the antenna, thus resulting in lower leasing and installation costs. The tasks are divided up across the network in an intelligent and meaningful way that facilitates the efficient use of resources and/or an improved user experience. The local neural network performs late down-sampling and filter count reduction to get high performance at a low parameter count, reducing the size of the neural networks for edge learning while maintaining accuracy. The drawing shows a schematic view of a city light small cell environment with crime/pollution sniffing capabilities. 1 Cellular system 10 Computing unit 11 Street device 15 Road 18 Database", "instruction": "Please summarize the input" }, { "input": "Systems and methods for selecting among different driving modes for autonomous driving of a vehicleSystems and methods for selecting among different driving modes for autonomous driving of a vehicle may: generate output signals; determine the vehicle proximity information that indicates whether one or more vehicles are within the particular proximity of the vehicle; determine the internal passenger presence information that indicates whether one or more passengers are present in the vehicle; select a first driving mode or a second driving mode based on one or more determinations; and control the vehicle autonomously in accordance with the selection of either the first driving mode or the second driving mode.What is claimed is: \n | 1. 
A vehicle configured to select among different driving modes for autonomous driving of the vehicle, the vehicle comprising: \na set of sensors sensing vehicle proximity information indicating whether one or more vehicles are within a threshold distance from the vehicle; \none or more hardware processors configured by machine-readable instructions to: \nselect a first driving mode, wherein selecting the first driving mode is responsive to the vehicle proximity information indicating no vehicles are within the threshold distance from the vehicle; \nselect a second driving mode responsive to the vehicle proximity information indicating one or more vehicles are within the threshold distance from the vehicle, wherein the first driving mode is at least one of more energy-efficient than the second driving mode and results in shortened driving time compared to that resulting from selection of the second driving mode; and \ncontrol the vehicle autonomously in accordance with selection of either the first driving mode or the second driving mode. \n | 2. 
The vehicle of claim 1, wherein the set of sensors is further configured to generate output signals conveying internal passenger presence information, wherein the internal passenger presence information indicates whether one or more passengers are present in the vehicle, and wherein the one or more hardware processors are further configured by machine-readable instructions to: \ndetermine the internal passenger presence information that indicates whether one or more passengers are present in the vehicle, wherein determination of the internal passenger presence information is based on the output signals; \nwherein selecting the first driving mode is further responsive to the internal passenger presence information indicating no passengers are present in the vehicle; \nwherein selecting the second driving mode is responsive to the internal passenger presence information indicating one or more passengers are present in the vehicle. \n | 3. The vehicle of claim 1, wherein the one or more hardware processors are further configured by machine-readable instructions to: \nobtain external passenger presence information that indicates whether any passengers are present in one or more vehicles within the threshold distance from the vehicle, responsive to the vehicle proximity information indicating the one or more vehicles are within the threshold distance from the vehicle; \nselect the first driving mode, wherein selecting the first driving mode is responsive to the external passenger presence information indicating no passengers are present in the one or more vehicles within the threshold distance from the vehicle; and \nselect the second driving mode, wherein selecting the second driving mode is responsive to the external passenger presence information indicating passengers are present in the one or more vehicles within the threshold distance from the vehicle. \n | 4. 
The vehicle of claim 1, wherein the one or more hardware processors are further configured by machine-readable instructions to: \nobtain external control information that indicates whether one or more vehicles within the particular proximity of the vehicle are currently under autonomous control, responsive to the vehicle proximity information indicating the one or more vehicles are within the threshold distance from the vehicle; \nselect the first driving mode, wherein selecting the first driving mode is responsive to the external control information indicating the one or more vehicles within the threshold distance from the vehicle are currently under autonomous control; and \nselect the second driving mode, wherein selecting the second driving mode responsive to the external control information indicating at least one of the one or more vehicles within the threshold distance from the vehicle are currently not under autonomous control. \n | 5. The vehicle of claim 1, wherein the first driving mode allows a higher level of acceleration than a maximum level of acceleration allowed while the vehicle is operating in the second driving mode.\n | 6. The vehicle of claim 1, wherein the first driving mode allows a higher level of deceleration than a maximum level of deceleration allowed while the vehicle is operating in the second driving mode.\n | 7. The vehicle of claim 3, wherein the first driving mode allows a smaller following distance to another vehicle than a minimum following distance allowed while the vehicle is operating in the second driving mode.\n | 8. The vehicle of claim 3, wherein obtaining the external passenger presence information is accomplished through vehicle-to-vehicle communication.\n | 9. The vehicle of claim 3, wherein the output signals further convey visual information regarding an exterior of the vehicle, and wherein obtaining the external passenger presence information is accomplished through analysis of the visual information.\n | 10. 
The vehicle of claim 1, wherein the one or more physical computer processors are further configured by computer-readable instructions to: \nfacilitate user input from one or more passengers, wherein the user input represents a request for selection of the first driving mode; \nselect the first driving mode, wherein selection is based on the user input. \n | 11. A method for selecting among different driving modes for autonomous driving of a vehicle, the method comprising: \ngenerating output signals conveying vehicle proximity information indicating whether one or more vehicles are within a threshold distance from the vehicle; \nselecting a first driving mode responsive to the vehicle proximity information indicating no vehicles are within the threshold distance from the vehicle; \nselecting a second driving mode responsive to the vehicle proximity information indicating one or more vehicles are within the threshold distance from the vehicle, wherein the first driving mode is at least one of more energy-efficient than the second driving mode and results in shortened driving time compared to that resulting from selection of the second driving mode; and \ncontrolling the vehicle autonomously in accordance with selecting either the first driving mode or the second driving mode. \n | 12. 
The method of claim 11, wherein the output signals further convey internal passenger presence information, wherein the internal passenger presence information indicates whether one or more passengers are present in the vehicle, the method further comprising: \ndetermining the internal passenger presence information that indicates whether one or more passengers are present in the vehicle, wherein determination of the internal passenger presence information is based on the output signals; \nwherein selecting the first driving mode is further responsive to the internal passenger presence information indicating no passengers are present in the vehicle; and \nwherein selecting the second driving mode is responsive to the internal passenger presence information indicating one or more passengers are present in the vehicle. \n | 13. The method of claim 11, further comprising: \nobtaining external passenger presence information that indicates whether any passengers are present in one or more vehicles within the particular proximity of the vehicle, responsive to the vehicle proximity information indicating the one or more vehicles are within the threshold distance from the vehicle; \nselecting the first driving mode, wherein selecting the first driving mode is responsive to the external passenger presence information indicating no passengers are present in the one or more vehicles within the threshold distance from the vehicle; and \nselecting the second driving mode, wherein selecting the second driving mode is responsive to the external passenger presence information indicating passengers are present in the one or more vehicles within the threshold distance from the vehicle. \n | 14. 
The method of claim 11, further comprising: \nobtaining external control information that indicates whether one or more vehicles within the threshold distance from the vehicle are currently under autonomous control, responsive to the vehicle proximity information indicating the one or more vehicles are within the threshold distance from the vehicle; \nselecting the first driving mode, wherein selecting the first driving mode is responsive to the external control information indicating the one or more vehicles within the threshold distance from the vehicle are currently under autonomous control; and \nselecting the second driving mode, wherein selecting the second driving mode responsive to the external control information indicating at least one of the one or more vehicles within the particular proximity of the vehicle are currently not under autonomous control. \n | 15. The method of claim 11, wherein the first driving mode allows a higher level of acceleration than a maximum level of acceleration allowed while the vehicle is operating in the second driving mode.\n | 16. The method of claim 11, wherein the first driving mode allows a higher level of deceleration than a maximum level of deceleration allowed while the vehicle is operating in the second driving mode.\n | 17. The method of claim 13, wherein the first driving mode allows a smaller following distance to another vehicle than a minimum following distance allowed while the vehicle is operating in the second driving mode.\n | 18. The method of claim 13, wherein obtaining the external passenger presence information is accomplished through vehicle-to-vehicle communication.\n | 19. The method of claim 13, wherein the output signals further convey visual information regarding an exterior of the vehicle, and wherein obtaining the external passenger presence information is accomplished through analysis of the visual information.\n | 20. 
The method of claim 11, further comprising: \nfacilitating user input from one or more passengers; \nselecting the first driving mode, wherein selection is based on the user input.", "output": "The vehicle has sensors to generate output signals, and hardware processors configured by machine-readable instructions to perform operations. The operations include determining (204) the vehicle proximity information that indicates whether vehicles are within the particular proximity of the vehicle, selecting (208) a first driving mode, selecting (210) a second driving mode, and controlling (212) the vehicle autonomously in accordance with selecting either the first driving mode or the second driving mode. An INDEPENDENT CLAIM is included for a method for selecting different driving modes for autonomous driving of vehicle. Vehicle with different driving modes. Enables controlling the vehicle autonomously based on selection of either the first driving mode or the second driving mode. The drawing shows the flowchart of the method for selecting different driving modes for autonomous driving of vehicle. 202 Generate output signals 204 Determine vehicle proximity information 206 Determine internal passenger presence information 208 Select first driving mode 210 Select second driving mode 212 Control vehicle autonomously", "instruction": "Please summarize the input" }, { "input": "VEHICLE-TO-HUMAN COMMUNICATION IN AN AUTONOMOUS VEHICLE OPERATIONA device and method for autonomous vehicle-to-human communications are disclosed. Upon detecting a human traffic participant being proximal to a traffic yield condition of a vehicle planned route, generating a message for broadcast to the human traffic participant and sensing whether the human traffic participant acknowledges a receipt of the message. When sensing that the human traffic participant acknowledges receipt of the message, generating a vehicle acknowledgment message for broadcast to the pedestrian.|1. 
A method for vehicle-to-human communication operable for autonomous vehicle operation, the method comprising: \ndetecting a human traffic participant being proximal to a traffic yield condition of a vehicle planned route; \ngenerating an action message indicating a vehicle intent for broadcast to the human traffic participant; \nsensing, by a vehicle sensor device, whether the human traffic participant acknowledges receipt of the action message by at least one of: \ndetermining, by at least one processor, human traffic participant attention data indicating that the human traffic participant visually observes the action message; and \ndetermining, by the at least one processor, human traffic participant movement data to be responsive to the action message; and \nwhen the human traffic participant acknowledges the receipt of the action message, generating a vehicle acknowledgment message for broadcast to the human traffic participant. \n | 2. The method of claim 1, further comprising: \nwhen either of the human traffic participant attention data and the human traffic participant movement data are contrary to the action message, generating a counter-action message responsive to the either of the human traffic participant attention data and the human traffic participant movement data. \n | 3. The method of claim 1, wherein the generating the action message comprises at least one of: \na graphic-based message for display by an external display of the vehicle; and \nan audible message for playback from the vehicle. \n | 4. The method of claim 1, wherein the human traffic participant attention data comprises at least one of: \na human traffic participant gaze directed towards a direction of the vehicle; \na human traffic participant gesture directed towards the direction of the vehicle; and \na facial recognition indicating the human traffic participant is facing the direction of the vehicle. \n | 5. 
The method of claim 1, wherein the human traffic participant movement data comprises at least one of: \nthe action message conveying a human traffic participant velocity vector component slowing to a pedestrian travel rate that operates to avoid interception of the vehicle planned route; and \nmodifying a directional component to one that operates to avoid intercepting the vehicle planned route. \n | 6. The method of claim 1, wherein the vehicle acknowledgement message comprises at least one of: \na graphic user interface acknowledgment message for display via the external display; and \nan audible acknowledgment message for directional announcement by a speaker of the vehicle. \n | 7. The method of claim 1, wherein the traffic yield condition may be defined by at least one of: \nRoute Network Description File (RNDF) data; \ntraffic yield condition data; and \nobject recognition data relating to the traffic yield condition for the vehicle planned route. \n | 8. A method in a vehicle control unit for autonomous operation of a vehicle, the method comprising: \ndetecting a human traffic participant being proximal to a traffic yield condition of a vehicle planned route; \ngenerating an action message indicating a vehicle intent for broadcast to the human traffic participant; \nsensing, by a vehicle sensor device, whether the human traffic participant acknowledges receipt of the action message by: \ndetermining, by at least one processor, human traffic participant attention data indicating that the human traffic participant visually observes the action message; and \ndetermining, by the at least one processor, human traffic participant movement data to be responsive to the action message; and \nwhen the human traffic participant acknowledges the receipt of the action message, generating a vehicle acknowledgment message for broadcast to the human traffic participant. \n | 9. 
The method of claim 8, further comprising: \nwhen either of the human traffic participant attention data and the human traffic participant movement data are contrary to the action message, generating a counter-action message responsive to the either of the human traffic participant attention data and the human traffic participant movement data. \n | 10. The method of claim 8, wherein the generating the action message comprises at least one of: \na graphic-based message for display by an external display of the vehicle; and \nan audible message for playback from the vehicle. \n | 11. The method of claim 8, wherein the human traffic participant attention data comprises at least one of: \na human traffic participant gaze directed towards a direction of the vehicle; \na human traffic participant gesture directed towards the direction of the vehicle; and \na facial recognition indicating the human traffic participant is facing the direction of the vehicle. \n | 12. The method of claim 8, wherein the human traffic participant movement data comprises at least one of: \nthe action message conveying a human traffic participant velocity vector component slowing to a pedestrian travel rate that operates to avoid interception of the vehicle planned route; and \nmodifying a directional component to one that operates to avoid intercepting the vehicle planned route. \n | 13. The method of claim 8, wherein the vehicle acknowledgement message comprises at least one of: \na graphic user interface acknowledgment message for display via the external display; and \nan audible acknowledgment message for directional announcement by a speaker of the vehicle. \n | 14. The method of claim 8, wherein the traffic yield condition may be defined by at least one of: \nRoute Network Description File (RNDF) data; \ntraffic yield condition data received from a vehicle-to-infrastructure device; and \nobject recognition data prompting the traffic yield condition for the vehicle planned route. 
\n | 15. A vehicle control unit for providing vehicle-to-human communications in an autonomous vehicle operation, the vehicle control unit comprising: \na processor; \nmemory communicably coupled to the processor and to a plurality of vehicle sensor devices, the memory storing: \na vehicular operations module including instructions that when executed cause the processor to generate vehicle location data including a traffic yield condition from vehicle planned route data and sensor data; \na traffic yield condition module including instructions that when executed cause the processor to: \nreceive the vehicle location data and human traffic participant data, based on at least some of the plurality of vehicle sensor devices, to detect a human traffic participant being proximal to the traffic yield condition; \nwhen the human traffic participant is proximal to the traffic yield condition, generate message data indicating a vehicle intent for delivery to the human traffic participant; and \nan acknowledgment confirmation module including instructions that when executed cause the processor to sense, based on the at least some of the plurality of vehicle sensor devices, whether the human traffic participant acknowledges a receipt of the message by at least one of: \ndetermining human traffic participant attention data indicating whether the human traffic participant comprehends the message; and \ndetermining human traffic participant conduct data responsive to the message; \nwherein the acknowledgement confirmation module includes further instructions to, upon sensing that the human traffic participant acknowledges the receipt of the message, generate a vehicle acknowledgment message for delivery to the human traffic participant. \n | 16. The vehicle control unit of claim 15, wherein the message comprises at least one of: \na graphic-based message for an external display of the vehicle; and \nan audible message for announcement by the vehicle. \n | 17. 
The vehicle control unit of claim 15, wherein the acknowledgment message comprises at least one of: \na graphic-based acknowledgment message for display by an external display of the vehicle; and \nan audible acknowledgment message for announcement by an audio system of the vehicle. \n | 18. The vehicle control unit of claim 15, wherein the human traffic participant attention data comprises at least one of: \na human traffic participant gaze directed towards a direction of the vehicle; \na human traffic participant gesture directed towards the direction of the vehicle; and \na facial recognition indicating that the human traffic participant faces a direction of the vehicle. \n | 19. The vehicle control unit of claim 15, wherein the human traffic participant conduct comprises at least one of: \na human traffic participant velocity vector component slowing to a pedestrian travel rate that yields to the vehicle planned route; and \nmodifying a human traffic participant vector directional component to one that operates to avoid the vehicle planned route. \n | 20. The vehicle control unit of claim 15, wherein the traffic yield condition may be defined by at least one of: \nRoute Network Description File (RNDF) data; \ntraffic yield condition data; and \nobject recognition data relating to the traffic yield condition for the vehicle planned route.", "output": "The method (600) involves generating action message indicating vehicle intent for broadcast to a human traffic participant (606). A determination is made (608) to check whether the human traffic participant acknowledges receipt of the action message by a vehicle sensor device by determining human traffic participant attention data indicating that the human traffic participant visually observes the action message by a processor, and by determining human traffic participant movement data to be responsive to the action message. 
Vehicle acknowledgment message for broadcast to the human traffic participant is generated (616) when the human traffic participant acknowledges the receipt of the action message. An INDEPENDENT CLAIM is also included for a vehicle control unit for providing vehicle-to-human communications for facilitating autonomous operation of a vehicle. Method for providing vehicle-to-human communication for facilitating autonomous operation of a vehicle, i.e. a car (from drawings). Can also be used for utility vehicles such as lorries and construction vehicles. The method enables vehicle sensor devices to be communicatively coupled to a number of servers by a network cloud, so that a vehicle planned route can be dynamically adjusted based on driving conditions. The method enables operating a human traffic participant detection module to track the human traffic participant for a sufficient time duration to determine the movement speed and movement direction of the detected human traffic participant in an accurate manner. The drawing shows a flowchart illustrating a method for providing vehicle-to-human communication for facilitating autonomous operation of a vehicle. 600 Method for providing vehicle-to-human communication for facilitating autonomous operation of vehicle; 606 Step for generating action message indicating vehicle intent for broadcast to human traffic participant; 608 Step for determining whether human traffic participant acknowledges receipt of action message by vehicle sensor device; 616 Step for generating vehicle acknowledgment message for broadcast to human traffic participant when human traffic participant acknowledges receipt of action message; 618 Step for prompting alternate vehicle action", "instruction": "Please summarize the input" }, { "input": "Systems and methods for controlling the operation of an autonomous vehicle using multiple traffic light detectorsSystems and methods for controlling the operation of an autonomous vehicle are disclosed herein.
One embodiment performs traffic light detection at an intersection using a sensor-based traffic light detector to produce a sensor-based detection output, the sensor-based detection output having an associated first confidence level; performs traffic light detection at the intersection using a vehicle-to-infrastructure-based (V2I-based) traffic light detector to produce a V2I-based detection output, the V2I-based detection output having an associated second confidence level; performs one of (1) selecting as a final traffic-light-detection output whichever of the sensor-based detection output and the V2I-based detection output has a higher associated confidence level and (2) generating the final traffic-light-detection output by fusing the sensor-based detection output and the V2I-based detection output using a first learning-based classifier; and controls the operation of the autonomous vehicle based, at least in part, on the final traffic-light-detection output.What is claimed is: \n | 1. 
A system for controlling operation of an autonomous vehicle, the system comprising: \none or more processors; and \na memory communicably coupled to the one or more processors and storing: \na sensor-based traffic light detection module including instructions that when executed by the one or more processors cause the one or more processors to perform sensor-based traffic light detection at an intersection to produce a sensor-based detection output, the sensor-based detection output having an associated first confidence level; \na vehicle-to-infrastructure-based (V2I-based) traffic light detection module including instructions that when executed by the one or more processors cause the one or more processors to perform V2I-based traffic light detection at the intersection to produce a V2I-based detection output, the V2I-based detection output having an associated second confidence level; \na fusion module including instructions that when executed by the one or more processors cause the one or more processors to select as a final traffic-light-detection output whichever of the sensor-based detection output and the V2I-based detection output has a higher associated confidence level subject to an override based on consideration of a risk associated with a transition through the intersection that the autonomous vehicle plans to execute; and \na control module including instructions that when executed by the one or more processors cause the one or more processors to control the operation of the autonomous vehicle based, at least in part, on the final traffic-light-detection output. \n | 2. The system of claim 1, wherein the sensor-based traffic light detection module includes instructions to produce the sensor-based detection output by analyzing image data associated with the intersection.\n | 3. 
The system of claim 1, wherein the V2I-based traffic light detection module includes instructions to compute the associated second confidence level, at least in part, by comparing V2I signals received from an information system of the intersection with environmental sensor data associated with the intersection using a learning-based classifier.\n | 4. The system of claim 1, wherein the V2I-based traffic light detection module includes instructions to compute the associated second confidence level, at least in part, by processing past sensor-based detection output data and past V2I-based detection output data using a learning-based classifier.\n | 5. The system of claim 1, wherein the instructions in the control module to control the operation of the autonomous vehicle based, at least in part, on the final traffic-light-detection output include instructions to control one or more of steering, acceleration, and braking.\n | 6. The system of claim 1, wherein the final traffic-light-detection output includes one or more transitions, the one or more transitions corresponding to different possible paths through the intersection, and an estimated traffic light state for each of the one or more transitions.\n | 7. The system of claim 6, wherein the final traffic-light-detection output further includes state-timing information for at least one traffic light at the intersection and an overall confidence level for the final traffic-light-detection output.\n | 8. 
A non-transitory computer-readable medium for controlling operation of an autonomous vehicle and storing instructions that when executed by one or more processors cause the one or more processors to: \nperform sensor-based traffic light detection at an intersection to produce a sensor-based detection output, the sensor-based detection output having an associated first confidence level; \nperform vehicle-to-infrastructure-based (V2I-based) traffic light detection at the intersection to produce a V2I-based detection output, the V2I-based detection output having an associated second confidence level; \nselect as a final traffic-light-detection output whichever of the sensor-based detection output and the V2I-based detection output has a higher associated confidence level subject to an override based on consideration of a risk associated with a transition through the intersection that the autonomous vehicle plans to execute; and \ncontrol the operation of the autonomous vehicle based, at least in part, on the final traffic-light-detection output. \n | 9. The non-transitory computer-readable medium of claim 8, wherein the instructions include instructions to compute the associated second confidence level, at least in part, by comparing V2I signals received from an information system of the intersection with environmental sensor data associated with the intersection using a learning-based classifier.\n | 10. The non-transitory computer-readable medium of claim 8, wherein the instructions include instructions to compute the associated second confidence level, at least in part, by processing past sensor-based detection output data and past V2I-based detection output data using a learning-based classifier.\n | 11. 
The non-transitory computer-readable medium of claim 8, wherein the final traffic-light-detection output includes one or more transitions, the one or more transitions corresponding to different possible paths through the intersection, and an estimated traffic light state for each of the one or more transitions.\n | 12. The non-transitory computer-readable medium of claim 11, wherein the final traffic-light-detection output further includes state-timing information for at least one traffic light at the intersection and an overall confidence level for the final traffic-light-detection output.\n | 13. A method of controlling operation of an autonomous vehicle, the method comprising: \nperforming traffic light detection at an intersection using a sensor-based traffic light detector to produce a sensor-based detection output, the sensor-based detection output having an associated first confidence level; \nperforming traffic light detection at the intersection using a vehicle-to-infrastructure-based (V2I-based) traffic light detector to produce a V2I-based detection output, the V2I-based detection output having an associated second confidence level; \nselecting as a final traffic-light-detection output whichever of the sensor-based detection output and the V2I-based detection output has a higher associated confidence level subject to an override based on consideration of a risk associated with a transition through the intersection that the autonomous vehicle plans to execute; and \ncontrolling the operation of the autonomous vehicle based, at least in part, on the final traffic-light-detection output. \n | 14. The method of claim 13, wherein the associated second confidence level is computed, at least in part, by comparing V2I signals received from an information system of the intersection with environmental sensor data associated with the intersection using a learning-based classifier.\n | 15. 
The method of claim 13, wherein the associated second confidence level is computed, at least in part, by processing past sensor-based detection output data and past V2I-based detection output data using a learning-based classifier.\n | 16. The method of claim 13, wherein the final traffic-light-detection output includes one or more transitions, the one or more transitions corresponding to different possible paths through the intersection, and an estimated traffic light state for each of the one or more transitions.\n | 17. The method of claim 16, wherein the final traffic-light-detection output further includes timing information for at least one traffic light at the intersection and an overall confidence level for the final traffic-light-detection output.", "output": "The system (170) has a fusion module (325) having instructions that when executed by processors (110) cause the processors to perform one of selecting as a final traffic-light-detection output whichever of a sensor-based detection output and a vehicle-to-infrastructure-based (V2I-based) detection output has a higher associated confidence level. The fusion module generates the final detection output by fusing the detection outputs using a learning-based classifier. A control module (330) consists of instructions that cause the processor to control an operation of an autonomous vehicle based on the detection output. INDEPENDENT CLAIMS are included for: 1) a non-transitory computer-readable medium for controlling operation of an autonomous vehicle; and 2) a method for controlling operation of an autonomous vehicle. System for controlling operation of autonomous vehicle using multiple traffic light detectors. The autonomous vehicle is able to detect traffic lights and their current states to decide whether to stop or proceed or whether it is permissible to turn at a particular time and in a particular direction.
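The confidence-based selection with a risk-based override recited in claims 1, 8, and 13 can be sketched minimally as follows; this is an editor's illustration under assumed state names, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    state: str        # estimated light state, e.g. "red" or "green"
    confidence: float # associated confidence level, 0.0-1.0

def select_final_output(sensor: Detection, v2i: Detection,
                        risky_transition: bool = False) -> Detection:
    """Select whichever detection output has the higher associated
    confidence, subject to an override: for a risky planned transition
    through the intersection, defer to a "red" report from either
    detector regardless of confidence."""
    if risky_transition and "red" in (sensor.state, v2i.state):
        return Detection("red", min(sensor.confidence, v2i.confidence))
    return sensor if sensor.confidence >= v2i.confidence else v2i
```

The alternative branch in the claims — fusing the two outputs with a learning-based classifier — would replace this rule with a trained model over both detections.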
The vehicle-to-infrastructure-based (V2I-based) traffic light detection module can be used to improve the accuracy of the final traffic-light-detection output. The fusion module can improve the performance of the autonomous vehicle by fusing the sensor-based detection output and the V2I-based detection output using a first learning-based classifier. The drawing shows a block diagram of the traffic light detection system. 110 Processor; 170 Traffic light detection system; 310 Memory; 325 Fusion module; 330 Control module", "instruction": "Please summarize the input" }, { "input": "SYSTEMS AND METHODS FOR OPERATING A VEHICLE ON A ROADWAYA host vehicle can detect two or more markers positioned on respective locations of a surrounding vehicle. The two or more markers can include data corresponding to the respective locations of each of the two or more markers. The host vehicle can determine the location of the two or more markers based on the data. Based on the location of the two or more markers, the host vehicle can determine the orientation of the surrounding vehicle. The host vehicle can determine a path to follow based on the determined orientation of the surrounding vehicle. The host vehicle can then follow the determined path.What is claimed is: \n | 1.
A system for operating a host vehicle on a roadway, the system comprising: \na camera positioned to capture an image of an external environment of the host vehicle; \none or more processors communicably coupled to the camera; \na memory communicably coupled to the one or more processors and storing: \na marker location determination module including instructions that, when executed by the one or more processors, cause the one or more processors to detect, using the image captured by the camera, a first marker and at least a second marker positioned on one or more surfaces of a vehicle surrounding the host vehicle, and determine at least a location of the first and second markers on the surrounding vehicle based on data encoded in each of the markers; \nan orientation determination module including instructions that, when executed by the one or more processors, cause the one or more processors to determine an orientation of the surrounding vehicle based on the location of the first and second markers; and \nan autonomous driving module including instructions that, when executed by the one or more processors, cause the one or more processors to determine a path for the host vehicle based on the orientation of the surrounding vehicle, and to control the host vehicle to follow the determined path by sending one or more control signals to one or more vehicle components to cause the host vehicle to travel along the determined path. \n | 2. The system of claim 1, wherein the first and second markers are non-visible markers, and wherein the camera is a hyperspectral camera configured to detect the non-visible first and second markers.\n | 3.
The system of claim 1, further comprising: \na communications system configured to receive data generated by the surrounding vehicle that is associated with at least one of the first and second markers; and wherein the memory further stores \na feature determination module including instructions that, when executed by the one or more processors, cause the one or more processors to receive, via the communications system, the data generated by the surrounding vehicle, the data indicating a feature related to an area of the surrounding vehicle located proximate at least one of the markers, and determine, based on the retrieved data, the feature corresponding to the area of the surrounding vehicle located proximate to the at least one marker. \n | 4. The system of claim 3, wherein the autonomous driving module further includes instructions that cause the one or more processors to monitor a likelihood of a collision with the surrounding vehicle based on the orientation of the surrounding vehicle with respect to the determined path, and, responsive to determining a likely collision with the surrounding vehicle, determine a collision path for the host vehicle based on the orientation of the surrounding vehicle determined by the orientation determination module and the feature for the surrounding vehicle determined by the feature determination module.\n | 5. 
A method of operating a host vehicle according to an orientation of a surrounding vehicle, the method comprising: \ndetecting, via a camera on the host vehicle, a first marker positioned on a surface of a surrounding vehicle, the first marker including data corresponding to the first marker's location on the surrounding vehicle; \ndetermining the location of the first marker on the surrounding vehicle based on the data; \ndetecting, via the camera, a second marker positioned on another surface of the surrounding vehicle, the second marker including data corresponding to the second marker's location on the surrounding vehicle; \ndetermining the location of the second marker on the surrounding vehicle based on the data; \ndetermining an orientation of the surrounding vehicle based on the locations of the first and second markers; \ndetermining a path for the host vehicle based on the orientation of the surrounding vehicle; and \ncausing the host vehicle to follow the determined path. \n | 6. The method of claim 5 further comprising: \nreceiving data associated with the first marker from the surrounding vehicle, the data representative of one or more features related to an area of the surrounding vehicle proximate the location of the first marker; and \ndetermining a likelihood of collision with the surrounding vehicle. \n | 7. The method of claim 6, wherein the received data is representative of one or more features including a fuel level of the surrounding vehicle, and wherein the determined path is a collision path based at least in part on the fuel level of the surrounding vehicle.\n | 8.
The method of claim 6, wherein the received data is representative of one or more features including an occupant presence in the area of the surrounding vehicle proximate the location of the first marker, and wherein the determined path is a collision path based at least in part on the occupant presence in the area of the surrounding vehicle proximate the location of the first marker.\n | 9. The method of claim 8, wherein the occupant presence includes a vulnerability assessment for any occupants present in the area of the surrounding vehicle proximate the location of the first marker, and wherein the collision path is further based at least in part on the vulnerability assessment.\n | 10. The method of claim 6, wherein the received data is representative of one or more features including a hazardous cargo assessment for the area of the surrounding vehicle proximate the location of the first marker, the hazardous cargo assessment being based on any cargo present in the area of the surrounding vehicle proximate the location of the first marker, and wherein the determined path is a collision path based at least in part on the hazardous cargo assessment for the area of the surrounding vehicle proximate the location of the first marker.\n | 11. The method of claim 5, further comprising: \ndetermining a location of the surrounding vehicle relative to the host vehicle, and wherein determining the path for the host vehicle is further based on the determined location of the surrounding vehicle. \n | 12.
A system for providing information from a first vehicle to at least a second vehicle related to one or more features of the first vehicle, the system comprising: \na marker positioned on an external surface of the first vehicle, the marker including data corresponding to the marker's location on the first vehicle; \na communications system; \none or more processors; \na memory communicably coupled to the one or more processors and storing: \na feature detection module including instructions that, when executed by the one or more processors, cause the one or more processors to detect one or more features corresponding to an area of the first vehicle proximate the location of the marker; and \na data generation module including instructions that, when executed by the one or more processors, cause the one or more processors to generate data for the marker representing the determined one or more features, and to transmit, via the communications system, the data for the marker corresponding to the detected one or more features, the data being accessible by the second vehicle. \n | 13. The system of claim 12, wherein the marker positioned on the external surface of the first vehicle is a non-visible marker.\n | 14. The system of claim 12, wherein the communications system is a vehicle-to-vehicle communications system, and wherein the instructions included in the data generation module cause the one or more processors to transmit, via the vehicle-to-vehicle communications system, the data for the marker corresponding to the detected one or more features to the second vehicle.\n | 15.
The system of claim 12, wherein the communication system is in communication with a database accessible by the second vehicle, and wherein the instructions included in the data generation module cause the one or more processors to transmit, via the communication system, the data for the marker corresponding to the detected one or more features to the database, the database being accessible by the second vehicle.\n | 16. The system of claim 12, wherein the marker is positioned on a fuel door of the first vehicle, and wherein the data generated by the data generation module includes an energy level for the first vehicle.\n | 17. The system of claim 12, wherein the marker is positioned proximate to a cargo area of the first vehicle, and wherein the data generated by the data generation module includes a cargo risk assessment associated with the cargo area.\n | 18. The system of claim 12, wherein the marker is positioned on an occupant door for the first vehicle, and wherein the data generated by the data generation module indicates presence of an occupant in the area of the first vehicle proximate the location of the marker.\n | 19. The system of claim 18, wherein the data generated by the data generation module further includes a vulnerability of any occupants present in the area of the first vehicle proximate the location of the marker on the first vehicle.", "output": "The system has a camera that is positioned to capture an image of an external environment of a host vehicle (200). The processors are communicably coupled to the camera. A marker location determination module detects the first and second markers (110) positioned on surfaces of a vehicle surrounding the host vehicle, and determines a location of the first and second markers on a surrounding vehicle (100a-100c) based on data encoded in each of the markers using the image captured by the camera.
An orientation determination module determines an orientation of the surrounding vehicle based on the location of the first and second markers. An autonomous driving module determines a path for the host vehicle based on the orientation of the surrounding vehicle and controls the host vehicle to follow the determined path by sending control signals to vehicle components to cause the host vehicle to travel along the determined path. INDEPENDENT CLAIMS are included for the following: a method of operating a host vehicle according to an orientation of a surrounding vehicle; and a system for providing information from a first vehicle to at least a second vehicle related to one or more features of the first vehicle. System for operating host vehicle such as conventional vehicle and autonomous vehicle on roadway. The redundancies can permit data to be extrapolated from the steganographic pattern, even if the steganographic pattern is damaged. The determined path can be a path that avoids a collision with the surrounding vehicle. The autonomous driving modules can determine paths for the host vehicle so as to avoid particular areas of the surrounding vehicle in the event of a likely collision with the surrounding vehicle. The drawing shows a perspective view of host vehicle on a roadway including multiple surrounding vehicles. 100a-100c Surrounding vehicles; 110 Markers; 200 Host vehicle; 215 Sensor system", "instruction": "Please summarize the input" }, { "input": "Dynamic speed limit for vehicles and autonomous vehiclesSystems and methods are provided for operating a vehicle using dynamic speed limits. A vehicle can monitor current road conditions by capturing real-time data that is indicative of a driving environment associated with a roadway being traversed by the vehicle.
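The orientation step in the marker record above reduces to simple geometry once the on-vehicle locations of two markers are decoded. A minimal sketch, assuming one front and one rear marker observed in the host frame (marker roles and frame conventions are illustrative assumptions, not from the record):

```python
import math

def vehicle_heading(front_marker_xy, rear_marker_xy):
    """Estimate the surrounding vehicle's heading, in radians
    counterclockwise from the +x axis of the host frame, from the
    observed positions of a front marker and a rear marker."""
    dx = front_marker_xy[0] - rear_marker_xy[0]
    dy = front_marker_xy[1] - rear_marker_xy[1]
    return math.atan2(dy, dx)
```

The claimed system would feed this heading, together with the surrounding vehicle's relative location, into path planning for the host vehicle.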
Using the captured real-time data, the vehicle can predict a dynamic speed limit, wherein the dynamic speed limit is a driving speed for the vehicle that is adapted for the monitored current road conditions. Additionally, the vehicle can automatically perform a driving operation for the vehicle in accordance with the dynamic speed limit, wherein the driving operation causes the vehicle to move at a driving speed that is approximately equal to the predicted dynamic speed limit.What is claimed is: \n | 1. A method comprising: \nmonitoring, by a vehicle, current road conditions using captured real-time data indicative of a driving environment associated with a roadway being traversed by the vehicle, wherein monitoring the current road conditions comprises: \ncapturing the real-time data from one or more vehicle sensors of the vehicle, wherein the captured real-time data is from one or more vehicle sensors sensing the current road conditions associated with the roadway being traversed by the vehicle; and \ncollecting federated real-time data from multiple communication points communicatively connected to the vehicle and a plurality of dynamic speed limits predicted by other vehicles, the federated real-time data indicating the driving environment associated with the roadway being traversed by the vehicle, and wherein collecting the federated real-time data and the plurality of dynamic speed limits predicted by other vehicles comprises receiving data via vehicle-to-vehicle (V2V) communication between the vehicle and one or more other vehicles on the roadway; \npredicting, by the vehicle, a dynamic speed limit based on the captured real-time data from the one or more vehicle sensors of the vehicle, wherein the predicted dynamic speed limit comprises a driving speed for the vehicle that is adapted for the monitored current road conditions; \noptimizing the predicted dynamic speed limit by applying the federated real-time data and the plurality of dynamic speed limits 
predicted by other vehicles to analysis of the dynamic speed limit to generate an optimized dynamic speed limit when the predicted dynamic speed limit is not verified; and \nautomatically performing a driving operation for the vehicle in accordance with the optimized dynamic speed limit, wherein the driving operation causes the vehicle to move at the driving speed that is approximately equal to the optimized dynamic speed limit when the predicted dynamic speed limit is not verified. \n | 2. The method of claim 1, wherein predicting the dynamic speed limit comprises applying the captured real-time data to one or more machine learning models.\n | 3. The method of claim 2, further comprising: \ndetermining whether the predicted dynamic speed limit is verified based on the federated real-time data, wherein determining that the predicted dynamic speed limit is verified comprises identifying a convergence between the federated real-time data and the captured real-time data and determining that the predicted dynamic speed limit is not verified comprises identifying a divergence between the federated real-time data and the captured real-time data; \nupon determining that the predicted dynamic speed limit is not verified, generating the optimized dynamic speed limit, wherein the optimized dynamic speed limit is based on the predicted dynamic speed limit and optimally adapted to the monitored current road conditions in accordance with federated learning techniques applied to the one or more machine learning models; and \nupon determining that the predicted dynamic speed limit is not verified, performing the driving operation for the vehicle in accordance with the optimized dynamic speed limit, wherein the driving operation causes the vehicle to move at the driving speed that is approximately equal to the optimized dynamic speed limit. \n | 4. 
The method of claim 1, wherein performing the driving operation for the vehicle in accordance with the predicted dynamic speed limit is fully autonomous.\n | 5. The method of claim 1, wherein performing the driving operation for the vehicle in accordance with the predicted dynamic speed limit is semi-autonomous.\n | 6. The method of claim 5, wherein performing the driving operation for the vehicle in accordance with the predicted dynamic speed limit comprises automatically displaying the predicted dynamic speed limit in a dashboard display of the vehicle.\n | 7. The method of claim 1, wherein the collecting federated real-time data from multiple communication points further comprises vehicle-to-infrastructure (V2I) communication between the vehicle and an infrastructure device.\n | 8. The method of claim 3, further comprising: \nupon determining that the predicted dynamic speed limit is verified, performing the driving operation for the vehicle in accordance with the predicted dynamic speed limit, wherein the driving operation causes the vehicle to move at the driving speed that is approximately equal to the predicted dynamic speed limit.", "output": "The method involves monitoring current road conditions by capturing real-time data indicative of a driving environment associated with a roadway traversed by the vehicle (120). A dynamic speed limit is predicted based on the captured real-time data by the vehicle. The dynamic speed limit includes a driving speed for the vehicle, that is adapted for the monitored current road conditions. A driving operation is automatically performed for the vehicle in accordance with the predicted dynamic speed limit. The driving operation causes the vehicle to move at a driving speed that is approximately equal to the predicted dynamic speed limit. The captured real-time data is applied to multiple machine learning models. An INDEPENDENT CLAIM is included for a system for controlling dynamic speed limits of vehicle. 
Method for controlling dynamic speed limits of vehicles, such as trucks, motorcycles, and boats. The vehicle can automatically generate an alert notifying the driver of the dynamic speed limit. Machine learning aspects of the dynamic speed limit module are enhanced. Reliability of the predicted dynamic speed limit is improved. The drawing shows a block diagram of the vehicle-infrastructure system. 100 Vehicle-infrastructure system; 102 Road condition service; 105 Static speed limit sign; 110 Communication network; 120 Vehicle; 121 Camera; 125 Dynamic speed limit module", "instruction": "Please summarize the input" }, { "input": "DYNAMIC SPEED LIMIT FOR VEHICLES AND AUTONOMOUS VEHICLESSystems and methods are provided for operating a vehicle using dynamic speed limits. A vehicle can monitor current road conditions by capturing real-time data that is indicative of a driving environment associated with a roadway being traversed by the vehicle. Using the captured real-time data, the vehicle can predict a dynamic speed limit, wherein the dynamic speed limit is a driving speed for the vehicle that is adapted for the monitored current road conditions. Additionally, the vehicle can automatically perform a driving operation for the vehicle in accordance with the dynamic speed limit, wherein the driving operation causes the vehicle to move at a driving speed that is approximately equal to the predicted dynamic speed limit.What is claimed is: \n | 1.
A system comprising: \na plurality of vehicles communicatively coupled to transmit real-time data that is captured by each of the plurality of vehicles, wherein the captured real-time data is indicative of a driving environment associated with a roadway being traversed by the plurality of vehicles; \nan infrastructure device communicatively coupled to the plurality of vehicles for transmitting initial speed limit data; and \nat least one of the plurality of vehicles comprising a control unit configured to: \nreceive the real-time data from the plurality of vehicles; \nreceive the initial speed limit data from the infrastructure device; \ndynamically adjust the initial speed limit data based on a consensus of the real-time data received from the plurality of vehicles; and \nautomatically perform a driving operation for the at least one vehicle in accordance with the dynamically adjusted speed limit, wherein the driving operation causes the at least one vehicle to move at a driving speed that is approximately equal to the dynamically adjusted speed limit. \n | 2. The system of claim 1, wherein the communicative coupling between the plurality of vehicles comprises vehicle-to-vehicle (V2V) communication.\n | 3. The system of claim 2, wherein the plurality of vehicles transmits additional dynamically adjusted speed limits using the V2V communication, and the additional dynamically adjusted speed limits are based on the real-time data captured by the respective one or more of the plurality of vehicles.\n | 4. The system of claim 3, wherein the at least one vehicle receives the additional dynamically adjusted speed limits transmitted by the plurality of vehicles, and dynamically adjusting the initial speed limit is further based on a consensus of the additional dynamically adjusted speed limits.\n | 5.
The system of claim 1, wherein the at least one vehicle receives real-time data relating to the driving environment associated with the roadway comprising at least one of: traffic data, weather data, hazard data, and location data.\n | 6. The system of claim 1, wherein the communicative coupling between the plurality of vehicles and the infrastructure device comprises vehicle-to-infrastructure (V2I) communication.\n | 7. A vehicle comprising: \na processor device, wherein the processor device: \ncaptures real-time data from one or more vehicle sensors of the vehicle, wherein the captured real-time data is from one or more vehicle sensors sensing current road conditions associated with a roadway being traversed by the vehicle; \nreceives federated real-time data from multiple communication points communicatively connected to the vehicle and a plurality of predicted dynamic speed limits generated by a plurality of connected vehicles; \nverifies a predicted dynamic speed limit by applying the federated real-time data and the plurality of predicted dynamic speed limits to the predicted dynamic speed limit; and \nin response to verifying the predicted dynamic speed limit, automatically performs a driving operation for the vehicle in accordance with the verified predicted dynamic speed limit, wherein the driving operation causes the vehicle to move at a driving speed that is approximately equal to the verified predicted dynamic speed limit. \n | 8. The vehicle of claim 7, wherein the processor device further: \ngenerates the predicted dynamic speed limit based on the real-time data indicative of the current road conditions. \n | 9. The vehicle of claim 7, wherein verifying the predicted dynamic speed limit comprises determining whether there is consensus between the real-time data captured from the vehicle sensors and the federated real-time data collected from the multiple communication points.\n | 10. 
The vehicle of claim 9, wherein verifying the predicted dynamic speed limit comprises determining a convergence between the real-time data captured from the vehicle sensors and the federated real-time data collected from the multiple communication points.\n | 11. The vehicle of claim 7, wherein verifying the predicted dynamic speed limit comprises determining whether there is consensus between the predicted dynamic speed limit and the plurality of predicted dynamic speed limits generated by the plurality of connected vehicles.\n | 12. The vehicle of claim 7, wherein the processor device further: \nin response to failing to verify the predicted dynamic speed limit, adjusts the predicted dynamic speed limit based on the plurality of predicted dynamic speed limits generated by the plurality of connected vehicles. \n | 13. The vehicle of claim 12, wherein adjusting the predicted dynamic speed limit comprises applying the plurality of predicted dynamic speed limits generated by the plurality of connected vehicles to a machine learning model to optimize the predicted dynamic speed limit.\n | 14. The vehicle of claim 7, further comprising a communication system communicatively connected to the multiple communication points and the plurality of connected vehicles.\n | 15. The vehicle of claim 7, wherein the communicative connection comprises vehicle-to-infrastructure (V2I) communication or vehicle-to-vehicle (V2V) communication.\n | 16. The vehicle of claim 7, wherein the multiple communication points comprise one or more of: vehicles, infrastructure devices, and road condition services.", "output": "The system (100) has a set of vehicles that are communicatively coupled to transmit real-time data that is captured by each of the vehicles, where the captured data is indicative of a driving environment associated with a roadway being traversed by the vehicles. An infrastructure device is coupled to the vehicles for transmitting initial speed limit data. 
At least one of the vehicles is provided with a control unit to receive the real-time data and the initial speed limit data from the infrastructure device. The control unit automatically performs a driving operation for the vehicle in accordance with a dynamically adjusted speed limit, where the driving operation causes the vehicle to move at a driving speed that is approximately equal to the dynamically adjusted speed limit. An INDEPENDENT CLAIM is included for a vehicle comprising a processor device. System for dynamic cruise control of vehicles such as automobiles, trucks, motorcycles, boats, and recreational vehicles. The system utilizes a vehicle computing system to employ dynamic speed limits using real-time traffic monitoring and federated learning from multiple communication points, allowing the vehicles to use a dynamic speed limit that is updated in tandem with changing road conditions to enhance navigation and/or maneuvering capabilities. The drawing shows an example of a vehicle-infrastructure system with which the dynamic speed limit systems can be used.100Vehicle-infrastructure system 102Road conditions services 103Infrastructure device 105Static speed limit sign 110Communications network 120Autonomous vehicle 121Camera 125Dynamic speed limits module ", "instruction": "Please summarize the input" }, { "input": "Vehicular headlight warning systemA method is provided for alerting a driver of a vehicle of an unsafe exterior lighting status. The method includes receiving, at a first vehicle, information sufficient to detect a presence of a second vehicle. Once a vehicle is detected, the method continues to identify a direction of travel of the second vehicle with respect to the first vehicle. The method includes measuring a level of ambient light, and determining the level of ambient light is below a threshold level. The method further identifies any presence of functioning exterior lighting of the second vehicle by detecting a light color and light intensity. 
When an unsafe exterior lighting status is determined, the method includes alerting a driver of the second vehicle using a vehicle-to-vehicle communication signal. The signal includes a notification suggesting usage of at least one of headlights, parking lights, and hazard lights.What is claimed is: \n | 1. A method for alerting a vehicle driver of an exterior lighting operational status, the method comprising: \ndetecting, at a first vehicle, a presence of a second vehicle; \nmeasuring a level of ambient light, and determining the level of ambient light is below a threshold level; \ndetermining an operational status of a specific type of functioning exterior lighting of the second vehicle including detecting a light color, angle, and light intensity to identify a type of the exterior lighting; and \nalerting a driver of the second vehicle using a vehicle-to-vehicle communication signal, the signal including a notification related to the operational status of the exterior lighting of the second vehicle and suggesting usage of exterior lighting different from the specific type identified. \n | 2. The method according to claim 1, further comprising identifying a direction of travel of the second vehicle with respect to the first vehicle.\n | 3. The method according to claim 1, further comprising identifying the second vehicle as an oncoming vehicle moving in a direction towards the first vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that headlights are not operating, and the signal includes a notification suggesting usage of headlights operated in a low beam mode. \n | 4. 
The method according to claim 1, further comprising identifying the second vehicle as an oncoming vehicle moving in a direction towards the first vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that headlights are operating in a high beam mode, and the signal includes a notification suggesting usage of the headlights in a low beam mode. \n | 5. The method according to claim 4, wherein, upon the first vehicle passing the second vehicle, the method further comprises sending a follow-up vehicle-to-vehicle communication signal to the second vehicle, the follow-up signal including a notification suggesting that usage of the headlights can safely revert back to the high beam mode.\n | 6. The method according to claim 1, further comprising identifying the second vehicle as an oncoming vehicle moving in a direction towards the first vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that only parking lights are operating, and the signal includes a notification suggesting usage of headlights operated in a low beam mode. \n | 7. The method according to claim 1, further comprising identifying the second vehicle as a stationary vehicle; and \ndetermining the second vehicle is parked or temporarily standing adjacent an intended travel path of the first vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that the exterior lighting is off, and the signal includes a notification suggesting usage of parking lights or hazard lights. \n | 8. The method according to claim 7, further comprising alerting a driver of the first vehicle to the presence and location of the second vehicle.\n | 9. The method according to claim 1, further comprising monitoring for a presence of precipitation.\n | 10. 
The method according to claim 9, further comprising detecting a presence of precipitation, \nwherein determining the operational status of the exterior lighting results in a determination that headlights are not operating, and the signal includes a notification suggesting usage of headlights operated in a low beam mode. \n | 11. The method according to claim 1, wherein at least one of the first vehicle and the second vehicle is an autonomous vehicle.\n | 12. A method for alerting a vehicle driver of an unsafe exterior lighting operational status, the method comprising: \nreceiving and processing, at a first vehicle, information sufficient to detect a presence of a second vehicle; \nidentifying a direction of travel of the second vehicle with respect to the first vehicle; \nmeasuring a level of ambient light, and determining the level of ambient light is below a threshold level; \nidentifying a specific type of functioning exterior lighting of the second vehicle by detecting a light color, light angle, and light intensity; and \nalerting a driver of the second vehicle using a vehicle-to-vehicle communication signal, the signal including a notification suggesting usage of exterior lighting different from the specific type identified. \n | 13. 
A non-transitory computer-readable medium having instructions embodied thereon that, when executed by a processor, perform operations in a first vehicle, the operations comprising: \ndetecting a presence of a second vehicle; \nmeasuring a level of ambient light, and determining the level of ambient light is below a threshold level; \ndetermining an operational status of a specific type of functioning exterior lighting of the second vehicle, including detecting a light color, light angle, and light intensity to identify a type of the exterior lighting; and \nalerting a driver of the second vehicle using a vehicle-to-vehicle communication signal, the signal including a notification related to the operational status of the exterior lighting of the second vehicle, and suggesting usage of exterior lighting different from the specific type identified. \n | 14. The non-transitory computer-readable medium as recited in claim 13, wherein the operations further comprise: \nidentifying a direction of travel of the second vehicle with respect to the first vehicle. \n | 15. The non-transitory computer-readable medium as recited in claim 14, wherein the operations further comprise: \nidentifying the second vehicle as a stationary vehicle and determining the second vehicle is parked or temporarily standing adjacent an intended travel path of the first vehicle; and \nalerting a driver of the first vehicle of the presence of the second vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that the exterior lighting is off, and the signal includes a notification suggesting usage of parking lights or hazard lights. \n | 16. 
The non-transitory computer-readable medium as recited in claim 14, wherein the operations further comprise: \nidentifying the second vehicle as an oncoming vehicle moving in a direction towards the first vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that headlights are operating in a high beam mode, and the signal includes a notification suggesting the headlights be operated in a low beam mode. \n | 17. The non-transitory computer-readable medium as recited in claim 14, wherein the operations further comprise: \nidentifying the second vehicle as an oncoming vehicle moving in a direction towards the first vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that only parking lights are operating, and the signal includes a notification suggesting usage of headlights operated in a low beam mode. \n | 18. The non-transitory computer-readable medium as recited in claim 14, wherein the operations further comprise: \nidentifying the second vehicle as an oncoming vehicle moving in a direction towards the first vehicle, \nwherein determining the operational status of the exterior lighting results in a determination that headlights are not operating, and the signal includes a notification suggesting usage of headlights operated in a low beam mode.", "output": "The method (48) involves detecting (60) presence of a second vehicle at a first vehicle, measuring (64) level of ambient light and determining if the level of ambient light is below a threshold level. An operational status of exterior lighting of the second vehicle is determined (66). The light color, angle and light intensity to identify type of the exterior lighting are determined (68). The driver of the second vehicle is alerted (70) using a vehicle-to-vehicle communication signal that includes a notification related to the operational status of the exterior lighting of the second vehicle. 
An INDEPENDENT CLAIM is included for a non-transitory computer-readable medium storing instructions to perform operations in a first vehicle. Method for alerting a vehicle driver of exterior lighting operational status while driving vehicles such as a passenger or commercial automobile, car, truck, motorcycle, off-road vehicle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, farm vehicle, construction vehicle, tram, golf cart, train, or trolley, etc. Once an unsafe condition is determined, the drivers are alerted either visually or electronically so they can consider turning on appropriate exterior lighting for the driving conditions. The environmental conditions are monitored for the presence of precipitation such as rain, snow, ice, slush, or fog that impairs visibility such that certain exterior lighting should be in an operational mode. The drawing shows a flow diagram illustrating a control process algorithm for alerting a vehicle driver of exterior lighting operational status. 48Method for alerting vehicle driver of exterior lighting operational status60Step for detecting presence of second vehicle at first vehicle64Step for measuring level of ambient light66Step for determining operational status of exterior lighting of second vehicle68Step for determining light color and light intensity to identify type of exterior lighting70Step for alerting the driver of second vehicle using vehicle-to-vehicle communication signal", "instruction": "Please summarize the input" }, { "input": "A context system that improves a V2X receiver's understanding of V2X communication|1. A method for modifying an operation of a vehicle component based on V2X (Vehicle-to-Everything) communication. 
The method includes: receiving, by a V2X receiver, a V2X message including context data describing a context of a V2X transmitter in an environment, where the V2X receiver is unable to sufficiently identify the V2X transmitter as an originator of the V2X message in the environment; determining, by the V2X receiver based on the context data, digital data describing an identity of the V2X transmitter as the originator of the V2X message in the environment; and modifying an operation of a vehicle component of the V2X receiver based on the digital data describing the identity of the V2X transmitter.\n | 2. The method of claim 1, wherein the environment is a road environment including an ego vehicle and a remote vehicle that receives the V2X message, and the context data further describes a context of the ego vehicle in the road environment.\n | 3. The method of claim 1, wherein the V2X message is not one of: a Wi-Fi (registered trademark) message; a 3G message; a 4G message; a 5G message; a Long Term Evolution (LTE (registered trademark)) message; a millimeter-wave communication message; a Bluetooth (registered trademark) message; or a satellite communication.\n | 4. The method of claim 1, wherein the V2X message is a Basic Safety Message.\n | 5. The method of claim 1, wherein the context data describes a position of the V2X transmitter with an accuracy of substantially half a width of the road on which the V2X transmitter is traveling.\n | 6. The method of claim 1, wherein the V2X transmitter is an ego vehicle.\n | 7. The method of claim 1, wherein the V2X receiver is an autonomous vehicle.\n | 8. 
A system, included in a V2X receiver, that modifies an operation of a vehicle component based on V2X communication, the system comprising: a processor; and a non-transitory memory communicatively coupled to the processor, the non-transitory memory storing computer code that, when executed by the processor, causes the processor to: receive a V2X message including context data describing a context of a V2X transmitter in an environment, where the V2X receiver is unable to sufficiently identify the V2X transmitter as an originator of the V2X message in the environment; determine, based on the context data, digital data describing an identity of the V2X transmitter as the originator of the V2X message in the environment; and modify an operation of the vehicle component of the V2X receiver based on the digital data describing the identity of the V2X transmitter.\n | 9. The system of claim 8, wherein the V2X message is not one of: a Wi-Fi (registered trademark) message; a 3G message; a 4G message; a 5G message; a Long Term Evolution (LTE (registered trademark)) message; a millimeter-wave communication message; a Bluetooth (registered trademark) message; or a satellite communication.\n | 10. The system of claim 8, wherein the V2X message is a Basic Safety Message.\n | 11. The system of claim 8, wherein the context data describes a position of the V2X transmitter with an accuracy of substantially half a width of the road on which the V2X transmitter is traveling.\n | 12. The system of claim 8, wherein the V2X transmitter is an ego vehicle.\n | 13. The system of claim 8, wherein the V2X receiver is an autonomous vehicle.\n | 14. 
An executable program for modifying an operation of a vehicle component based on V2X communication, the program, when executed by a processor of a V2X receiver, causing the processor to execute steps including: receiving a V2X message including context data describing a context of a V2X transmitter in an environment, where the V2X receiver is unable to sufficiently identify the V2X transmitter as an originator of the V2X message in the environment; determining, based on the context data, digital data describing an identity of the V2X transmitter as the originator of the V2X message in the environment; and modifying an operation of the vehicle component of the V2X receiver based on the digital data describing the identity of the V2X transmitter.\n | 15. The program of claim 14, wherein the context data describes a position of the V2X transmitter with an accuracy of substantially half a width of the road on which the V2X transmitter is traveling.", "output": "The method involves receiving a V2X message including context data (191) that describes a context of a V2X transmitter in an environment. The V2X receiver is not sufficiently able to identify the V2X transmitter as an originator of the V2X message within the environment. Digital data describing an identity of the V2X transmitter as the originator of the V2X message within the environment is determined based on the context data. The operation of the vehicle component of the V2X receiver is modified based on the digital data describing the identity of the V2X transmitter. The V2X message is a Dedicated Short-Range Communication (DSRC) message. 
INDEPENDENT CLAIMS are included for the following: a system for modifying an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication; and a computer program product operable to modify an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication. Method for modifying an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication. Improves the ability to identify the transmitter of different DSRC messages, which significantly improves the performance of connected systems. The drawing shows a schematic block representation of an operating environment for a context system. 105Network125Processor146Communication unit150Sensor Set191Context data", "instruction": "Please summarize the input" }, { "input": "Vehicle-to-Everything data transmission for an automated vehicle|1. A method including: providing, or attempting to provide, by a communication unit of an ego vehicle, digital data to a Wi-Fi (R) router based on a mode of the communication unit; monitoring, by a machine learning algorithm of the ego vehicle, a network pattern describing connectivity access of the communication unit; determining whether the connectivity access satisfies a connectivity threshold; and, responsive to the connectivity access not satisfying the connectivity threshold, selecting an operation in a transmission mode that includes providing, or attempting to provide, the digital data to a remote vehicle that has no bandwidth constraint preventing the digital data from being relayed to a server. In the operation in the transmission mode, connectivity data of one or more other endpoints including the remote vehicle are received, and an access ranking of the connectivity of the ego vehicle is generated based on the connectivity data of the one or more other endpoints. 
A local dynamic map is generated that includes the access ranking and describes a first geographic position of the ego vehicle and one or more second geographic positions of the one or more other endpoints including the remote vehicle. The method includes relaying the digital data to the remote vehicle for transmission to the server based on the local dynamic map including the access ranking.\n | 2. The method of claim 1, wherein relaying the digital data to the remote vehicle is further based on an existence of a line of sight between the ego vehicle and the remote vehicle.\n | 3. The method of claim 1 or 2, further including receiving modification data operable to modify a vehicle component of the ego vehicle, wherein the modification data is determined based on the digital data successfully received by the server, such that providing the digital data to the server causes the modification of the vehicle component.\n | 4. The method of claim 3, wherein the ego vehicle is an autonomous vehicle and the vehicle component is an advanced driver assistance system (ADAS system) that is modified based on the modification data.\n | 5. The method of claim 4, wherein the modification data modifies a real-time safety process of the autonomous vehicle provided by the ADAS system.\n | 6. The method of claim 4, wherein the modification data modifies an operation of a brake system of the ego vehicle, the operation of the brake system being controlled by the ADAS system.\n | 7. The method of claim 1, wherein the Wi-Fi router is for a home wireless network, and the connectivity access is based on a determination selected from a group including: (1) the Wi-Fi router is not present; (2) the communication unit cannot wirelessly communicate with the Wi-Fi router; 
(3) the Wi-Fi router is outside a transmission range of the communication unit; (4) access to the wireless network of the Wi-Fi router is insufficient; (5) the wireless network of the Wi-Fi router cannot be accessed; or (6) the communication unit does not store a password of the Wi-Fi router and is not authenticated to wirelessly communicate with the Wi-Fi router.\n | 8. The method of claim 1, further including: acquiring ranking data describing one or more access rankings of one or more endpoints; identifying a partner endpoint based on the ranking data; and transmitting sensor data to the partner endpoint.\n | 9. The method of claim 8, wherein the digital data includes the sensor data recorded by a set including the ego vehicle and the remote vehicle.\n | 10. The method of claim 1, wherein the ego vehicle is in a reception mode before switching to the transmission mode.\n | 11. A system including an onboard computer system of an ego vehicle, the onboard computer system including a communication unit, a processor, and a non-transitory memory storing computer code that, when executed by the processor, causes the processor to: provide, or attempt to provide, via the communication unit of the ego vehicle, digital data to a Wi-Fi (R) router based on a mode of the communication unit; monitor, by a machine learning algorithm of the ego vehicle, a network pattern describing connectivity access of the communication unit; and determine whether the connectivity access satisfies a connectivity threshold. 
Responsive to the connectivity access not satisfying the connectivity threshold, the processor selects an operation in a transmission mode that includes providing, or attempting to provide, the digital data to a remote vehicle that has no bandwidth constraint preventing the digital data from being relayed to a server. In the operation in the transmission mode, connectivity data of one or more other endpoints including the remote vehicle are received; an access ranking of the connectivity of the ego vehicle is generated based on the connectivity data of the one or more other endpoints; and a local dynamic map is generated that includes the access ranking and describes a first geographic position of the ego vehicle and one or more second geographic positions of the one or more other endpoints including the remote vehicle. Based on the local dynamic map including the access ranking, the system causes the processor to relay the digital data to the remote vehicle for transmission to the server.\n | 12. The system of claim 11, wherein the computer code, when executed by the processor, further causes the processor to receive modification data operable to modify a vehicle component of the ego vehicle, the modification data being determined based on the digital data successfully received by the server so as to modify the vehicle component in response to the digital data being provided to the server.\n | 13. The system of claim 12, wherein the ego vehicle is an autonomous vehicle and the vehicle component is an advanced driver assistance system (ADAS system) that is modified based on the modification data.\n | 14. The system of claim 13, wherein the ego vehicle is in a reception mode before switching to the transmission mode.\n | 15. 
A program that, when executed by a processor, causes the processor to: provide, or attempt to provide, via a communication unit of an ego vehicle, digital data to a Wi-Fi (R) router based on a mode of the communication unit; monitor, by a machine learning algorithm of the ego vehicle, a network pattern describing connectivity access of the communication unit; determine whether the connectivity access satisfies a connectivity threshold; and, responsive to the connectivity access not satisfying the connectivity threshold, select an operation in a transmission mode that includes providing, or attempting to provide, the digital data to a remote vehicle that has no bandwidth constraint preventing the digital data from being relayed to a server. In the operation in the transmission mode, connectivity data of one or more other endpoints including the remote vehicle are received; an access ranking of the connectivity of the ego vehicle is generated based on the connectivity data of the one or more other endpoints; and a local dynamic map is generated that includes the access ranking and describes a first geographic position of the ego vehicle and one or more second geographic positions of the one or more other endpoints including the remote vehicle. 
The program includes an instruction to cause the processor to relay the digital data to the remote vehicle for transmission to the server, based on the local dynamic map including the access ranking.", "output": "The method involves providing or attempting to provide, by a communication unit of an ego vehicle (123), digital data to a communication device based on a mode of the communication unit, in which the digital data is relayed by the communication device to be received by a server, then determining, by a processor of the ego vehicle, feedback that describes a bandwidth constraint of the communication unit, and modifying, by the processor, the mode based on the feedback so that the mode is consistent with the bandwidth constraint and the digital data is successfully received by the server. INDEPENDENT CLAIMS are also included for the following: a computer program product having instructions for vehicle-to-everything data transfer for automated vehicles; and a system for vehicle-to-everything data transfer for automated vehicles. Method for vehicle-to-everything data transfer for automated vehicles. The analysis module analyzes sensor data that is generated by a particular automated vehicle and generates modification data for the particular automated vehicle based on the sensor data. The modification data may cause a modification of a vehicle component when received by the automated vehicle, and the analysis module can generate modification data that helps the advanced driver assistance system to improve safety and efficiency of the ego vehicle. The drawing shows the block diagram illustrating an operating environment for a feedback system. 123Ego vehicle150Connectivity152Connectivity data set160Digital data199aFeedback system", "instruction": "Please summarize the input" }, { "input": "Vehicle component modification based on vehicle accident reproduction dataThe invention relates to vehicle component modification based on vehicle accident reproduction data. 
The invention describes embodiments for modifying a vehicle control system of a connected vehicle based on a vehicular accident reconstruction (VAR) analysis. In some embodiments, a method comprises: detecting the occurrence of a collision associated with a remote vehicle; generating report data associated with the collision of the remote vehicle, wherein the report data comprises one or more of V2X data describing vehicle-to-everything (V2X) wireless messages received before the occurrence of the collision and event data describing one or more driving events observed by the ego vehicle; transmitting the report data to a server; receiving modification data from the server, the modification data describing a modification of the vehicle control system of the ego vehicle, wherein the modification data is generated by the VAR analysis based on the report data; and modifying an operation of the vehicle control system based on the modification data so as to improve the safety of the ego vehicle.|1. A method for an ego vehicle, the method comprising: generating report data associated with a collision of a remote vehicle, wherein the report data includes V2X data describing a first set of V2X wireless messages received from the remote vehicle before the occurrence of the collision and event data describing one or more driving events, associated with the collision of the remote vehicle, observed by one or more sensors of an ego vehicle proximate to the collision; receiving modification data, the modification data describing a modification of a vehicle control system of the ego vehicle, wherein the modification data is generated by a vehicular accident reconstruction analysis based on the report data; and modifying an operation of the vehicle control system based on the modification data.\n | 2. The method according to claim 1, wherein the vehicle control system includes one of an advanced driver assistance system (ADAS system) and an autonomous driving system.\n | 3. 
The method according to claim 2, wherein the ego vehicle is an autonomous vehicle, and modifying the operation of the vehicle control system based on the modification data comprises: modifying, based on the modification data, a safety process of the autonomous vehicle provided by the autonomous driving system so as to improve the safety of the autonomous vehicle.\n | 4. The method according to claim 2, wherein modifying the operation of the vehicle control system based on the modification data comprises: modifying, based on the modification data, one or more operations of one or more of a braking system, a steering system, and an acceleration system of the ego vehicle, wherein the one or more operations are controlled by the ADAS system.\n | 5. The method according to claim 1, further comprising: identifying, based on a first V2X wireless message received from the remote vehicle, the presence of a pre-collision event associated with the remote vehicle; and, in response to identifying the presence of the pre-collision event: recording sensor data from one or more sensors of the ego vehicle; generating, based on the sensor data, event data describing one or more driving events; and monitoring the remote vehicle, before detecting the occurrence of the collision, to continue receiving one or more second V2X wireless messages from the remote vehicle, wherein the first set of V2X wireless messages contained in the report data comprises the first V2X wireless message and the one or more second V2X wireless messages; and detecting the occurrence of the collision associated with the remote vehicle.\n | 6. 
The method according to claim 5, wherein identifying the presence of the pre-collision event associated with the remote vehicle comprises: receiving a set of initial V2X wireless messages from a set of remote vehicles through a network, wherein the set of initial V2X wireless messages comprises the first V2X wireless message from the remote vehicle; and analyzing the set of initial V2X wireless messages to identify that the first V2X wireless message comprises data indicating the pre-collision event associated with the remote vehicle.\n | 7. The method according to claim 5, wherein detecting the occurrence of the collision associated with the remote vehicle comprises: detecting the collision of the remote vehicle based on one or more of the event data and the one or more second V2X wireless messages.\n | 8. The method according to claim 5, wherein the event data comprises remote vehicle behavior data describing one or more actions of the remote vehicle observed by the ego vehicle, and generating the event data describing the one or more driving events based on the sensor data further comprises: generating, based on the sensor data, the remote vehicle behavior data describing the one or more actions of the remote vehicle.\n | 9. The method according to claim 8, wherein the one or more actions of the remote vehicle comprise one or more pre-collision behaviors of the remote vehicle observed by the ego vehicle.\n | 10. 
The method according to claim 5, further comprising, before detecting the occurrence of the collision: analyzing the sensor data to determine that the pre-collision event involves the remote vehicle and an endpoint as the other party involved in the collision; and monitoring the endpoint to receive one or more third V2X wireless messages from the endpoint; wherein generating the event data describing the one or more driving events based on the sensor data further comprises: generating, based on the sensor data, endpoint behavior data describing one or more actions of the endpoint observed by the ego vehicle.\n | 11. The method according to claim 10, further comprising: determining, based on one or more of the one or more third V2X wireless messages and the endpoint behavior data, that the endpoint is involved in the collision with the remote vehicle, wherein the first set of V2X wireless messages contained in the report data further comprises the one or more third V2X wireless messages from the endpoint, and wherein the event data contained in the report data further comprises the endpoint behavior data.\n | 12. The method according to claim 10, wherein the one or more actions of the endpoint include one or more pre-collision behaviors of the endpoint observed by the ego vehicle.\n | 13. The method according to claim 1, wherein the event data contained in the report data further comprises one or more pre-collision conditions observed by the ego vehicle.\n | 14. The method according to claim 1, wherein each of the V2X wireless messages in the first set of V2X wireless messages is selected from the group consisting of: a dedicated short range communication message; a basic safety message; a Long-Term Evolution (LTE) message; an LTE-V2X message; a 5G-V2X message; and a millimeter wave message.\n | 15. 
A system for an ego vehicle, comprising: an onboard computer system of the ego vehicle including a communication unit, a processor, and a non-transitory memory storing computer code that, when executed by the processor, causes the processor to: generate report data associated with a collision of a remote vehicle, wherein the report data comprises V2X data describing a first set of V2X wireless messages received from the remote vehicle before detecting the occurrence of the collision and event data describing one or more driving events, associated with the collision of the remote vehicle, observed by one or more sensors of an ego vehicle proximate to the collision; receive modification data, the modification data describing a modification of a vehicle control system of the ego vehicle, wherein the modification data is generated by a vehicular accident reconstruction analysis based on the report data; and modify an operation of the vehicle control system based on the modification data.\n | 16. The system according to claim 15, wherein the vehicle control system includes one of an advanced driver assistance system (ADAS system) and an autonomous driving system.\n | 17. The system according to claim 16, wherein the ego vehicle is an autonomous vehicle, and the computer code, when executed by the processor, causes the processor to modify the operation of the vehicle control system based on the modification data at least by: modifying, based on the modification data, a safety process of the autonomous vehicle provided by the autonomous driving system so as to improve the safety of the autonomous vehicle.\n | 18. 
The system according to claim 16, wherein the computer code, when executed by the processor, causes the processor to modify the operation of the vehicle control system at least by: modifying, based on the modification data, one or more operations of one or more of a braking system, a steering system, and an acceleration system of the ego vehicle, wherein the one or more operations are controlled by the ADAS system.\n | 19. A computer program product comprising instructions that, when executed by a processor, cause the processor to perform operations comprising: generating report data associated with a collision of a remote vehicle, wherein the report data comprises V2X data describing a first set of V2X wireless messages received from the remote vehicle before detecting the occurrence of the collision and event data describing one or more driving events, associated with the collision of the remote vehicle, observed by one or more sensors of an ego vehicle proximate to the collision; receiving modification data, the modification data describing a modification of a vehicle control system of the ego vehicle, wherein the modification data is generated by a vehicular accident reconstruction analysis based on the report data; and modifying an operation of the vehicle control system based on the modification data.\n | 20. 
The computer program product according to claim 19, wherein the vehicle control system comprises an autonomous driving system, the ego vehicle is an autonomous vehicle, and the instructions, when executed by the processor, cause the processor to modify the operation of the vehicle control system based on the modification data at least by: modifying, based on the modification data, a safety process of the autonomous vehicle provided by the autonomous driving system so as to improve the safety of the autonomous vehicle.", "output": "The method (300) involves detecting (301) an occurrence of a collision associated with a remote vehicle. Report data that is associated with the collision of the remote vehicle is generated (303). The report data includes vehicle-to-everything (V2X) data that describes a set of V2X wireless messages received prior to detecting the occurrence of the collision and event data that describes driving events that are observed by the ego vehicle. The report data is transmitted (305) to a server. The modification data that describes a modification for a vehicle control system of the ego vehicle is received (307) from the server. The modification data is generated based on the report data through a vehicular accident reconstruction analysis. An operation of the vehicle control system is modified (309) based on the modification data to improve safety of the ego vehicle. INDEPENDENT CLAIMS are included for the following:a system for modifying vehicle control system of connected vehicle; anda computer program product for modifying vehicle control system of connected vehicle. Method for modifying vehicle control system of connected vehicle based on vehicular accident reconstruction (VAR) analysis. The feedback system and the analysis system cooperate with one another to use V2X data and sensor data recorded by the ego vehicle which is proximate to a crash to provide an easier and more accurate VAR analysis for the crash. 
The system where the ego vehicle is an autonomous vehicle and the computer code, when executed by the processor, causes the processor to modify the operation of the vehicle control system based on the modification data by modifying a safety process of the autonomous vehicle which is provided by the autonomous driving system based on the modification data to increase safety of the autonomous vehicle. The VAR data is analyzed to generate either design data or patch data that describes a modification for an advanced driver assistance system (ADAS) or an autonomous driving system that would have resulted in the collision being avoided or made the collision less likely to occur. The drawing shows a flowchart illustrating the method for modifying vehicle control system of connected vehicle based on result of VAR analysis. 300Method for modifying a vehicle control system of a connected vehicle based on a result of a VAR analysis301Step for detecting an occurrence of a collision associated with a remote vehicle303Step for generating report data associated with the collision of the remote vehicle305Step for transmitting the report data to a server307Step for receiving modification data that describes a modification for a vehicle control system of the ego vehicle from the server309Step for modifying an operation of the vehicle control system based on the modification data to improve safety of the ego vehicle", "instruction": "Please summarize the input" }, { "input": "Correction of a vehicle component based on Vehicle-to-Everything communication|1. A method performed by an ego vehicle, comprising: generating received signal strength (RSS) data describing an RSS value for a first Vehicle-to-Everything (V2X) message that is originated and transmitted by a remote vehicle. 
The method further comprises: generating RSS data for a second V2X message transmitted by the remote vehicle; determining, based on each set of RSS data, a first range from the ego vehicle to the remote vehicle that varies over time; extracting, from the first V2X message and the second V2X message, a set of sensor data reported by the remote vehicle that includes a second range varying over time; determining that the second range is inaccurate by comparing the time-varying first range with the time-varying second range; determining, based on the difference between the time-varying first range and the time-varying second range, that the V2X messages were transmitted by a computer, determined by the ego vehicle, that is transmitting inaccurate sensor data; and modifying an operation of an advanced driver assistance system (ADAS) or an autonomous driving system of the ego vehicle so that the ADAS or the autonomous driving system does not consider the sensor data provided by the remote vehicle.\n | 2. The method described in claim 1, wherein the V2X message is a dedicated short-range communications message.\n | 3. The method described in claim 1 or 2, wherein the V2X messages are one of WiFi messages, 3G messages, 4G messages, 5G messages, Long-Term Evolution (LTE) messages, millimeter wave communications messages, Bluetooth messages, and satellite communications.\n | 4. The method described in any one of claims 1 to 3, wherein the V2X message is received by a V2X radio.\n | 5. The method described in claim 4, wherein the V2X radio is not an element of the ego vehicle.\n | 6. The method described in claim 4, wherein the V2X radio includes a plurality of channels including a reserved channel reserved for receiving the V2X message.\n | 7. 
The method described in claim 6, wherein the reserved channel is reserved for basic safety messages.\n | 8. A system comprising: a Vehicle-to-Everything (V2X) radio operable to receive V2X messages; a non-transitory memory storing computer code; and a processor communicatively coupled to the V2X radio and the non-transitory memory, wherein the computer code, when executed by the processor, causes the processor to: generate received signal strength (RSS) data describing an RSS value for a first V2X message originated by a remote vehicle; generate RSS data for a second V2X message transmitted by the remote vehicle; determine, based on each set of RSS data, a first range from the ego vehicle to the remote vehicle that varies over time; extract, from the first V2X message and the second V2X message, a set of sensor data reported by the remote vehicle that includes a second range varying over time; determine that the second range is inaccurate by comparing the time-varying first range with the time-varying second range; determine, based on the difference between the time-varying first range and the time-varying second range, that the V2X messages were transmitted by a computer, determined by the ego vehicle, that is transmitting inaccurate sensor data; and modify an operation of an advanced driver assistance system (ADAS) or an autonomous driving system of the ego vehicle so that the ADAS or the autonomous driving system does not consider the sensor data provided by the remote vehicle.\n | 9. 
The system described in claim 8, wherein the V2X message is a dedicated short-range communications message.\n | 10. A program that, when executed by a processor, causes the processor to: generate received signal strength (RSS) data describing an RSS value for a first Vehicle-to-Everything (V2X) message originated by a remote vehicle; generate RSS data for a second V2X message transmitted by the remote vehicle; determine, based on each set of RSS data, a first range from the ego vehicle to the remote vehicle that varies over time; extract, from the first V2X message and the second V2X message, a set of sensor data reported by the remote vehicle that includes a second range varying over time; determine that the second range is inaccurate by comparing the time-varying first range with the time-varying second range; determine, based on the difference between the time-varying first range and the time-varying second range, that the V2X messages were transmitted by a computer, determined by the ego vehicle, that is transmitting inaccurate sensor data; and modify an operation of an advanced driver assistance system (ADAS) or an autonomous driving system of the ego vehicle so that the ADAS or the autonomous driving system does not consider the sensor data provided by the remote vehicle.\n | 11. The program described in claim 10, wherein the operation of the ADAS or the autonomous driving system is modified so that the ADAS or the autonomous driving system does not consider the sensor data at the present time.\n | 12. 
The program described in claim 10 or 11, wherein the operation of the ADAS or the autonomous driving system is modified so that the ADAS or the autonomous driving system does not consider future sensor data received from the remote vehicle.\n | 13. The program described in any one of claims 10 to 12, wherein the operation of the ADAS or the autonomous driving system is modified so that the ADAS or the autonomous driving system does not consider past sensor data received from the remote vehicle.", "output": "The method involves generating, by an ego vehicle (123), received signal strength (RSS) data describing an RSS value for a vehicle-to-everything (V2X) message originated by a remote vehicle (124). The range data corresponding to the RSS value is determined, which describes a first range from the ego vehicle to the remote vehicle. The ego vehicle determines that the remote vehicle is providing inaccurate sensor data by comparing the first range to a second range which is described by the sensor data which is extracted from the V2X message. An operation of a vehicle component of the ego vehicle is modified so that the vehicle component does not consider the sensor data that is provided by the remote vehicle. Method executed by ego vehicle for use in vehicle-to-everything communications. The method modifies the operation of the ego vehicle's autonomous driving system so that the inaccurate sensor data transmitted by a misbehaving endpoint is ignored in the future by the ego vehicle so that the inaccurate sensor data does not create a safety hazard. The method identifies misbehaving endpoints that repeatedly transmit inaccurate sensor data and takes steps to reduce any safety hazards that would be created by ego vehicles operating their autonomous driving systems. 
The drawing shows the block diagram of an operating environment for a modification system. 104Computer105Network123Ego vehicle124Remote vehicle199Modification system", "instruction": "Please summarize the input" }, { "input": "Detection of a drowsy driver based on vehicle-to-everything communication|1. A method comprising: determining, by a second connected vehicle, that a second driver of the second connected vehicle is drowsy; transmitting, by the second connected vehicle, a V2X message notifying of the drowsiness of the second driver when the second driver is drowsy and the second connected vehicle is not operating autonomously; receiving, by a first connected vehicle, the V2X message transmitted from the second connected vehicle, and determining whether the first connected vehicle is an autonomous vehicle; responsive to the first connected vehicle not being an autonomous vehicle and to receipt of the V2X message notifying of the drowsiness of the second driver, providing, by the first connected vehicle, a notification to a first driver of the first connected vehicle; and responsive to the first connected vehicle being an autonomous vehicle and to receipt of the V2X message notifying of the drowsiness of the second driver, automatically executing, by the first connected vehicle, an avoidance operation for avoiding the second connected vehicle so that the risk created by the second driver is reduced.\n | 2. The method described in claim 1, wherein the V2X message is a dedicated short-range communications (DSRC) message.\n | 3. The method described in claim 1, wherein the V2X message is one of a WiFi message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE) message, a millimeter wave communications message, a Bluetooth message, and a satellite communication.\n | 4. 
The method described in claim 1, wherein the V2X message is a basic safety message (BSM).\n | 5. The method described in claim 1, wherein the notification is at least one of a visual notification displayed on a head unit and an audio notification.\n | 6. A system comprising: a first processor of a first connected vehicle; a first non-transitory memory communicatively coupled to the first processor and storing first computer code; a second processor of a second connected vehicle; and a second non-transitory memory communicatively coupled to the second processor and storing second computer code; wherein the second computer code, when executed by the second processor, causes the second processor to: determine that a second driver of the second connected vehicle is drowsy; and transmit a V2X message notifying of the drowsiness of the second driver when the second driver is drowsy and the second connected vehicle is not operating autonomously; and wherein the first computer code, when executed by the first processor, causes the first processor to: receive the V2X message transmitted from the second connected vehicle and determine whether the first connected vehicle is an autonomous vehicle; responsive to the first connected vehicle not being an autonomous vehicle, provide a notification to a first driver of the first connected vehicle in response to reception of the V2X message notifying of the drowsiness of the second driver; and responsive to the first connected vehicle being an autonomous vehicle, automatically execute an avoidance operation for avoiding the second connected vehicle in response to receipt of the V2X message notifying of the drowsiness of the second driver, so that the risk created by the second driver is reduced.\n | 7. 
The system described in claim 6, wherein the notification is at least one of a visual notification displayed on a head unit and an audio notification.", "output": "The method involves receiving a Vehicle-to-Everything (V2X) message including digital data describing a path history of a first connected vehicle by a second connected vehicle. Determination is made that a driver of the first connected vehicle is drowsy based on the path history described by the digital data included in the V2X message by the second connected vehicle. A remedial action is executed to modify an operation of the second connected vehicle based on the driver of the first connected vehicle being drowsy by the second connected vehicle such that risk created by the driver is reduced, where the V2X message is selected from a group consisting of a Basic Safety Message, Dedicated Short-Range Communication (DSRC) message, Wi-Fi message, 3G message, 4G message, 5G message, Long-Term Evolution (LTE) message, mm wave communication message, Bluetooth message and a satellite communication. INDEPENDENT CLAIMS are also included for the following:a system for detecting presence of a drowsy driver of a vehicle based on V2X communicationsa computer program product comprising a set of instructions for detecting presence of a drowsy driver of a vehicle based on V2X communications. Method for detecting presence of a drowsy driver of a vehicle based on V2X communications. The method enables a drowsiness detection system to provide a safer driving environment and improve operation of a connected vehicle by assisting the connected vehicle to avoid drowsy drivers and reduce risk caused by drowsy drivers. The drawing shows a flow diagram illustrating a method for modifying operation of a connected vehicle to reduce risk caused by a drowsy driver. 
301Step for transmitting Basic Safety Message303Step for receiving Basic Safety Message305Step for parsing out Basic Safety Message data from Basic Safety Message307Step for analyzing path history data included in Basic Safety Message data308Step for analyzing path history data", "instruction": "Please summarize the input" }, { "input": "Collision avoidance for a connected vehicle based on a digital behavioral twin|1. A collision avoidance method for a connected vehicle based on a digital behavioral twin, comprising: recording, by an ego vehicle, digital data describing a driving context and the driving behavior of a remote vehicle and the ego vehicle; determining, based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle, and the digital data, a risk of a collision involving one or more of the remote vehicle and the ego vehicle; and modifying an operation of the ego vehicle based on the risk; wherein the first digital behavioral twin is a model describing the driving behavior of a remote driver of the remote vehicle in one or more different driving contexts, and the second digital behavioral twin is a model describing the driving behavior of an ego driver of the ego vehicle in one or more different driving contexts.\n | 2. The collision avoidance method described in claim 1, wherein modifying the operation of the ego vehicle includes displaying, by an electronic display of the ego vehicle, a graphical output that visually depicts the risk.\n | 3. The collision avoidance method described in claim 2, wherein the graphical output is an augmented reality (AR) visualization showing the collision risk in each portion of a roadway that is currently being traveled or is about to be traveled.\n | 4. 
The collision avoidance method described in claim 1, further comprising transmitting, by the remote vehicle, a vehicle-to-everything (V2X) message to the ego vehicle, wherein the V2X message includes remote twin data describing the first digital behavioral twin.\n | 5. The collision avoidance method described in claim 1, wherein the one or more different driving contexts are based on a pattern of the remote driver accelerating or decelerating in response to a change of a traffic signal.\n | 6. The collision avoidance method described in claim 1, wherein the ego vehicle is an autonomous vehicle.\n | 7. The collision avoidance method described in claim 6, wherein modifying the operation of the ego vehicle includes the ego vehicle autonomously taking an action to mitigate the risk.\n | 8. The collision avoidance method described in claim 1, further comprising modifying, by the ego vehicle, the first digital behavioral twin based on new digital data describing a different driving behavior of the remote vehicle in the driving context, wherein the first digital behavioral twin is modified to include the different driving behavior.\n | 9. A collision avoidance system for a connected vehicle based on a digital behavioral twin, comprising: a non-transitory memory storing digital data describing a driving context and the driving behavior of a remote vehicle and an ego vehicle; and a processor communicatively coupled to the non-transitory memory; wherein a risk of a collision involving one or more of the remote vehicle and the ego vehicle is determined based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle, and the digital data, and an operation of the ego vehicle is modified based on the risk. 
Computer code stored in the non-transitory memory, when executed by the processor, causes the processor to perform these operations; the first digital behavioral twin is a model describing the driving behavior of a remote driver of the remote vehicle in one or more different driving contexts, and the second digital behavioral twin is a model describing the driving behavior of an ego driver of the ego vehicle in one or more different driving contexts.\n | 10. The collision avoidance system described in claim 9, wherein modifying the operation of the ego vehicle includes displaying, by an electronic display of the ego vehicle, a graphical output that visually depicts the risk.\n | 11. The collision avoidance system described in claim 10, wherein the graphical output is an augmented reality (AR) visualization showing the collision risk in each portion of a roadway that is currently being traveled or is about to be traveled, using correspondingly colored regions.\n | 12. The collision avoidance system described in claim 9, wherein the first digital behavioral twin is received by the ego vehicle and is described by remote twin data received via a vehicle-to-everything (V2X) message transmitted by the remote vehicle.\n | 13. The collision avoidance system described in claim 9, wherein the non-transitory memory stores additional computer code that, when executed by the processor, causes the processor to modify the first digital behavioral twin based on new digital data describing a different driving behavior of the remote vehicle in the driving context, wherein the first digital behavioral twin is modified to include the different driving behavior.\n | 14. 
A computer program that, when executed by a processor, causes the processor to: record digital data describing a driving context and the driving behaviors of the remote vehicle and the ego vehicle in the operating condition; determine, based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle, and the digital data, a risk of a collision involving one or more of the remote vehicle and the ego vehicle; and modify the operation of the ego vehicle based on the risk. The first digital behavioral twin is a model describing the driving behavior of the remote driver of the remote vehicle in one or more different operating conditions, and the second digital behavioral twin is a model describing the driving behavior of the ego driver of the ego vehicle in one or more different operating conditions.", "output": "The method involves recording digital data describing a driving context and a driving behavior of a remote vehicle and an ego vehicle in this driving context. A risk of a collision involving the remote vehicle and the ego vehicle is determined based on a first digital behavioral twin of the remote vehicle, a second digital behavioral twin of the ego vehicle and the digital data. An operation of the ego vehicle is modified based on the risk. INDEPENDENT CLAIMS are included for the following: a system for providing digital twin service for real-world vehicle; and a computer program product providing digital twin service for real-world vehicle. Method for providing digital twin service for real-world vehicle. The twin client operates quickly and gives ego drivers more time to respond to dangerous situations, which increases driver safety for the ego driver and other drivers on the roadway. The digital behavioral twin system warns the ego driver before dangerous actions of other remote drivers are taken, which provides the ego driver more time to avoid collision.
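The twin-based risk determination summarized here — use each driver's behavioral twin to project motion in the current driving context, then test for an implied collision — can be sketched as follows. This is an illustrative sketch, not the patented implementation; the twin representation, the yellow-light context, and the 3-second horizon are assumptions.

```python
from dataclasses import dataclass

@dataclass
class BehavioralTwin:
    """Maps a driving context (e.g. a traffic-signal change) to the
    predicted acceleration (m/s^2) of that driver in that context."""
    behavior: dict

    def predict_accel(self, context):
        return self.behavior.get(context, 0.0)

def collision_risk(gap_m, ego_speed, remote_speed,
                   ego_twin, remote_twin, context, horizon_s=3.0):
    """Project both vehicles over a short horizon using their twins and
    report whether the following remote vehicle closes the gap."""
    ego_v = ego_speed + ego_twin.predict_accel(context) * horizon_s
    rem_v = remote_speed + remote_twin.predict_accel(context) * horizon_s
    # Distance the remote vehicle gains on the ego vehicle over the horizon
    # (average-speed difference times time).
    closing = ((remote_speed + rem_v) / 2 - (ego_speed + ego_v) / 2) * horizon_s
    return closing >= gap_m

# A remote driver whose twin predicts hard acceleration on a yellow light,
# behind an ego driver whose twin predicts braking:
remote = BehavioralTwin({"yellow_light": 2.5})
ego = BehavioralTwin({"yellow_light": -1.0})
print(collision_risk(10.0, 12.0, 12.0, ego, remote, "yellow_light"))  # → True
```

A real twin would be learned from recorded digital data rather than hand-written, but the decision step — twins plus current data in, risk out — is the same shape.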
The augmented reality (AR) visualization provided by the twin client reduces mental fatigue for the ego driver by visualizing risks as more intuitive safe/unsafe regions, which are processed by the ego driver subconsciously. The driver must enable the automated system only when it is safe to do so. When an image is projected at the same three-dimensional position as the real object, the driver does not need to adjust eye focus to view the projected image, making the projected image easy to grasp while looking at the real object. The drawing shows a block diagram illustrating an operating environment for a digital behavioral twin system and a twin client. 100 Operating environment; 105 Network; 107 Digital twin server; 127A, 127B Memory; 199 Digital behavioral twin system", "instruction": "Please summarize the input" }, { "input": "The traveling control system of an autonomous vehicle, a server device, and an autonomous vehiclePROBLEM TO BE SOLVED: To allow multiple autonomous vehicles to travel efficiently when they are traveling mixed together in the same lane. \n\nSOLUTION: A travel control system for autonomous vehicles that controls multiple autonomous vehicles traveling autonomously in accordance with a predetermined operation command includes: priority order setting means for setting a priority order among the multiple autonomous vehicles; and travel control means for controlling the traveling of the multiple autonomous vehicles so that, when the multiple autonomous vehicles are traveling mixed together in the same lane, an autonomous vehicle given a higher priority order by the priority order setting means travels preferentially over an autonomous vehicle given a lower priority order. \n\nSELECTED DRAWING: Figure 6|1.
A traveling control system for autonomous vehicles, which controls multiple autonomous vehicles traveling autonomously in accordance with a predetermined operation command, comprising: \n priority setting means for setting priorities among the multiple autonomous vehicles; and \n travel control means for controlling, when the multiple autonomous vehicles travel mixed together in the same lane, the traveling of the multiple autonomous vehicles so that an autonomous vehicle given a high priority by the priority setting means travels preferentially over an autonomous vehicle given a low priority by the priority setting means. \n | 2. The priority setting means sets a higher priority for an autonomous vehicle that charges its user a high fee than for an autonomous vehicle that charges its user a low fee; the traveling control system of claim 1. \n | 3. The priority setting means sets a higher priority for an autonomous vehicle with a high travel speed than for an autonomous vehicle with a low travel speed; the traveling control system of claim 1. \n | 4.
A server device that controls multiple autonomous vehicles traveling autonomously in accordance with a predetermined operation command, comprising: \n priority setting means for setting priorities among the multiple autonomous vehicles; and \n command means for issuing, when multiple autonomous vehicles traveling in the same lane are detected, operation commands to the multiple autonomous vehicles so that an autonomous vehicle given a high priority by the priority setting means travels preferentially over an autonomous vehicle given a low priority by the priority setting means. \n | 5. An autonomous vehicle that travels autonomously in accordance with a predetermined operation command, comprising: \n following-vehicle detection means for detecting another autonomous vehicle that follows the own vehicle in the same lane as the own vehicle; \n acquisition means for acquiring, when the other autonomous vehicle is detected by the following-vehicle detection means, information regarding the priority of the other autonomous vehicle through vehicle-to-vehicle communication with the other autonomous vehicle; and \n travel control means for controlling the traveling of the own vehicle so as to give priority to the traveling of the other autonomous vehicle over the own vehicle if the priority of the other autonomous vehicle acquired by the acquisition means is higher than that of the own vehicle.", "output": "The traveling control system comprises a controller having at least one processor.
The controller configured to set priorities (S14) among multiple autonomous traveling vehicles (100A,100B) controls the traveling of the multiple autonomous traveling vehicles so that the autonomous traveling vehicle having a priority set to be high travels preferentially as compared with the autonomous traveling vehicle having the priority set to be low, if the multiple autonomous traveling vehicles travel on an identical lane in a mixed manner. An INDEPENDENT CLAIM is included for a server apparatus for controlling multiple autonomous traveling vehicles. Traveling control system for controlling autonomous traveling vehicle (Claimed). The traveling control system for controlling autonomous traveling vehicles suppresses situations in which the subject vehicle disturbs the smooth traveling of the following autonomous traveling vehicle. The system allows the first autonomous traveling vehicle and the second autonomous traveling vehicle to travel in accordance with their respective needs. The drawing shows a flowchart for the flow of the data and the process performed between the respective constitutive components of the moving body system. 100A,100B Autonomous traveling vehicles; 200 Server apparatus; S10 Position information; S14 Set priorities; S16 Departure command", "instruction": "Please summarize the input" }, { "input": "V2X receiver-oriented V2X full-duplex position pinpointing assistancePROBLEM TO BE SOLVED: To accurately localize a transmission source of a V2X message. \n\nSOLUTION: The disclosure describes embodiments for modifying an operation of a vehicle component based on a Vehicle-to-Everything (V2X) communication. A method includes receiving, by a remote vehicle, a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requests so that the remote vehicle identifies the ego vehicle as a transmitter of a V2X message.
The method includes transmitting, by the remote vehicle, a response message including response data that describes the additional identifying information of the ego vehicle that provides the remote vehicle with the capacity to identify the ego vehicle as the transmitter of the V2X message when the ego vehicle is among a plurality of vehicles. The method includes receiving, by the remote vehicle, the V2X message that includes assistance data describing the additional identifying information. The method includes modifying, by the remote vehicle, the operation of the vehicle component of the remote vehicle based on the assistance data. \n\nSELECTED DRAWING: Figure 1A|1. A method of modifying an operation of a vehicle component based on vehicle-to-everything (V2X) communication, comprising: \n receiving, by a remote vehicle, a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requires in order to identify the ego vehicle as the transmitter of a V2X message; \n transmitting, by the remote vehicle, a response message including response data that describes the additional identifying information of the ego vehicle, the response data providing the remote vehicle with the capability to identify the ego vehicle as the transmitter of the V2X message when the ego vehicle is among a plurality of other vehicles; \n receiving, by the remote vehicle, the V2X message including assistance data that describes the additional identifying information; and \n modifying, by the remote vehicle, the operation of the vehicle component of the remote vehicle based on the assistance data. \n | 2. The V2X message is a DSRC (dedicated short-range communications) message; the method of claim 1.
\n | 3. The V2X message is not any of a Wi-Fi message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, or a satellite communication; the method of claim 1. \n | 4. The V2X message contains payload data describing the ego vehicle; the assistance data helps the remote vehicle determine, with an acceptable level of accuracy, which vehicle is described by the payload data; the operation of the vehicle component is modified based on the payload data; the method of claim 1. \n | 5. The assistance data describes the position of the ego vehicle with an accuracy of roughly half the width of the road on which the ego vehicle is traveling; the method of claim 1. \n | 6. The request message is broadcast by the ego vehicle; the response message is unicast by the remote vehicle; the V2X message is broadcast by the ego vehicle; the method of claim 1. \n | 7. The remote vehicle is an autonomous vehicle; the method of claim 1. \n | 8.
A system for modifying an operation of a vehicle component based on V2X communication, the system being included in a remote vehicle and comprising: \n a processor; and \n a non-transitory memory communicably coupled to the processor, the non-transitory memory storing computer code that, when executed by the processor, causes the processor to perform: \n receiving, by the remote vehicle, a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requires in order to identify the ego vehicle as the transmitter of a V2X message; \n transmitting, by the remote vehicle, a response message including response data that describes the additional identifying information of the ego vehicle, the response data providing the remote vehicle with the capability to identify the ego vehicle as the transmitter of the V2X message when the ego vehicle is among a plurality of other vehicles; \n receiving, by the remote vehicle, the V2X message including assistance data that describes the additional identifying information; and \n modifying, by the remote vehicle, the operation of the vehicle component of the remote vehicle based on the assistance data. \n | 9. The V2X message is a DSRC (dedicated short-range communications) message; the system of claim 8. \n | 10. The V2X message is not any of a Wi-Fi message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, or a satellite communication; the system of claim 8.
\n | 11. The V2X message contains payload data describing the ego vehicle; the assistance data helps the remote vehicle determine, with an acceptable level of accuracy, which vehicle is described by the payload data; the operation of the vehicle component is modified based on the payload data; the system of claim 8. \n | 12. The assistance data describes the position of the ego vehicle with an accuracy of roughly half the width of the road on which the ego vehicle is traveling; the system of claim 8. \n | 13. The request message is broadcast by the ego vehicle; the response message is unicast by the remote vehicle; the V2X message is broadcast by the ego vehicle; the system of claim 8. \n | 14. The remote vehicle is an autonomous vehicle; the system of claim 8. \n | 15. A computer program product operable to modify an operation of a vehicle component based on V2X communication, comprising instructions that, when executed by a processor of a remote vehicle, cause the processor to perform: \n an operation of receiving, by the remote vehicle, a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requires in order to identify the ego vehicle as the transmitter of a V2X message; \n an operation of transmitting, by the remote vehicle, a response message including response data that describes the additional identifying information of the ego vehicle, the response data providing the remote vehicle with the capability to identify the ego vehicle as the transmitter of the V2X message when the ego vehicle is among a plurality of other vehicles; \n an operation of receiving, by the remote vehicle, the V2X message which contains
the assistance data describing the additional identifying information; and \n an operation of modifying, by the remote vehicle, the operation of the vehicle component of the remote vehicle based on the assistance data. \n | 16. The V2X message is a DSRC (dedicated short-range communications) message; the computer program product of claim 15. \n | 17. The V2X message is not any of a Wi-Fi message, a 3G message, a 4G message, a 5G message, a Long Term Evolution (LTE) message, a millimeter-wave communication message, a Bluetooth message, or a satellite communication; the computer program product of claim 15. \n | 18. The V2X message contains payload data describing the ego vehicle; the assistance data helps the remote vehicle determine, with an acceptable level of accuracy, which vehicle is described by the payload data; the operation of the vehicle component is modified based on the payload data; the computer program product of claim 15. \n | 19. The assistance data describes the position of the ego vehicle with an accuracy of roughly half the width of the road on which the ego vehicle is traveling; the computer program product of claim 15. \n | 20. The request message is broadcast by the ego vehicle; the response message is unicast by the remote vehicle; the V2X message is broadcast by the ego vehicle; the computer program product of claim 15.", "output": "The method (300) involves receiving (307) a request message inquiring what additional identifying information of an ego vehicle the remote vehicle requests so that the remote vehicle has a capacity to identify the ego vehicle as a transmitter of a V2X message.
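The request/response/assistance exchange recited in the claims above — broadcast request, unicast response naming the needed fields, broadcast V2X message carrying the assistance data — can be sketched as follows; the message shape and field names are illustrative assumptions, not the patent's wire format.

```python
def v2x_handshake(remote_needed, ego_payload):
    """Sketch of the claimed exchange: the ego vehicle broadcasts a request
    asking what identifying information is needed; the remote vehicle
    unicasts a response naming the fields; the ego vehicle then broadcasts
    a V2X message whose assistance data carries exactly those fields.
    Field names here are illustrative assumptions."""
    response = {"needed_fields": remote_needed}        # unicast by the remote vehicle
    assistance = {f: ego_payload[f]                    # broadcast by the ego vehicle
                  for f in response["needed_fields"] if f in ego_payload}
    return assistance                                  # used to identify the transmitter

payload = {"gps": (35.68, 139.76), "color": "blue", "speed_mps": 13.4}
print(v2x_handshake(["gps", "color"], payload))
# → {'gps': (35.68, 139.76), 'color': 'blue'}
```

The point of the extra round trip is that the assistance data is tailored to exactly what the receiver says it needs, rather than a fixed payload.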
A response message which includes response data that describes the additional identifying information of the ego vehicle that provides the remote vehicle with the capacity to identify the ego vehicle as the transmitter of the V2X message is transmitted by the remote vehicle. The V2X message that includes assistance data describing the additional identifying information is received. The operation of the vehicle component of the remote vehicle is modified by the remote vehicle based on the assistance data. INDEPENDENT CLAIMS are included for the following: a system for modifying operation of vehicle component based on vehicle-to-everything communication; and a computer program product for modifying operation of vehicle. Method for modifying operation of vehicle component of ego vehicle or remote vehicle based on vehicle-to-everything (V2X) communication. The improved ability to identify the transmitter of different dedicated short-range communication (DSRC) messages improves the performance of connected ADAS systems and autonomous driving systems. The driver safely turns attention away from driving tasks and is prepared to take control of the autonomous vehicle when needed. The drawing shows a flow chart illustrating the method for modifying operation of vehicle. 300 Method for modifying operation of vehicle component; 301 Step for recording the sensor data; 302 Step for analyzing the sensor data; 305 Step for broadcasting request message; 307 Step for receiving request message", "instruction": "Please summarize the input" }, { "input": "SYSTEMS AND METHODS FOR ACTIVE ROAD SURFACE MAINTENANCE WITH CLOUD-BASED MOBILITY DIGITAL TWINAn active road surface maintenance system and method developed for connected vehicles with the aid of a mobility digital twin (MDT) framework. A method performed in a cloud-based digital space includes receiving data regarding a physical object from a physical space connected to a vehicle.
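The road-maintenance claims that follow attribute the filtering of noisy sensor data to a Kalman filter-based fusion (claim 13). A minimal one-dimensional Kalman filter conveys the idea; the process and measurement variances below are arbitrary illustrative choices, not values from the patent.

```python
def kalman_1d(measurements, process_var=1e-4, meas_var=0.05):
    """Minimal 1-D Kalman filter: fuse a stream of noisy scalar readings
    (e.g. a road-surface condition score) into a smoothed estimate."""
    x, p = measurements[0], 1.0      # initial state estimate and variance
    estimates = [x]
    for z in measurements[1:]:
        p += process_var             # predict: state assumed roughly constant
        k = p / (p + meas_var)       # update: Kalman gain
        x += k * (z - x)             # blend prediction with the new reading
        p *= (1 - k)
        estimates.append(x)
    return estimates

noisy = [0.80, 0.95, 0.70, 0.85, 0.78]
print([round(e, 2) for e in kalman_1d(noisy)])
```

A production system would fuse multi-sensor, multi-dimensional state (position, surface condition, temperature), but the predict/update loop is the same.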
The method also includes processing the data using machine learning to model road surface conditions, in which respective penalty values are assigned to corresponding road surfaces, a respective penalty value being higher the lower the condition of the corresponding road surface. The method also includes deriving instructions based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory. The method further includes transmitting the instructions to the physical space connected to the vehicle to guide actuation of the vehicle. What is claimed is: \n | 1. A method performed in a cloud-based digital space, comprising: \nreceiving data regarding a physical object from a physical space within which a vehicle is operating; \nprocessing the data using machine learning to model road surface conditions, in which respective penalty values are assigned to corresponding road surfaces, a respective penalty value being inversely related to a condition of the corresponding road surface; \nderiving instructions based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory; and \ntransmitting the instructions to the physical space connected to the vehicle to guide actuation of the vehicle. \n | 2. The method of claim 1, wherein the processing of the data further comprises storing the data in a data lake.\n | 3. The method of claim 2, wherein the data lake further comprises stored historical data.\n | 4. The method of claim 3, wherein the processing of the data includes processing of the stored historical data in addition to the stored data received from the physical space.\n | 5. The method of claim 1, wherein the physical object comprises at least one of a vehicle, a human, and a traffic device.\n | 6. The method of claim 1, wherein the data is collected by one or more sensors communicating with the physical object.\n | 7.
The method of claim 6, wherein the collected data is real-time information relating to one or more of the following: road surfaces, traffic flow, weather, ego vehicle, perception of neighboring vehicle, or occupant of ego vehicle.\n | 8. The method of claim 1, wherein the received data is obtained from one or more monitoring devices associated with the physical object and/or from one or more vehicle-to-anything (V2X) communications regarding the physical object.\n | 9. The method of claim 1, further comprising effecting the actuation of the vehicle along the trajectory when the vehicle is an autonomous vehicle, or prompting a human driver of the vehicle to drive along the trajectory when the vehicle is operated by the human driver.\n | 10. The method of claim 1, further comprising: \nwhen the vehicle is operated by a human driver, displaying each respective lane of the road surfaces along the trajectory with an indicator that indicates the road surface condition of the respective lane. \n | 11. The method of claim 1, wherein the processing further comprises processing the data using machine learning and historical data to model the road surface conditions and predict future road surface conditions, and using the predicted future road surface conditions to target road surfaces for maintenance.\n | 12. The method of claim 1, further comprising applying a fusion process to the received data and filtering out noisy data.\n | 13. The method of claim 12, wherein the fusion process is effected using a Kalman filter-based sensor fusion algorithm.\n | 14. 
A cloud-based system effectuating an end-to-end framework, comprising: \na cloud-based platform hosting one or more digital twins corresponding to one or more physical objects from a physical space within which a vehicle is operating, wherein one of the digital twins comprises a data lake and an active road maintenance microservice; \na communications layer communicatively connecting the one or more digital twins to the one or more physical objects, wherein: \nthe communications layer transmits data regarding the one or more physical objects to at least the one or more corresponding digital twins, and \nthe communications layer transmits instructions that have been derived from processing of the transmitted data by at least the active road maintenance microservice to the physical space connected to the vehicle; and \nwherein the active road maintenance microservice: \nprocesses the data using machine learning to model road surface conditions, in which a rewards function assigns respective rewards values to corresponding road surfaces, a respective rewards value corresponding to a condition of the corresponding road surface, and \nderives the instructions based on the modeled road surface conditions and on optimizing the rewards function to guide actuation of the vehicle along a trajectory. \n | 15. The cloud-based system of claim 14, wherein the one or more physical objects comprise at least one of a vehicle, a human, and a traffic device.\n | 16. The cloud-based system of claim 14, wherein the data lake further comprises stored historical data, and wherein the processing of the data further comprises storing the transmitted data in the data lake and processing the stored historical data in addition to the stored transmitted data.\n | 17. 
The cloud-based system of claim 14, \nwherein the data is collected by one or more sensors communicating with the one or more physical objects, \nwherein the sensors are equipped on one or more of roads, vehicles, vehicle occupants, or pedestrians, and \nwherein the sensors are one or more of perception sensor, ultrasonic sensor, camera, radar, LIDAR, in-pavement surface temperature and condition sensor, in-pavement surface chemical and concentration sensor, wearable device, road surface sensor, in-cabin sensor, mobile app, loop detector, condition sensor. \n | 18. The cloud-based system of claim 14, further comprising when the vehicle is a non-autonomous vehicle, displaying each respective lane of the road surfaces along the trajectory with an indicator that indicates the road surface condition of the respective lane.\n | 19. A method performed in a cloud-based system effectuating an end-to-end framework, comprising: \nin a digital space: \nreceiving data regarding a physical object from a physical space within which a vehicle is operating; \nprocessing the data using machine learning to model road surface conditions, including using a rewards function to assign respective rewards values to corresponding road surfaces, a respective rewards value being related to a condition of the corresponding road surface; \nderiving instructions based on optimizing the rewards function to guide actuation of the vehicle along a trajectory; \ntransmitting the instructions to the physical space connected to the vehicle to guide actuation of the vehicle; \nin the physical space: \nreceiving the transmitted instructions; \ndetermining whether the vehicle is an autonomous vehicle or a non-autonomous vehicle; and \nwhen the vehicle is an autonomous vehicle navigating the vehicle along the trajectory using the instructions, or when the vehicle is a non-autonomous vehicle prompting a human driver to navigate the vehicle along with trajectory using the instructions. \n | 20. 
The method of claim 19, further comprising: \nwhen the vehicle is a non-autonomous vehicle, displaying each respective lane of the road surfaces along the trajectory with an indicator that indicates the road surface condition of the respective lane.", "output": "The method (500) involves receiving data regarding a physical object from a physical space within which a vehicle i.e. car is operated (502), where the physical object comprises at least one vehicle, a human and a traffic device. The data is processed (504) using machine learning to model road surface conditions in which respective penalty values are assigned to corresponding road surfaces, where the respective penalty value is inversely related to a condition of corresponding road surface. Instructions are derived (506) based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory. The instructions are transmitted (508) to the physical space connected to the vehicle to guide actuation of the vehicle. The data is stored in a data lake, where the data lake further comprises stored historical data. Fusion process is effected by using a Kalman filter-based sensor fusion algorithm. INDEPENDENT CLAIMS are included for:(1) a cloud-based system effectuating an end-to-end framework;and(2) a method for performing active road surface maintenance in a cloud-based digital space for connected vehicles with aid of a mobility digital twin framework. Method for performing active road surface maintenance in cloud-based digital space for connected vehicles i.e. car with aid of mobility digital twin framework i.e. end-to-end framework. The method enables generating the guidance information by road digital twins and sent back to the connected vehicles in the real world, thus assisting the autonomous vehicles or the human drivers to drive in a certain way to avoid excessive loads on certain areas of the road surface in an effective manner. 
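The penalty assignment this summary describes (a penalty inversely related to surface condition) can be illustrated with a small lane-selection sketch; the linear mapping `penalty = 1 - condition` and the lane names are assumptions for illustration.

```python
def choose_lane(lane_conditions):
    """Pick the lane whose modeled road surface condition incurs the
    smallest penalty. As in the claims, the penalty is inversely related
    to the condition; the linear mapping used here is an assumption."""
    penalties = {lane: 1.0 - cond for lane, cond in lane_conditions.items()}
    return min(penalties, key=penalties.get)

# Modeled surface conditions per lane (1.0 = pristine, 0.0 = badly worn):
print(choose_lane({"left": 0.55, "center": 0.90, "right": 0.30}))  # → center
```

For an autonomous vehicle the chosen lane feeds the trajectory planner directly; for a human driver the same per-lane values would drive the colored lane indicators the claims describe.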
The drawing shows a flow diagram illustrating a method for performing active road surface maintenance in cloud-based digital space for connected vehicles with aid of a mobility digital twin framework. 500 Method for performing active road surface maintenance in cloud-based digital space for connected vehicles with aid of a mobility digital twin framework; 502 Method for receiving data regarding a physical object from a physical space; 504 Method for processing data using machine learning to model road surface conditions; 506 Method for deriving instructions based on the modeled road surface conditions and the respective penalty values to guide actuation of the vehicle along a trajectory; 508 Method for transmitting instructions to the physical space connected to the vehicle to guide actuation of the vehicle", "instruction": "Please summarize the input" }, { "input": "Controller and autonomous driving vehicleA controller for transmitting control information necessary for autonomous driving to an autonomous driving vehicle is provided. The controller includes: a reception device configured to receive a signal transmitted from a wireless communication device included in the autonomous driving vehicle; a first transmission device configured to transmit first control information to the wireless communication device included in the autonomous driving vehicle; and a second transmission device configured to transmit information to a satellite based on a reception condition of the signal from the autonomous driving vehicle received by the reception device, the information being used by the satellite to transmit second control information to a satellite communication device included in the autonomous driving vehicle. What is claimed is: \n | 1.
A controller for transmitting control information necessary for autonomous driving to an autonomous driving vehicle, comprising: \na receiver configured to receive a signal transmitted from a wireless communication device of the autonomous driving vehicle; \na first transmitter configured to transmit, when the signal is not interrupted, first control information to the wireless communication device of the autonomous driving vehicle, the first control information including a speed limit; and \na second transmitter configured to transmit, when the signal is interrupted, information to a satellite that is used by the satellite to transmit second control information configured to control a component of the autonomous driving vehicle to limit speed of the autonomous driving vehicle, from the satellite directly to a satellite receiver of the autonomous driving vehicle, the second control information not including the speed limit and being smaller in data volume than the first control information. \n | 2. The controller according to claim 1, wherein: \nthe first control information is generated based on the signal from the autonomous driving vehicle received by the receiver; and \nthe second control information is generated based on a signal from a vehicle other than the autonomous driving vehicle received by the receiver. \n | 3. 
A controller for transmitting control information necessary for autonomous driving to an autonomous driving vehicle, comprising: \na receiver configured to receive a signal transmitted from a wireless communication device of the autonomous driving vehicle; \na first transmitter configured to transmit, when the signal is not interrupted, first control information to the wireless communication device of the autonomous driving vehicle, the first control information including a speed limit; and \na second transmitter configured to transmit, when the signal is interrupted, information to another vehicle that is used by the other vehicle to transmit second control information configured to control a component of the autonomous driving vehicle to limit speed of the autonomous driving vehicle, to a vehicle-to-vehicle communication device of the autonomous driving vehicle, the second control information not including the speed limit and being smaller in data volume than the first control information. \n | 4. An autonomous driving vehicle, comprising: \na transmitter configured to transmit a signal to a controller; \na receiver configured to receive first control information from the controller when the signal is not interrupted, the first control information including a speed limit; and \na satellite receiver configured to receive second control information from a satellite when the signal is interrupted, the second control information being control instructions configured to control a component of the autonomous driving vehicle to limit speed of the autonomous driving vehicle, the second control information not including the speed limit and being smaller in data volume than the first control information. \n | 5. The autonomous driving vehicle according to claim 4, wherein the satellite receiver receives the second control information based on failure of transmission of the signal to the controller with use of the transmitter.\n | 6. 
The autonomous driving vehicle according to claim 4, further comprising a first controller configured to generate the second control information based on the first control information.", "output": "The controller for transmitting control information necessary for autonomous driving vehicle (10), comprises reception device to receive a signal transmitted from a wireless communication device (50) included in the autonomous driving vehicle. A first transmission device transmits first control information to the wireless communication device included in the autonomous driving vehicle. A second transmission device transmits information to satellite based on a reception condition of the signal from the autonomous driving vehicle received by the reception device. The information is used by the satellite to transmit second control information to a satellite communication device included in the autonomous driving vehicle. The automated driving control system determines whether or not autonomous driving is possible based on the acquired location from estimation unit (24). An INDEPENDENT CLAIM is included for: an autonomous driving vehicle. Controller for transmitting control information necessary for autonomous driving vehicle even when a communication failure occurs. The Controller allows continuous autonomous driving even when a communication failure occurs. The reception condition from the autonomous driving vehicle may include interruption of the signal, reception of an urgent signal indicating abnormality from the autonomous driving vehicle, and reception of a signal generated in abnormal situations. When there is no information sufficient enough to estimate the vehicle speed of the autonomous driving vehicle, the control device of the control center generates a control instruction for simply shifting to a constant speed low enough to observe the speed limit, and may transmit the control instruction to the communication satellite as second control information. 
When the autonomous driving vehicle fails to transmit the location information, the controller switches to second control information received from the satellite communication device for temporary location identification. The drawing shows a block diagram of a Controller for transmitting control information necessary for autonomous driving vehicle.10Autonomous driving vehicle24Location from estimation unit50Wireless communication device60Input-output device", "instruction": "Please summarize the input" }, { "input": "Vehicle-to-everything communication-based lane change collision avoidance warningThe disclosure describes embodiments for modifying whether an ego vehicle changes lanes to a target lane at a target time based on a payload of a Vehicle-to-Everything (V2X) message originated by a remote vehicle. In some embodiments, a method includes determining, based on the payload, whether the remote vehicle is changing lanes to the target lane at the target time. The method includes determining that the ego vehicle is changing lanes to the target lane at approximately the target time. The method includes estimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time. The method includes modifying an operation of a vehicle component of the ego vehicle so that the ego vehicle does not change lanes to the target lane at the target time.What is claimed is: \n | 1. 
A method executed by an ego vehicle that includes an autonomous driving system, the method comprising: \ndetermining, based on remote data that describes sensor measurements for a remote vehicle that are described in a Vehicle-to-Everything (V2X) message originated by the remote vehicle, that the remote vehicle is changing lanes to a target lane at a target time, wherein the remote data includes one or more of a turning direction for a turn signal of the remote vehicle when the turn signal is engaged and a change in a steering wheel angle of the remote vehicle over time; \ndetermining that the ego vehicle is changing lanes to the target lane at the target time; \nestimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time; and \nmodifying, with the autonomous driving system, a steering wheel of the ego vehicle so that the ego vehicle does not steer into the target lane at the target time wherein this modifying is based on the determination that the ego vehicle is changing lanes to the target lane at the target time. \n | 2. The method of claim 1, further comprising: \ngenerating warning data based on estimating that the ego vehicle and the remote vehicle will collide; and \ninputting the warning data to the autonomous driving system; \nwherein modifying the steering wheel is responsive to the warning data being input to the autonomous driving system and based on the determination that the ego vehicle is changing lanes to the target lane at the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time. \n | 3. The method of claim 2, wherein the warning data includes digital data that is operable, when inputted to an actuator of the autonomous driving system, to cause the actuator to provide a counter-steering force on the steering wheel.\n | 4. The method of claim 1, wherein the V2X message is received by a V2X radio.\n | 5. 
The method of claim 1, wherein the remote data further includes a location of the remote vehicle that is accurate to within 1.5 meters.\n | 6. The method of claim 4, wherein the V2X radio includes a plurality of channels including a first reserved channel that is reserved for receiving the V2X message.\n | 7. The method of claim 6, wherein the first reserved channel is reserved for a Basic Safety Message and a second reserved channel is reserved for receiving a Pedestrian Safety Message.\n | 8. The method of claim 1, wherein the V2X message is a basic safety message and the sensor measurements include a speed, a heading, and the steering wheel angle of the remote vehicle.\n | 9. A system included in an ego vehicle, the system comprising: \na processor communicatively coupled to an autonomous driving system, a Vehicle-to-Everything (V2X) radio, and a non-transitory memory, wherein the V2X radio is operable to receive a V2X message and the non-transitory memory stores computer code that is operable, when executed by the processor, to cause the processor to: \ndetermine, based on remote data that describes sensor measurements for a remote vehicle that are described in a Vehicle-to-Everything (V2X) message originated by the remote vehicle, that the remote vehicle is changing lanes to a target lane at a target time, wherein the remote data includes one or more of a turning direction for a turn signal of the remote vehicle when the turn signal is engaged or a change in a steering wheel angle of the remote vehicle over time; \ndetermine that the ego vehicle is changing lanes to the target lane at the target time; \nestimate that the ego vehicle and the remote vehicle will collide at the target lane at the target time; and \nmodify, with the autonomous driving system, a steering wheel of the ego vehicle so that the ego vehicle does not change lanes to the target lane at the target time wherein this modification is based on the determination that the ego vehicle is changing 
lanes to the target lane at the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time. \n | 10. The system of claim 9, wherein the computer code is further operable to cause an audio system to generate an auditory warning.\n | 11. The system of claim 9, wherein the computer code is further operable to cause the processor to: \ngenerating warning data based on estimating that the ego vehicle and the remote vehicle will collide; and \ninputting the warning data to the autonomous driving system; \nwherein modifying the steering wheel is responsive to the warning data being input to the autonomous driving system and based on the determination that the ego vehicle is changing lanes to the target lane at approximately the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time. \n | 12. The system of claim 9, wherein the V2X message is received by a V2X radio.\n | 13. The system of claim 11, wherein the warning data includes digital data that is operable, when inputted to an actuator of the autonomous driving system, to cause the actuator to provide a counter-steering force on the steering wheel.\n | 14. The system of claim 12, wherein the V2X radio includes a plurality of channels including a first reserved channel that is reserved for receiving the V2X message.\n | 15. The system of claim 14, wherein the first reserved channel is reserved for a Basic Safety Message and a second reserved channel is reserved for receiving a Pedestrian Safety Message.\n | 16. The system of claim 9, wherein the V2X message is a basic safety message and the sensor measurements include a speed, a heading, and the steering wheel angle of the remote vehicle.\n | 17. 
A computer program product comprising instructions that, when executed by a processor of an ego vehicle including an autonomous driving system, causes the processor to perform operations comprising: \ndetermining, based on remote data that describes sensor measurements for a remote vehicle that are described in a Vehicle-to-Everything (V2X) message originated by the remote vehicle, that the remote vehicle is changing lanes to a target lane at a target time, wherein the remote data includes one or more of a turning direction for a turn signal of the remote vehicle when the turn signal is engaged or a change in a steering wheel angle of the remote vehicle over time; \ndetermining that the ego vehicle is changing lanes to the target lane at the target time; \nestimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time; and \nmodifying, with the autonomous driving system, a steering wheel of the ego vehicle so that the ego vehicle does not change lanes to the target lane at the target time wherein the modifying is based on the determination that the ego vehicle is changing lanes to the target lane at the target time and the estimating that the ego vehicle and the remote vehicle will collide at the target lane at the target time. \n | 18. The computer program product of claim 17, wherein the operations further comprise: \ngenerating warning data based on estimating that the ego vehicle and the remote vehicle will collide; and \ninputting the warning data to the autonomous driving system; \nwherein modifying the steering wheel is responsive to the warning data being input to the autonomous driving system and based on the determination that the ego vehicle is changing lanes to the target lane at the target time and the estimation that the ego vehicle and the remote vehicle will collide at the target lane at the target time. \n | 19. 
The computer program product of claim 17, wherein the operations further comprise providing a warning to a driver of the ego vehicle.\n | 20. The computer program product of claim 19, wherein the warning is selected from a group that consists of: generating a warning message that is displayed on a display device of the ego vehicle; and generating a warning sound that is played over a speaker of the ego vehicle.", "output": "The method involves determining, based on a payload of a Vehicle-to-Everything (V2X) message originated by a remote vehicle, whether the remote vehicle is changing lanes to a target lane at a target time. It is determined that the ego vehicle is changing lanes to the target lane at approximately the target time. It is estimated that the ego vehicle and the remote vehicle will collide at the target lane at the target time. An operation of a vehicle component of the ego vehicle is modified so that the ego vehicle does not change lanes to the target lane at the target time. The Vehicle-to-Everything message is a Dedicated Short-Range Communication (DSRC) message. INDEPENDENT CLAIMS are included for the following: a system included in an ego vehicle; and a computer program product comprising instructions that, when executed by a processor, cause the processor to perform operations. Method for providing a warning to a driver of an ego vehicle about a potential collision. When the automated system is enabled, driver attention is not required for the autonomous vehicle to operate safely and consistent with accepted norms. The drawing shows a schematic representation of an operating environment for a warning system. 
100Operating environment105Network123Ego vehicle124Remote vehicle125Processor", "instruction": "Please summarize the input" }, { "input": "Generating real-time high-definition (HD) maps using wireless vehicle data of a remote vehicleThe disclosure includes embodiments for generating a real-time high-definition (HD) map for an ego vehicle using wireless vehicle data of a remote vehicle. In some embodiments, a method includes receiving a V2X wireless message which includes remote GPS data and remote road parameter data of the remote vehicle. The method includes retrieving ego GPS data of the ego vehicle. The method includes generating ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located. The method includes fusing the ego road parameter data and the remote road parameter data to form fused road parameter data which describes an improved estimate of the geometry of the road that is more accurate than the initial estimate. The method includes generating a real-time HD map based on the remote GPS data, the ego GPS data, and the fused road parameter data.What is claimed is: \n | 1. 
A method for an ego vehicle including a Global Positioning System (GPS) unit, comprising: \nreceiving a Vehicle-to-Anything (V2X) wireless message from a network, wherein the V2X wireless message includes remote GPS data, remote road parameter data, and path history data describing path history points of a remote vehicle generated by the remote vehicle; \nretrieving ego GPS data from the GPS unit of the ego vehicle; \ngenerating ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located; \nfusing the ego road parameter data and the remote road parameter data to form fused road parameter data; and \ngenerating high-definition (HD) map data describing a real-time HD map based on the remote GPS data, the ego GPS data, the path history data, and the fused road parameter data by: \ngenerating one or more interpolated points based on the path history points; and \ngenerating a path of the remote vehicle on the real-time HD map based on the path history points and the one or more interpolated points. \n | 2. The method of claim 1, wherein the ego vehicle further includes a vehicle control system and the method further comprises: \ninputting the HD map data to the vehicle control system to improve accuracy of a vehicle tracking technique controlled by the vehicle control system; and \nexecuting the vehicle tracking technique based on the HD map data to track movement of the remote vehicle. \n | 3. The method of claim 2, wherein the vehicle tracking technique is a vehicle-lane-change detection technique, and executing the vehicle tracking technique based on the HD map data comprises: \nexecuting the vehicle-lane-change detection technique to detect a lane change of the remote vehicle based on the HD map data. \n | 4. 
The method of claim 2, wherein inputting the HD map data to the vehicle control system includes modifying an operation of the vehicle control system based on the HD map data so that the vehicle control system controls an operation of the ego vehicle based on the HD map data.\n | 5. The method of claim 2, wherein the vehicle control system includes one of an Advanced Driver Assistance System (ADAS system) or an autonomous driving system.\n | 6. The method of claim 1, wherein: \nthe remote road parameter data includes a lateral offset of the remote vehicle from a center of a reference lane; and \na lateral offset for each of the one or more interpolated points is estimated based on the lateral offset of the remote vehicle. \n | 7. The method of claim 6, wherein the remote road parameter data further includes one or more of a relative heading of the remote vehicle, curvature of the reference lane, or a curvature change rate of the reference lane.\n | 8. The method of claim 1, wherein the real-time HD map provides an estimate of a position of the remote vehicle that is accurate within plus or minus half a width of a lane on the road.\n | 9. The method of claim 1, wherein the remote GPS data describes a geographical location of the remote vehicle, and the ego GPS data describes a geographical location of the ego vehicle.\n | 10. The method of claim 1, wherein the fusing is achieved by Kalman filtering.\n | 11. The method of claim 1, wherein the V2X wireless message is selected from a group that consists of: a Basic Safety Message; a Long-Term Evolution (LTE) message; a LTE-V2X message; a 5G-LTE message; or a millimeter wave message.\n | 12. 
A system comprising: \na Global Positioning System (GPS) unit of an ego vehicle; and \nan onboard vehicle computer system that is communicatively coupled to the GPS unit, the onboard vehicle computer system including a non-transitory memory storing computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system to: \nreceive a Vehicle-to-Anything (V2X) wireless message from a network, wherein the V2X wireless message includes remote GPS data, remote road parameter data, and path history data describing path history points of a remote vehicle generated by a remote vehicle; \nretrieve ego GPS data from the GPS unit of the ego vehicle; \ngenerate ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located; \nfuse the ego road parameter data and the remote road parameter data to form fused road parameter data; and \ngenerate high-definition (HD) map data describing a real-time HD map based on the remote GPS data, the ego GPS data, the path history data, and the fused road parameter data by: \ngenerating one or more interpolated points based on the path history points; and \ngenerating a path of the remote vehicle on the real-time HD map based on the path history points and the one or more interpolated points. \n | 13. The system of claim 12, wherein the system further includes a vehicle control system and the computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system further to: \ninput the HD map data to the vehicle control system to improve accuracy of a vehicle tracking technique controlled by the vehicle control system; and \nexecute the vehicle tracking technique based on the HD map data to track movement of the remote vehicle. \n | 14. 
The system of claim 13, wherein the computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system further to: \ninput the HD map data to the vehicle control system for modifying an operation of the vehicle control system based on the HD map data so that the vehicle control system controls an operation of the ego vehicle based on the HD map data. \n | 15. The system of claim 13, wherein the vehicle control system includes one of an Advanced Driver Assistance System (ADAS system) or an autonomous driving system.\n | 16. The system of claim 12, wherein: \nthe remote road parameter data includes a lateral offset of the remote vehicle from a center of a reference lane; and \na lateral offset for each of the one or more interpolated points is estimated based on the lateral offset of the remote vehicle. \n | 17. The system of claim 16, wherein the remote road parameter data further includes one or more of a relative heading of the remote vehicle, curvature of the reference lane, or a curvature change rate of the reference lane.\n | 18. 
A computer program product comprising a non-transitory memory of an onboard vehicle computer system of an ego vehicle storing computer-executable code that, when executed by a processor, causes the processor to: \nreceive a Vehicle-to-Anything (V2X) wireless message from a network, wherein the V2X wireless message includes remote GPS data, remote road parameter data, and path history data describing path history points of a remote vehicle generated by the remote vehicle; \nretrieve ego GPS data from a GPS unit of the ego vehicle; \ngenerate ego road parameter data describing an initial estimate of a geometry of a road on which the ego vehicle is located; \nfuse the ego road parameter data and the remote road parameter data to form fused road parameter data; and \ngenerate high-definition (HD) map data describing a real-time HD map based on the remote GPS data, the ego GPS data, the path history data, and the fused road parameter data by: \ngenerating one or more interpolated points based on the path history points; and \ngenerating a path of the remote vehicle on the real-time HD map based on the path history points and the one or more interpolated points. \n | 19. The computer program product of claim 18, wherein the computer-executable code that, when executed by the processor, causes the processor further to: \ninput the HD map data to a vehicle control system to improve accuracy of a vehicle tracking technique controlled by the vehicle control system; and \nexecute the vehicle tracking technique based on the HD map data to track movement of the remote vehicle. \n | 20. 
The computer program product of claim 18, wherein the computer-executable code that, when executed by the processor, causes the processor further to: \ninput the HD map data to a vehicle control system for modifying an operation of the vehicle control system based on the HD map data so that the vehicle control system controls an operation of the ego vehicle based on the HD map data.", "output": "The method involves receiving a vehicle-to-anything wireless message from a network (125), where the vehicle-to-anything wireless message includes remote Global positioning system data and remote road parameter data generated by a remote vehicle (124). The ego global positioning system data is retrieved from the global positioning system unit of an ego vehicle (123). The ego road parameter data is generated, which describes an initial estimate of a geometry of a road on which the ego vehicle is located. INDEPENDENT CLAIMS are included for the following: a system for generating a real-time high-definition map for an ego vehicle having a global positioning system unit; and a computer program product. Method for generating a real-time high-definition map for an ego vehicle having a global positioning system unit. The accuracy of real-time high-definition maps is ensured. The drawing shows a schematic representation of a process for generating a real-time high-definition map. 123Ego vehicle124Remote vehicle125Network127Memory145Communication unit", "instruction": "Please summarize the input" }, { "input": "Theft deterrent system for connected vehicles based on wireless messagesThe disclosure includes embodiments for theft deterrent for a connected vehicle using Basic Safety Message (BSM)-based Vehicle-to-Anything (V2X) communication. In some embodiments, a method includes receiving, by a V2X radio, a wireless message that is transmitted by a first connected vehicle whose ignition is disengaged. 
In some embodiments, the method includes determining, based on a payload of the wireless message, that the first connected vehicle is being subjected to criminal activity. In some embodiments, the method includes taking a remedial action responsive to determining that the first connected vehicle is being subjected to criminal activity. In some embodiments, the remedial action is operable to deter the occurrence of the criminal activity (i.e., stop the first connected vehicle from being stolen) or gather visual evidence of the criminal activity.What is claimed is: \n | 1. A method comprising: \ntransmitting, by a first Vehicle-to-Anything (V2X) radio of a first connected vehicle to a second connected vehicle, a first wireless message, wherein the first V2X radio is powered on while an ignition of the first connected vehicle is disengaged and responsive to the ignition being disengaged switching from a first transmission rate to a second transmission rate to prevent battery drainage and wherein the second transmission rate is less frequent than the first transmission rate; \ndetermining, based on a payload of the first wireless message, whether the first connected vehicle is being subjected to criminal activity; \nresponsive to determining that the first connected vehicle is being subjected to criminal activity, taking a remedial action that includes triggering an alarm system of the second connected vehicle so that the alarm system provides a warning notification that the first connected vehicle is being subjected to criminal activity; \nand transmitting, to the second connected vehicle, a disengagement notification that states that the first connected vehicle is going to disengage, wherein the disengagement notification is triggered by a low battery level. \n | 2. The method of claim 1, wherein the first transmission rate is once every 0.10 seconds and the second transmission rate is once every three to five seconds.\n | 3. 
The method of claim 1, wherein the disengagement notification is part of a basic safety message.\n | 4. The method of claim 1, wherein the first connected vehicle is an autonomous vehicle.\n | 5. The method of claim 1, wherein the warning notification includes one or more of honking a horn or flashing headlamps.\n | 6. The method of claim 1, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their alarm systems and simultaneously provide the warning notification that the first connected vehicle is being subjected to criminal activity.\n | 7. The method of claim 1, wherein the first connected vehicle is parked and an ignition of the first connected vehicle is disengaged.\n | 8. The method of claim 1, wherein the remedial action further includes activating one or more onboard external cameras of the first connected vehicle so that one or more of images and video of the criminal activity are recorded.\n | 9. The method of claim 1, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their onboard external cameras and simultaneously record one or more of images and video of the criminal activity from various points of view.\n | 10. The method of claim 8, wherein the one or more of the images and the video are wirelessly transmitted to a third connected device that is operated by a law enforcement agency.\n | 11. The method of claim 1, wherein the remedial action further includes displaying a warning message indicating that the first connected vehicle is being subjected to criminal activity or one or more images of the criminal activity.\n | 12. 
A system comprising: \na processor communicatively coupled to a first Vehicle-to-Anything (V2X) radio of a first connected vehicle and non-transitory memory, wherein the first V2X radio is operable to transmit a first wireless message to a second connected vehicle, the first V2X radio is powered on while an ignition of the first connected vehicle is disengaged and responsive to the ignition being disengaged switching from a first transmission rate to a second transmission rate to prevent battery drainage, the second transmission rate is less frequent than the first transmission rate, and the non-transitory memory stores computer code that is operable, when executed by the processor, to cause the processor to: \ndetermine, based on a payload of the first wireless message, that the first connected vehicle is being subjected to criminal activity; \nresponsive to determining that the first connected vehicle is being subjected to criminal activity, take a remedial action that includes triggering an alarm system of the second connected vehicle so that the alarm system provides a warning notification that the first connected vehicle is being subjected to criminal activity; \nand transmit, to the second connected vehicle, a disengagement notification that states that the first connected vehicle is going to disengage, wherein the disengagement notification is triggered by a low battery level. \n | 13. The system of claim 12, wherein the first connected vehicle is an autonomous vehicle.\n | 14. The system of claim 12, wherein the first transmission rate is once every 0.10 seconds and the second transmission rate is once every three to five seconds.\n | 15. 
The system of claim 12, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their alarm systems and simultaneously provide the warning notification that the first connected vehicle is being subjected to criminal activity.\n | 16. The system of claim 13, wherein the first connected vehicle is parked and an ignition of the first connected vehicle is disengaged.\n | 17. The system of claim 12, wherein the remedial action further includes activating one or more onboard external cameras of the first connected vehicle so that one or more of images and video of the criminal activity are recorded.\n | 18. The system of claim 12, wherein a group of second connected vehicles receive the first wireless message such that the group of second connected vehicles activate their onboard external cameras and simultaneously record one or more of images and video of the criminal activity from various points of view.\n | 19. The system of claim 18, wherein the remedial action further includes displaying a warning message indicating that the first connected vehicle is being subjected to criminal activity or one or more images of the criminal activity.\n | 20. 
A non-transitory computer program product comprising instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: \ninstructing a first Vehicle-to-Anything (V2X) radio of a first connected vehicle to transmit a first wireless message to a second connected vehicle, wherein the first V2X radio is powered on while an ignition of the first connected vehicle is disengaged and, responsive to the ignition being disengaged, switches from a first transmission rate to a second transmission rate to prevent battery drainage, and wherein the second transmission rate is less frequent than the first transmission rate; \ndetermining, based on a payload of the first wireless message, that the first connected vehicle is being subjected to criminal activity; \nresponsive to determining that the first connected vehicle is being subjected to criminal activity, taking a remedial action that includes triggering an alarm system of the second connected vehicle so that the alarm system provides a warning notification that the first connected vehicle is being subjected to criminal activity; \nand transmitting, to the second connected vehicle, a disengagement notification that states that the first connected vehicle is going to disengage, wherein the disengagement notification is triggered by a low battery level.", "output": "The method involves receiving a wireless message that is transmitted by a V2X radio (144) of a first connected vehicle whose ignition is disengaged. That the first connected vehicle is subjected to criminal activity is determined based on a payload of the wireless message. A remedial action is taken responsive to determining that the first connected vehicle is subjected to criminal activity. The wireless message is a dedicated short-range communication (DSRC) message and the payload is compliant with the DSRC standard. 
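The duty cycling recited in the claims above (a 0.10 s beacon interval while the ignition is engaged, once every three to five seconds while parked, plus a low-battery disengagement notification) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the type names and the 20% battery threshold are assumptions, since the claims only say "low battery level".

```python
# Illustrative sketch of the claimed duty cycling: the V2X radio keeps
# transmitting while the ignition is off, but at a slower rate to limit
# battery drain, and announces disengagement when the battery runs low.
from dataclasses import dataclass

@dataclass
class V2XRadioState:
    ignition_on: bool
    battery_level: float  # 0.0 (empty) .. 1.0 (full)

FAST_INTERVAL_S = 0.10       # claim 14: once every 0.10 seconds
SLOW_INTERVAL_S = 4.0        # claim 14: once every three to five seconds
LOW_BATTERY_THRESHOLD = 0.2  # assumed; the claims only say "low battery level"

def beacon_interval_s(state: V2XRadioState) -> float:
    """Seconds between wireless messages for the current ignition state."""
    return FAST_INTERVAL_S if state.ignition_on else SLOW_INTERVAL_S

def should_notify_disengagement(state: V2XRadioState) -> bool:
    """Whether to send the 'going to disengage' notification (low battery)."""
    return state.battery_level < LOW_BATTERY_THRESHOLD

assert beacon_interval_s(V2XRadioState(ignition_on=True, battery_level=0.9)) == 0.10
assert beacon_interval_s(V2XRadioState(ignition_on=False, battery_level=0.9)) == 4.0
assert should_notify_disengagement(V2XRadioState(ignition_on=False, battery_level=0.1))
```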
The DSRC message is not one of the following: a wireless fidelity (WiFi) message, a third generation (3G) message, a fourth generation (4G) message, a long-term evolution (LTE) message, a millimeter wave communication message, a Bluetooth message, or a satellite communication. INDEPENDENT CLAIMS are included for the following: a theft deterrent system for a connected vehicle; and a computer program product for performing theft deterrence for a connected vehicle. Method for performing theft deterrence for a connected vehicle using basic safety message (BSM)-based vehicle-to-anything (V2X) communication. Driver attention is not required for the autonomous vehicle to operate safely and consistently with accepted norms when the automated system is enabled. A global positioning system (GPS) unit provides positional information that positions the conventional GPS unit with an accuracy of plus or minus 10 meters of its actual position. The deterrent system reduces vehicle theft, improves the performance of a car alarm system and reduces the cost of manufacturing the car alarm system by connecting multiple vehicles through BSM-based V2X communication. A vehicle theft or suspicious circumstance is detected by the deterrent system. The drawing shows a block diagram of a deterrent system for a connected vehicle. 105 Network; 120 Bus; 125 Processor; 127 Memory; 144 V2X radio", "instruction": "Please summarize the input" }, { "input": "Target-lane relationship recognition apparatus. A target-lane relationship recognition apparatus mounted on a vehicle includes a sensor that detects a situation around the vehicle, a memory device in which map data indicating a boundary position of a lane on a map is stored, and a processing device. 
The processing device is configured to: (a) acquire, based on the sensor detection result, target information regarding a moving target and a stationary target around the vehicle; (b) acquire, based on the map data and position-orientation of the vehicle, lane geometry information indicating a lane geometry around the vehicle; (c) adjust the lane geometry to generate an adjusted lane geometry satisfying a condition that the moving target is located within a lane and the stationary target is located outside of any lane; and (d) generate target-lane relationship information indicating a positional relationship between the moving target and the adjusted lane geometry.What is claimed is: \n | 1. A target-lane relationship recognition apparatus mounted on a vehicle, comprising: \na sensor configured to detect a situation around the vehicle; \na memory device in which a map data indicating a boundary position of a lane on a map is stored; and \na processor configured to perform: \ntarget information acquisition processing that acquires, based on a result of detection by the sensor, target information regarding a moving target and a stationary target around the vehicle; \nlane geometry acquisition processing that acquires, based on the map data and a position and an orientation of the vehicle, lane geometry information indicating a lane geometry around the vehicle; \nlane geometry adjustment processing that adjusts the lane geometry to generate an adjusted lane geometry satisfying a condition that the moving target is located within a lane and the stationary target is located outside of any lane; and \ninformation generation processing that generates target-lane relationship information indicating a positional relationship between the moving target and the adjusted lane geometry, wherein \nthe lane geometry is represented by a group of plural elements, and \nthe processor performs the lane geometry adjustment processing such that the adjusted lane geometry maintains a 
relative positional relationship between the plural elements. \n | 2. The target-lane relationship recognition apparatus according to claim 1, wherein \nthe target information includes a position of a representative point of the moving target, and \nthe condition includes the representative point being located within the lane within which the moving target is located. \n | 3. The target-lane relationship recognition apparatus according to claim 1, wherein \nthe target information includes respective positions of a plurality of detected points defining a size of the moving target, and \nthe condition includes all of the plurality of detected points being located within the lane within which the moving target is located. \n | 4. The target-lane relationship recognition apparatus according to claim 1, wherein \nthe target information includes a trajectory of the moving target, and \nthe condition includes the trajectory being located within the lane within which the moving target is located. \n | 5. The target-lane relationship recognition apparatus according to claim 1, wherein \nthe target information includes a position and a velocity of the moving target, and wherein \nin the lane geometry adjustment processing, the processor predicts a future position of the moving target based on the position and the velocity of the moving target, and the condition further includes that the future position of the moving target remains in the same lane as a current position of the moving target. \n | 6. The target-lane relationship recognition apparatus according to claim 1, wherein \nthe target information includes a position and a velocity of the moving target, and wherein \nin the lane geometry adjustment processing, the processor calculates a tangent line of a lane boundary closest to the moving target, and adds an angle between the tangent line and a vector of the velocity of the moving target being equal to or less than a threshold value to the condition. \n | 7. 
The target-lane relationship recognition apparatus according to claim 1, wherein \nin the target information acquisition processing, the processor further acquires target information regarding a lane changing target that is a second moving target in a middle of lane changing, and wherein \nin the lane geometry adjustment processing, the processor adds the lane changing target overlapping a lane boundary to the condition. \n | 8. The target-lane relationship recognition apparatus according to claim 1, further comprising a communication device configured to acquire, through a vehicle-to-vehicle communication or a vehicle-to-infrastructure communication, another vehicle's lane information indicating a travel lane of another vehicle around the vehicle, wherein \nin the lane geometry adjustment processing, the condition further includes that the moving target is located within the travel lane of the another vehicle. \n | 9. The target-lane relationship recognition apparatus according to claim 1, wherein \nin the lane geometry adjustment processing, the processor recognizes a white line position based on the result of detection by the sensor, and adds a distance between the white line position and a lane boundary being equal to or less than a threshold value to the condition. \n | 10. The target-lane relationship recognition apparatus according to claim 1, wherein \nthe processor further performs a driving assist control or an autonomous driving control by using the target-lane relationship information.", "output": "The apparatus has a sensor detecting a situation around a vehicle (1). A memory device stores map data indicating a boundary position of a lane (L1). 
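The lane geometry adjustment condition in the claims above can be illustrated with a deliberately simplified one-dimensional sketch: lanes are treated as lateral intervals, and a rigid lateral shift (which trivially maintains the relative positional relationship between the boundary elements, per claim 1) is searched for until every moving target falls inside a lane and every stationary target falls outside all lanes. All names and numeric values are assumptions for illustration.

```python
# A simplified 1-D sketch of the lane geometry adjustment condition.
# Lanes are lateral intervals (left, right); a rigid lateral shift keeps
# the relative positional relationship between the boundary elements.
from typing import List, Optional, Tuple

Lane = Tuple[float, float]  # (left boundary, right boundary) lateral offsets, meters

def satisfies_condition(lanes: List[Lane], shift: float,
                        moving: List[float], stationary: List[float]) -> bool:
    """True if every moving target is inside some shifted lane and
    every stationary target is outside all shifted lanes."""
    shifted = [(l + shift, r + shift) for l, r in lanes]
    in_lane = lambda y: any(l <= y <= r for l, r in shifted)
    return all(in_lane(y) for y in moving) and not any(in_lane(y) for y in stationary)

def adjust_lane_geometry(lanes: List[Lane], moving: List[float],
                         stationary: List[float]) -> Optional[List[Lane]]:
    """Grid-search the smallest lateral shift that satisfies the condition.
    A production system would optimize smoothly instead of grid-searching."""
    for step in range(0, 61):                      # 0.0 m .. 6.0 m in 0.1 m steps
        for shift in (step / 10.0, -step / 10.0):  # prefer the smallest |shift|
            if satisfies_condition(lanes, shift, moving, stationary):
                return [(l + shift, r + shift) for l, r in lanes]
    return None  # no adjustment found within the search range

# Map lanes span [0.0, 3.5] and [3.5, 7.0]; the sensor sees a moving target
# at -0.5 (outside any lane) and a guardrail at 7.2. A shift of -0.5 fixes both.
adjusted = adjust_lane_geometry([(0.0, 3.5), (3.5, 7.0)], [-0.5], [7.2])
```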
A processing device performs lane geometry adjustment processing that adjusts a lane geometry (LG) to generate an adjusted lane geometry satisfying a condition that a moving target (TM1) is located within the lane and a stationary target (TS1) is located outside of the lane, and information generation processing that generates target-lane relationship information indicating a positional relationship between the moving target and the lane geometry. Apparatus for recognizing a positional relationship between a target and a lane around a vehicle. The apparatus utilizes an adjusted lane geometry consistent with the actual conditions so as to accurately recognize the positional relationship between the surrounding target and the surrounding lane, thus accurately recognizing the preceding vehicle. The drawing shows a schematic view of an apparatus for recognizing a positional relationship between a target and a lane around a vehicle. L1, L2 Lanes; LG Lane geometry; TM1, TM2 Moving targets; TS1, TS2 Stationary targets; 1 Vehicle", "instruction": "Please summarize the input" }, { "input": "AUTOMATED VALET PARKING SYSTEM, CONTROL METHOD OF AUTOMATED VALET PARKING SYSTEM, AND AUTONOMOUS DRIVING VEHICLE. An automatic parking system for performing automatic parking of an autonomously driving vehicle by giving an instruction regarding a target vehicle speed and a target route to the autonomously driving vehicle in a parking lot, comprising: an autonomously driving vehicle and a communication capable vehicle that is not capable of autonomous driving but is capable of vehicle-to-vehicle communication; a vehicle information acquisition unit that acquires location information in the parking lot of the target vehicle; and a condition determination unit that determines whether the target vehicle satisfies the vehicle-to-vehicle communication driving condition based on the location information in the parking lot of the target vehicle acquired by the vehicle information acquisition unit; and a 
vehicle-to-vehicle communication driving instructing unit for causing the target vehicle determined by the condition determining unit to have satisfied the vehicle-to-vehicle communication driving condition to perform vehicle-to-vehicle communication traveling by vehicle-to-vehicle communication, wherein, in vehicle-to-vehicle communication driving, vehicle speed adjustment for vehicle approach suppression by vehicle-to-vehicle communication is prioritized over the target vehicle speed for automatic parking of self-driving vehicles.|1. An automatic parking system that performs automatic parking of the autonomous vehicle by giving instructions regarding target vehicle speed and target route to the autonomous vehicle in a parking lot, wherein the vehicle information acquisition unit acquires location information of the autonomous vehicle in the parking lot, and a condition determination unit that determines whether the autonomous vehicle satisfies preset vehicle-to-vehicle communication driving conditions, based on the location information of the autonomous vehicle in the parking lot acquired by the vehicle information acquisition unit, and the condition determination unit. 
and an inter-vehicle communication driving instruction unit that causes the autonomous vehicle, which is determined to have satisfied the inter-vehicle communication driving conditions, to perform inter-vehicle communication driving by inter-vehicle communication, wherein, in the inter-vehicle communication driving, the vehicle speed adjustment for vehicle approach suppression by vehicle-to-vehicle communication is given priority over the target vehicle speed for the automatic parking of the autonomous driving vehicle, and the vehicle information acquisition unit acquires location information within the parking lot of a target vehicle including the autonomous driving vehicle and a communication capable vehicle that is incapable of autonomous driving but is capable of vehicle-to-vehicle communication, and the condition determination unit determines, based on the location information of the target vehicle within the parking lot acquired by the vehicle information acquisition unit, whether the target vehicle satisfies a preset vehicle-to-vehicle communication driving condition, and the vehicle-to-vehicle communication driving instruction unit causes the target vehicle determined by the condition determination unit to have satisfied the vehicle-to-vehicle communication driving condition to perform the vehicle-to-vehicle communication driving by vehicle-to-vehicle communication, and the condition determination unit determines, when the number of the target vehicles traveling in the same lane in the parking lot is equal to or greater than a lane number threshold, that the plurality of target vehicles traveling in the same lane satisfy the vehicle-to-vehicle communication driving conditions.\n | 2. (canceled)\n | 3. (canceled)\n | 4. 
The method of claim 1, wherein, when there are a plurality of target vehicles whose inter-vehicle distances are less than the inter-vehicle distance threshold, the condition determination unit determines that the plurality of target vehicles whose inter-vehicle distances are less than the inter-vehicle distance threshold satisfy the vehicle-to-vehicle communication driving conditions.\n | 5. The method according to claim 1 or 4, wherein the vehicle information acquisition unit acquires location information within the parking lot of a general vehicle that is unable to drive autonomously and is unable to communicate between vehicles, and the condition determination unit determines, when the general vehicle is in the parking lot and there are a plurality of target vehicles located on a driving lane, that the plurality of target vehicles on the lane satisfy the vehicle-to-vehicle communication driving condition.\n | 6. The method according to claim 1 or 4, wherein the vehicle information acquisition unit acquires location information within the parking lot of a general vehicle incapable of autonomous driving and vehicle-to-vehicle communication, and the condition determination unit determines, when the general vehicle enters a preset area in the parking lot, that the plurality of target vehicles in the preset area satisfy the vehicle-to-vehicle communication driving conditions.\n | 7. The method according to claim 1 or 4, wherein, when the target vehicle that performs vehicle-to-vehicle communication travel exists, the target vehicle is based on position information in the parking lot of a plurality of target vehicles acquired by the vehicle information acquisition unit. 
It further includes a release condition determination unit that determines whether the preset release condition is satisfied, and the vehicle-to-vehicle communication driving instruction unit terminates the vehicle-to-vehicle communication driving of the target vehicle that has been determined by the release condition determination unit to have satisfied the release condition.\n | 8. A control method of an automatic parking system that executes automatic parking of an autonomous vehicle by giving instructions regarding target vehicle speed and target route to the autonomous vehicle in a parking lot, comprising: a vehicle information acquisition step in which location information of the autonomous vehicle in the parking lot is acquired; and a condition determination step for determining whether the autonomous vehicle satisfies preset vehicle-to-vehicle communication driving conditions based on location information of the autonomous vehicle within the parking lot acquired in the vehicle information acquisition step; and an inter-vehicle communication driving instruction step that causes the autonomous vehicle, which is determined to have satisfied the inter-vehicle communication driving conditions in the condition determination step, to perform inter-vehicle communication driving by inter-vehicle communication, and in the inter-vehicle communication driving, the vehicle speed adjustment for vehicle approach suppression by the vehicle-to-vehicle communication is given priority, for the autonomous driving vehicle, over the target vehicle speed for the automatic parking, and in the vehicle information acquisition step, position information in the parking lot of target vehicles including the autonomous driving vehicle and a communication capable vehicle that is incapable of autonomous driving but is capable of vehicle-to-vehicle communication is acquired, and in the condition determination step, based on the position information in the parking lot of the target vehicle acquired in the vehicle information acquisition step, it is determined whether the target vehicle satisfies a preset inter-vehicle communication driving condition, and in the inter-vehicle communication driving instruction step, the target vehicle determined in the condition determination step to have satisfied the inter-vehicle communication driving condition is made to perform the vehicle-to-vehicle communication driving by vehicle-to-vehicle communication, and in the condition determination step, when the number of target vehicles traveling in the same lane in the parking lot is equal to or greater than the lane number threshold, it is determined that the plurality of target vehicles traveling in the same lane satisfy the vehicle-to-vehicle communication driving conditions.\n | 9. An autonomous driving vehicle that performs automatic parking in a parking lot based on instructions regarding a target vehicle speed and a target route from an automatic parking system in the parking lot, comprising: an own-vehicle location recognition unit that recognizes a location in the parking lot; and the autonomous driving vehicle. 
A driving state recognition unit that recognizes the driving state of the autonomous vehicle based on an internal sensor of the vehicle; a vehicle-side condition determination unit for determining, based on a comparison result between the driving state of the autonomous vehicle recognized by the driving state recognition unit and the target vehicle speed or the target route indicated by the automatic parking system, whether the autonomous vehicle satisfies a preset vehicle-side vehicle-to-vehicle communication driving condition; and an inter-vehicle communication driving executing unit that performs inter-vehicle communication driving of the autonomous driving vehicle by inter-vehicle communication when the vehicle-side condition determination unit determines that the vehicle-side inter-vehicle communication driving conditions are met, and in the inter-vehicle communication driving, the vehicle speed adjustment for vehicle approach suppression by the vehicle-to-vehicle communication is given priority over the target vehicle speed for automatic parking, and the vehicle-side condition determination unit compares, according to the location in the parking lot, the vehicle speed of the autonomous vehicle with the target vehicle speed indicated by the automatic parking system, and determines that the vehicle-side vehicle-to-vehicle communication driving condition is satisfied when the vehicle speed of the autonomous vehicle at the location is greater than the vehicle speed determination threshold.\n | 10. (canceled)
A condition determination unit determines whether the autonomous driving vehicle satisfies a preset inter-vehicle communication traveling condition based on the positional information of the autonomous driving vehicle in the parking place acquired by the vehicle information acquisition unit. An inter-vehicle communication traveling instruction unit causes the autonomous driving vehicle determined to satisfy the inter-vehicle communication traveling condition by the condition determination unit to perform inter-vehicle communication traveling by inter-vehicle communication, where the autonomous driving vehicle preferentially performs vehicle speed adjustment for vehicle approach suppression by the inter-vehicle communication over target vehicle speed for an automated valet parking in the inter-vehicle communication traveling. INDEPENDENT CLAIMS are included for: (a) a method for controlling an automated valet parking system; (b) an autonomous driving vehicle. System for executing automated valet parking of an autonomous driving vehicle (claimed) in a parking place by issuing an instruction related to target vehicle speed and a target route to the autonomous driving vehicle. The system prevents the inter-vehicle distance between the autonomous driving vehicle and the target vehicle from becoming too short in the parking place. The system comprises a vehicle-side condition determination unit that determines whether the autonomous vehicle satisfies the inter-vehicle communication traveling condition based on a comparison result between the traveling state of the vehicle and the target vehicle speed or the target route as instructed from the automated valet parking system, so that vehicle speed adjustment for vehicle approach suppression by the inter-vehicle communication is preferentially performed over the target vehicle speed for the automated valet parking. 
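The condition determination described above can be sketched as two simple checks, mirroring the lane-count condition (claim 1) and the inter-vehicle-distance condition (claim 4). The thresholds, function names, and vehicle identifiers are assumptions for illustration, not details from the patent.

```python
# Illustrative checks for the two triggering conditions described above:
# (claim 1) enough target vehicles traveling in the same parking-lot lane,
# (claim 4) any pair of target vehicles closer than a distance threshold.
from collections import Counter
from math import hypot
from typing import Dict, List, Set, Tuple

def lanes_meeting_count_condition(vehicle_lanes: Dict[str, str],
                                  lane_number_threshold: int = 3) -> Set[str]:
    """Lanes whose target-vehicle count reaches the lane number threshold."""
    counts = Counter(vehicle_lanes.values())
    return {lane for lane, n in counts.items() if n >= lane_number_threshold}

def pairs_below_distance_threshold(positions: Dict[str, Tuple[float, float]],
                                   distance_threshold: float = 5.0
                                   ) -> List[Tuple[str, str]]:
    """Vehicle pairs whose inter-vehicle distance is below the threshold."""
    ids = list(positions)
    close = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            (xa, ya), (xb, yb) = positions[a], positions[b]
            if hypot(xa - xb, ya - yb) < distance_threshold:
                close.append((a, b))
    return close

# Three vehicles share lane "A", so lane "A" triggers inter-vehicle
# communication traveling; car1 and car2 are also within 5 m of each other.
lanes = {"car1": "A", "car2": "A", "car3": "A", "car4": "B"}
assert lanes_meeting_count_condition(lanes) == {"A"}
pos = {"car1": (0.0, 0.0), "car2": (3.0, 3.0), "car3": (40.0, 0.0)}
assert pairs_below_distance_threshold(pos) == [("car1", "car2")]
```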
The drawing shows a schematic view of an automated valet parking system. 1 System for executing automated valet parking of autonomous driving vehicle; 3 Parking place sensor; 4 Parking place map database; 23 Autonomous driving vehicle; N Network", "instruction": "Please summarize the input" }, { "input": "Enhanced vehicle-to-everything (V2X) communications using a satellite/airborne interface. A system, method and apparatus for mobile communications including sidelink transmissions is provided. A user equipment (UE) maintains a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node. The UE switches between interfaces according to a current connectivity state and based on signal attributes associated with individual interfaces. The invention claimed is: \n | 1. A method of wireless communications, comprising: maintaining, by a first user equipment (UE), a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node; switching, by the first UE, between a first connectivity state and a second connectivity state based on at least one of: \na first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface; \na second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and \na third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal 
strength or a signal quality associated with the third interface; \nwherein: \nthe first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with at least one of the second interface and the third interface when the first UE is determined to be in the second connectivity state. \n | 2. The method of claim 1, wherein: \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state. \n | 3. The method of claim 1, wherein a signal strength attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal strength indicator (RSSI) or a reference signal received power (RSRP) of a reference signal received via a respective signal interface.\n | 4. The method of claim 1, wherein a signal quality attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal received quality (RSRQ) of a reference signal received via a respective interface.\n | 5. The method of claim 1 further comprising receiving configuration parameters of one or more reference signals for determining at least one of the first signal attribute, the second signal attribute or third signal attribute.\n | 6. The method of claim 1, wherein the radio access network (RAN) infrastructure node is a base station.\n | 7. The method of claim 1, wherein the first interface is a Uu interface.\n | 8. 
The method of claim 7, wherein the Uu interface is based on a first plurality of protocol terminations between the first user equipment (UE) and the radio access network (RAN) infrastructure node.\n | 9. The method of claim 1, wherein the second interface is a PC-5 interface.\n | 10. The method of claim 9, wherein the PC-5 interface is based on a second plurality of protocol terminations between the first user equipment (UE) and the second UE.\n | 11. The method of claim 1, wherein the third interface is a satellite/airborne interface.\n | 12. The method of claim 11, wherein the satellite/airborne interface is based on a third plurality of protocol terminations between the first user equipment (UE) and the satellite node or the airborne node.\n | 13. The method of claim 1, wherein the switching from the first connectivity state to the second connectivity state is based on: \ncomparing the first signal attribute with a first threshold; \ncomparing the second signal attribute with a second threshold; and \ncomparing the third signal attribute with a third threshold. \n | 14. The method of claim 13, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.\n | 15. The method of claim 13, wherein the first threshold, the second threshold and the third threshold are pre-determined.\n | 16. The method of claim 13 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first threshold, the second threshold and the third threshold.\n | 17. The method of claim 16, wherein the configuration parameters are received from the radio access network (RAN) infrastructure node and via the first interface.\n | 18. The method of claim 17, wherein receiving the configuration parameters includes receiving one or more radio resource control (RRC) messages.\n | 19. 
The method of claim 1, wherein the switching between the first connectivity state and the second connectivity state is further based on quality of service (QoS) requirements of data packets transmitted or received by the first UE.\n | 20. The method of claim 19, wherein: \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and \nthe first UE switches from the first connectivity state to the second connectivity state based on a determination that a latency associated with communications associated with the third interface matches a latency requirement of the data packets. \n | 21. The method of claim 19 further comprising determining, by the first user equipment (UE), the quality of service (QoS) requirements of the data packets based on priority tags of the data packets.\n | 22. The method of claim 21, wherein the priority tags of the data packets indicate that the data packets are associated with one or more of: \nsafety related information; \nfirefighters or government agencies; \nautonomous driving; and \ntraffic information. \n | 23. The method of claim 1 further comprising synchronizing the connectivity state of the first user equipment (UE) with a Global Navigation Satellite System (GNSS).\n | 24. The method of claim 1, wherein: \nthe maintaining the first interface, the second interface and the third interface comprises registering by the first user equipment (UE) with a Core Network node, wherein a context of the first UE associated with the first interface, the second interface and the third interface is established in the Core Network. \n | 25. 
The method of claim 1, wherein the switching from the first connectivity state to the second connectivity state is further based on one or more of: \na first duration for which the first signal attribute maintains at least one of a first signal strength or signal quality requirement; \na second duration for which the second signal attribute maintains at least one of a second signal strength or signal quality requirement; and \na third duration for which the third signal attribute maintains at least one of a third signal strength or signal quality requirement. \n | 26. The method of claim 25, wherein \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and \nthe first UE switches from the first connectivity state to the second connectivity state when the first duration is smaller than a first duration threshold or the second duration is smaller than a second duration threshold, and the third duration is larger than a third duration threshold. \n | 27. The method of claim 26 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first duration threshold, the second duration threshold and the third duration threshold.\n | 28. The method of claim 26, wherein the first duration threshold, the second duration threshold and the third duration threshold are pre-determined values.\n | 29. 
A method of wireless communications, comprising: \nmaintaining, by a first user equipment (UE), a plurality of interfaces between the first UE and a set of network components associated with wireless communication, wherein the plurality of interfaces facilitate communication of data packets associated with Vehicle to Anything (V2X) services and wherein the plurality of interfaces include a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node; \nmanaging at least one of receipt or transmission of V2X data packets based on a determined connectivity state of the first UE, wherein managing the at least one of receipt or transmission of V2X data packets is based on switching, by the first UE, between a first connectivity state and a second connectivity state based on signal attributes associated with at least one of the first, second or third interfaces; \nwherein: \nthe first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with at least one of the second interface and the third interface when the first UE is determined to be in the second connectivity state. \n | 30. The method of claim 29, wherein: \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state. \n | 31. 
The method of claim 29, wherein a signal strength attribute associated with at least one of a first signal attribute, a second signal attribute and a third signal attribute is based on a reference signal strength indicator (RSSI) or a reference signal received power (RSRP) of a reference signal received via a respective signal interface.\n | 32. The method of claim 29, wherein a signal quality attribute associated with at least one of a first signal attribute, a second signal attribute and a third signal attribute is based on a reference signal received quality (RSRQ) of a reference signal received via a respective interface.\n | 33. The method of claim 32 further comprising receiving configuration parameters of one or more reference signals for determining at least one of the first signal attribute, the second signal attribute or third signal attribute.\n | 34. The method of claim 29, wherein the switching from the first connectivity state to the second connectivity state is based on: \ncomparing a first signal attribute with a first threshold; \ncomparing a second signal attribute with a second threshold; and \ncomparing a third signal attribute with a third threshold. \n | 35. The method of claim 34, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.\n | 36. The method of claim 29, wherein the switching between the first connectivity state and the second connectivity state is further based on quality of service (QoS) requirements of data packets transmitted or received by the first UE.\n | 37. 
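Claims 34-35 base the switch on comparing each interface's signal attribute against a configured threshold. The sketch below assumes RSRP-style measurements in dBm; the state names, the priority given to the first and second interfaces, and the out-of-coverage fallback are assumptions, since the claims do not specify how the three comparisons combine:

```python
def select_connectivity_state(rsrp_first, rsrp_second, rsrp_third,
                              th_first, th_second, th_third):
    """Compare each interface's measured attribute (e.g. RSRP) with its
    threshold, per claim 34, and pick a connectivity state. Assumed
    policy: prefer the RAN/sidelink interfaces, then fall back to the
    satellite/airborne interface."""
    if rsrp_first >= th_first or rsrp_second >= th_second:
        return "FIRST_STATE"    # communicate via the first and/or second interface
    if rsrp_third >= th_third:
        return "SECOND_STATE"   # communicate via the third interface
    return "OUT_OF_COVERAGE"    # no interface meets its threshold (not in the claims)
```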
The method of claim 29, wherein: \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and \nthe first UE switches from the first connectivity state to the second connectivity state based on a determination that a latency associated with communications associated with the third interface matches a latency requirement of the data packets. \n | 38. The method of claim 29, wherein the switching from the first connectivity state to the second connectivity state is further based on one or more of: \na first duration for which a first signal attribute maintains at least one of a first signal strength or signal quality requirement; \na second duration for which a second signal attribute maintains at least one of a second signal strength or signal quality requirement; and \na third duration for which a third signal attribute maintains at least one of a third signal strength or signal quality requirement. \n | 39. 
The method of claim 38, wherein \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; \nwherein the first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and \nthe first UE switches from the first connectivity state to the second connectivity state when the first duration is smaller than a first duration threshold or the second duration is smaller than a second duration threshold, and the third duration is larger than a third duration threshold.", "output": "The method involves maintaining, by a user equipment (UE), a set of interfaces including an interface between the UE and a radio access network (RAN) infrastructure node. Another interface is provided between the UE and another UE, and a third interface is connected between the former UE and a satellite node or an airborne node. The UE is switched between two connectivity states based on signal attributes associated with the interfaces, where the UE communicates in accordance with one of the interfaces according to the connectivity state it is determined to be in, and the second interface is a PC-5 interface. Method for performing wireless communication between a UE and nodes in a RAN, such as an evolved universal terrestrial RAN (EUTRAN) or a universal terrestrial RAN (UTRAN), using interfaces, e.g. a Uu interface with a RAN node, a PC-5 interface with a pedestrian, a PC-5 interface with a vehicle and a PC-5 interface with an infrastructure node. Uses include but are not limited to smartphones, tablets, laptops, computers, wireless transmission and/or reception units in a vehicle, V2X or Vehicle to Vehicle (V2V) devices, wireless sensors, and internet-of-things (IoT) devices. 
The method enables computing devices to use the wireless communication network to facilitate interactions with other devices that can access the network, or to facilitate interaction, through the network, with devices utilizing other communication networks in an efficient manner. The drawing shows a block diagram of connectivity states and state transitions.", "instruction": "Please summarize the input" }, { "input": "Information processing apparatus, information processing method, and recording mediumAn information processing apparatus includes: a memory configured to store information about mobile bodies, each of which forms a vehicle by being coupled with a main body unit and is capable of autonomous driving; and a processor configured to transmit a summon command to a first mobile body existing within a predetermined range, the summon command summoning the first mobile body to a predetermined assembly location, and cause a predetermined number of first mobile bodies assembled at the predetermined assembly location to carry one first main body unit that is larger or heavier than a main body unit that can be carried by one first mobile body, and to cause the predetermined number of first mobile bodies to move to a predetermined destination.What is claimed is: \n | 1. 
An information processing apparatus comprising: \na memory configured to store information about mobile bodies, each of which forms a vehicle by being coupled with a main body unit and is capable of autonomous driving; and \na processor configured to: \nselect, from a plurality of first mobile bodies existing within a predetermined range, a needed number of the first mobile bodies for transport of one first main body unit that is larger or heavier than a main body unit that can be carried by only one of the first mobile bodies; \ntransmit, to user terminals of users who own the selected first mobile bodies, use permission requests for the selected first mobile bodies; \ndetermine whether or not a number of acceptance responses, which are from the user terminals and are for using the selected first mobile bodies, reaches the needed number; \nin a case where the number of acceptance responses does not reach the needed number, (i) select additionally, from the plurality of first mobile bodies existing within the predetermined range, a number of next first mobile bodies corresponding to a shortage of the first mobile bodies and (ii) transmit a use permission request to a user terminal of a user who owns one of the next first mobile bodies additionally selected; \nin a case where the number of acceptance responses reaches the needed number, transmit a summon command to the first mobile bodies for which the acceptance responses from the user terminals were transmitted, those first mobile bodies being summoned first mobile bodies, and the summon command summoning the summoned first mobile bodies to a predetermined assembly location; \ncause the summoned first mobile bodies assembled at the predetermined assembly location to carry the one first main body unit; and \ncause the summoned first mobile bodies to move to a predetermined destination, wherein \neach of the plurality of first mobile bodies is owned by a predetermined user as a private-use vehicle. \n | 2. 
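Claim 1's selection loop — send use permission requests, count acceptance responses, and top up with additional candidates within range until the needed number is reached — can be sketched as follows. The function names and the `request_permission` callback are illustrative, not from the claims:

```python
def summon_mobile_bodies(candidates, needed, request_permission):
    """Collect acceptance responses until `needed` first mobile bodies
    have agreed, selecting additional candidates within range to cover
    any shortage. `request_permission` stands in for sending a use
    permission request to the owner's user terminal; it returns True on
    an acceptance response."""
    accepted, pool = [], list(candidates)
    while pool and len(accepted) < needed:
        body = pool.pop(0)         # select the next body within range
        if request_permission(body):
            accepted.append(body)  # acceptance response received
    # the summon command is sent only once the needed number is reached
    return accepted if len(accepted) == needed else None

# Owner of "B" declines; "D" is never asked because "C" covers the shortage.
summon_mobile_bodies(["A", "B", "C", "D"], 2, lambda b: b != "B")  # → ["A", "C"]
```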
The information processing apparatus according to claim 1, wherein, when the summoned first mobile bodies carrying the one first main body unit are no longer being used, the processor transmits a dismiss command to the summoned first mobile bodies, the dismiss command instructing movement of the summoned first mobile bodies to respective return locations.\n | 3. The information processing apparatus according to claim 1, wherein \nthe information about the mobile bodies includes information indicating a state of at least one of the mobile bodies, and \nthe first mobile bodies to which the processor transmits the summon command are each in an idle state. \n | 4. The information processing apparatus according to claim 1, wherein the summoned first mobile bodies caused by the processor to assemble at the predetermined assembly location are each in an idle state at a time of reception of the summon command.\n | 5. The information processing apparatus according to claim 1, wherein the processor causes the summoned first mobile bodies to move and carry the one first main body unit by convoy-traveling.\n | 6. The information processing apparatus according to claim 5, wherein the processor controls each of the summoned first mobile bodies in relation to the convoy-traveling.\n | 7. The information processing apparatus according to claim 5, wherein \nthe processor is configured to select from the summoned first mobile bodies, according to a predetermined condition, a mobile body to be a leader of the convoy-traveling at a time of moving and carrying the one first main body unit, and \nthe summoned first mobile bodies perform vehicle-to-vehicle communication with one another and perform the convoy-traveling according to an instruction from the mobile body selected as the leader. \n | 8. 
The information processing apparatus according to claim 5, wherein the processor selects a first mobile body that is to be a leader from among the summoned first mobile bodies, and the processor causes the summoned first mobile bodies to perform the convoy-traveling.\n | 9. The information processing apparatus according to claim 1, wherein the processor determines the needed number of the first mobile bodies based on the information about the mobile bodies.\n | 10. An information processing method comprising: \nstoring, in a memory, information about mobile bodies, each of which forms a vehicle by being coupled with a main body unit and is capable of autonomous driving; \nselecting, from a plurality of first mobile bodies existing within a predetermined range, a needed number of the first mobile bodies for transport of one first main body unit that is larger or heavier than a main body unit that can be carried by only one of the first mobile bodies; \ntransmitting, to user terminals of users who own the selected first mobile bodies, use permission requests for the selected first mobile bodies; \ndetermining whether or not a number of acceptance responses, which are from the user terminals and are for using the selected first mobile bodies, reaches the needed number; \nin a case where the number of acceptance responses does not reach the needed number, (i) selecting additionally, from the plurality of first mobile bodies existing within the predetermined range, a number of next first mobile bodies corresponding to a shortage of the first mobile bodies and (ii) transmitting a use permission request to a user terminal of a user who owns one of the next first mobile bodies additionally selected; \nin a case where the number of acceptance responses reaches the needed number, transmitting a summon command to the first mobile bodies for which the acceptance responses from the user terminals were transmitted, those first mobile bodies being summoned first mobile bodies, and 
the summon command summoning the summoned first mobile bodies to a predetermined assembly location; \ncausing the summoned first mobile bodies assembled at the predetermined assembly location to carry the one first main body unit; and \ncausing the summoned first mobile bodies to move to a predetermined destination, wherein \neach of the plurality of first mobile bodies is owned by a predetermined user as a private-use vehicle. \n | 11. The information processing method according to claim 10, comprising transmitting, when the summoned first mobile bodies carrying the one first main body unit are no longer being used, a dismiss command to the summoned first mobile bodies, the dismiss command instructing movement of the summoned first mobile bodies to respective return locations.\n | 12. The information processing method according to claim 10, wherein the information about the mobile bodies includes information indicating a state of at least one of the mobile bodies, and the first mobile bodies to which the summon command is transmitted are each in an idle state.\n | 13. The information processing method according to claim 10, wherein the summoned first mobile bodies move and carry the one first main body unit by convoy-traveling.\n | 14. The information processing method according to claim 13, wherein each of the summoned first mobile bodies is controlled in relation to the convoy-traveling.\n | 15. The information processing method according to claim 13, wherein \na mobile body, which is to be a leader of the convoy-traveling at a time of moving and carrying the one first main body unit, is selected from the summoned first mobile bodies according to a predetermined condition, and \nthe summoned first mobile bodies perform vehicle-to-vehicle communication with one another and perform the convoy-traveling according to an instruction from the mobile body selected as the leader. \n | 16. 
The information processing method according to claim 13, wherein the summoned first mobile bodies perform vehicle-to-vehicle communication with one another, select from the summoned first mobile bodies a first mobile body that is to be a leader, and perform the convoy-traveling.\n | 17. The information processing method according to claim 10, wherein the needed number of the first mobile bodies is determined based on the information about the mobile bodies.\n | 18. A non-transitory computer-readable medium storing an information processing program for causing a user terminal to: \nreceive a use permission request for a mobile body that is owned by an owner of the user terminal as a private-use vehicle, the mobile body forming a vehicle by being coupled with a main body unit and being capable of autonomous driving; \ntransmit a response to the use permission request; and \nreceive a summon command summoning the mobile body to a predetermined assembly location in a predetermined case, wherein \nthe use permission request requests permission to use the mobile body as a first mobile body, in a case where one first main body unit that is larger or heavier than a main body unit that can be carried by only one first mobile body is to be carried and moved by a needed number of first mobile bodies existing within a predetermined range, and \nthe predetermined case is \na case where the response is an acceptance response for using the mobile body as the first mobile body, and \na case where a number of acceptance responses are determined to reach the needed number for transport of the one first main body unit.", "output": "The apparatus has a memory configured to store information about mobile bodies, each of which forms a vehicle by being coupled with a main frame unit (300) and is capable of autonomous driving. A processor is configured to transmit a summon command to a first mobile frame existing within a predetermined range. 
The summon command is configured to summon the first mobile frame to a predetermined assembly location. A predetermined number of first mobile frames assembled at the predetermined assembly location is caused to carry one first main frame unit that is larger or heavier than a main frame unit that can be carried by one first mobile frame, and the predetermined number of the first mobile frames is caused to move to a predetermined destination. INDEPENDENT CLAIMS are included for the following: (a) an information processing method; (b) a non-transitory computer-readable medium storing a program for processing information. Information processing apparatus. The information processing apparatus does not need to perform control regarding the convoy-traveling for each of the predetermined number of first mobile bodies, and thus the processing load on the information processing apparatus is reduced. The traveling unit is not used without permission from the owner user, and the owner user is not interrupted while using the traveling unit. The number of traveling units needed to transport one main frame unit can be reduced. The drawing shows a schematic view illustrating the system configuration of the traveling-unit summoning system. 1 Center server; 4A, 4B User terminals; 100 Traveling-unit summoning system; 200A, 200B Traveling units; 300 Main frame unit", "instruction": "Please summarize the input" }, { "input": "Road sign recognition for connected vehiclesThe disclosure includes embodiments for providing road sign recognition for connected vehicles. In some embodiments, a method includes determining that a first set of first vehicles has assessed that a content for a road sign is a first value. In some embodiments, the method includes determining that the content for the road sign is a second value assessed by a second set of second vehicles based, at least in part, on the first vehicles having a same make and model.What is claimed is: \n | 1. 
A method that is executed by an onboard vehicle computer system of an ego vehicle, the method comprising: \ndetermining first content data from a first set of vehicles that describes content for a road sign; \ndetermining second content data from a second set of vehicles that describes the content for the road sign; and \naccepting the second content data as describing the content of the road sign based, at least in part, on a number of vehicles in the first set of vehicles having a same make and model exceeding a threshold value. \n | 2. The method of claim 1, further comprising: \nassigning a first weight to the first content data based on the number of vehicles in the first set of vehicles exceeding the threshold value; and \nassigning a second weight to the second content data based on a number of vehicles in the second set of vehicles failing to have the same make and model; \nwherein accepting the second content data as describing the content of the road sign is further based on the second weight being greater than the first weight. \n | 3. The method of claim 2, further comprising: \ndetermining a presence of a cluster of vehicles in the first set of vehicles having the same make and model responsive to the number of vehicles in the first set of vehicles exceeding the threshold value; and \nlowering the first weight based on determining the presence of the cluster. \n | 4. The method of claim 1, wherein the ego vehicle is included in the first set of vehicles.\n | 5. The method of claim 1, wherein the ego vehicle is included in the second set of vehicles.\n | 6. The method of claim 1, wherein the ego vehicle receives a Vehicle-to-Anything (V2X) message transmitted by a remote vehicle that includes sensor data that includes an image of the road sign.\n | 7. The method of claim 6, further comprising determining that the content for the road sign is the second content data based at least in part on the image of the road sign.\n | 8. 
The method of claim 6, wherein determining the first content data includes determining a type of road sign or a rule described by the road sign.\n | 9. The method of claim 1, further comprising: \nreceiving sensor data from the first set of vehicles and the second set of vehicles, wherein the sensor data includes images of the road sign; and \ncomparing the sensor data to a sign database, wherein determining the first content data and the second content data is based on comparing the sensor data to the sign database. \n | 10. An onboard vehicle computer system of an ego vehicle comprising: \na processor communicatively coupled to a non-transitory memory that stores computer code that is operable, when executed by the processor, to cause the processor to: \ndetermine first content data from a first set of vehicles that describes content for a road sign; \ndetermine second content data from a second set of vehicles that describes the content for the road sign; and \naccept the second content data as describing the content of the road sign based, at least in part, on a number of vehicles in the first set of vehicles having a same make and model exceeding a threshold value. \n | 11. The system of claim 10, wherein the non-transitory memory stores additional computer code that is operable, when executed by the processor, to cause the processor to: \nassign a first weight to the first content data based on the number of vehicles in the first set of vehicles exceeding the threshold value; and \nassign a second weight to the second content data based on a number of second vehicles in the second set of vehicles failing to have the same make and model; \nwherein accepting the second content data as describing the content of the road sign is further based on the second weight being greater than the first weight. \n | 12. 
The system of claim 11, wherein the non-transitory memory stores additional computer code that is operable, when executed by the processor, to cause the processor to: \ndetermine a presence of a cluster of vehicles in the first set of vehicles having the same make and model responsive to the number of vehicles in the first set of vehicles exceeding the threshold value; and \nlower the first weight based on determining the presence of the cluster. \n | 13. The system of claim 10, wherein the ego vehicle is included in the first set of vehicles.\n | 14. The system of claim 10, wherein the ego vehicle is included in the second set of vehicles.\n | 15. The system of claim 12, further comprising a Vehicle-to-Anything (V2X) radio communicatively coupled to the processor, wherein the V2X radio is operable to receive a V2X message transmitted by a remote vehicle that includes sensor data that includes an image of the road sign.\n | 16. The system of claim 15, wherein determining the first content data includes determining a type of road sign or a rule described by the road sign.\n | 17. A non-transitory computer program product that is an element of an onboard system of an ego vehicle comprising instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: \ndetermining first content data from a first set of vehicles that describes content for a road sign; \ndetermining second content data from a second set of vehicles that describes the content for the road sign; and \naccepting the second content data as describing the content of the road sign based, at least in part, on a number of vehicles in the first set of vehicles having a same make and model exceeding a threshold value. \n | 18. 
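The weighting scheme of claims 2-3 and 11-12 down-weights reports from a cluster of same-make-and-model vehicles, on the premise that a shared perception defect could bias them all the same way. A sketch of that weighting follows; the 0.5/1.0 weight values are invented, since the claims only require that the cluster's weight be lowered:

```python
from collections import Counter

def resolve_sign_content(reports, same_model_threshold):
    """Each report is a (make_model, content) pair. A vehicle whose
    make and model form a cluster larger than `same_model_threshold`
    contributes a lowered weight (0.5 here, an assumed value); every
    other vehicle contributes 1.0. The highest-weighted content wins."""
    by_model = Counter(make_model for make_model, _ in reports)
    weights = Counter()
    for make_model, content in reports:
        w = 0.5 if by_model[make_model] > same_model_threshold else 1.0
        weights[content] += w
    return weights.most_common(1)[0][0]

# Three identical "ModelX" vehicles read "60"; two distinct vehicles read "80".
# With a cluster threshold of 2, the ModelX cluster is down-weighted
# (3 × 0.5 = 1.5 < 2.0), so the second set's value "80" is accepted.
```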
The computer program product of claim 17, wherein the operations further comprise: \nassigning a first weight to the first content data based on the number of vehicles in the first set of vehicles exceeding the threshold value; and \nassigning a second weight to the second content data based on a number of second vehicles in the second set of vehicles failing to have the same make and model; \nwherein accepting the second content data as describing the content of the road sign is further based on the second weight being greater than the first weight. \n | 19. The computer program product of claim 17, wherein the ego vehicle includes an Advanced Driver Assistance System (ADAS system) and the ADAS system of the ego vehicle receives digital data describing the determination of the second content data of the road sign generated by the computer program product and uses the determination of the second content data of the road sign to control an ADAS function of the ego vehicle.\n | 20. The computer program product of claim 17, wherein the ego vehicle is an autonomous vehicle and the onboard system of the autonomous vehicle receives digital data describing the determination of the second content data of the road sign generated by the computer program product and uses the determination of the second content data of the road sign to autonomously control an operation of the autonomous vehicle.", "output": "The method involves determining that a first set of first vehicles has assessed the content for a traffic sign (160) as a first value, and determining that the content for the traffic sign is a second value assessed by a second set of second vehicles, based on the first vehicles. The first set is numerically greater than the second set, such that a majority of the group including the first vehicles and the second vehicles has assessed the content for the traffic sign as the first value. 
The method is performed by a vehicle-mounted computer system of an ego vehicle (123). INDEPENDENT CLAIMS are included for the following: (a) a system for recognizing traffic signs for networked vehicles; and (b) a computer program product for recognizing traffic signs for networked vehicles. Method for recognizing traffic signs for networked vehicles. The accuracy of the image recognition results of the ego vehicle is improved. The image recognition process is enhanced by correcting for artifact or pixel hyperbole. The control decisions are improved over the control decisions made by the ADAS system without the benefit of the digital data provided by the traffic sign system, since the traffic sign is accurately described by digital data. The drawing shows a schematic view of the operating environment for a traffic sign system. (Drawing includes non-English language text) 100 Operating environment; 120 Bus; 122 Networked device; 123 Ego vehicle; 160 Traffic sign", "instruction": "Please summarize the input" }, { "input": "ENHANCED VEHICLE-TO-EVERYTHING (V2X) COMMUNICATIONS USING A SATELLITE/AIRBORNE INTERFACEA system, method and apparatus for mobile communications including sidelink transmissions is provided. A user equipment (UE) maintains a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node. The UE switches between interfaces according to a current connectivity state and based on signal attributes associated with individual interfaces.|1-39. (canceled)\n | 40. 
A method of wireless communications, comprising: \nmaintaining, by a first user equipment (UE), a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node; \nswitching, by the first UE, between a first connectivity state and a second connectivity state based on at least one of: \na first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface; \na second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and \na third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface; \n wherein: \nthe first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state. \n | 41. The method of claim 40, wherein a signal strength attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal strength indicator (RSSI) or a reference signal received power (RSRP) of a reference signal received via a respective signal interface.\n | 42. 
The method of claim 40, wherein a signal quality attribute associated with at least one of the first signal attribute, the second signal attribute and the third signal attribute is based on a reference signal received quality (RSRQ) of a reference signal received via a respective interface.\n | 43. The method of claim 40 further comprising receiving configuration parameters of one or more reference signals for determining at least one of the first signal attribute, the second signal attribute or third signal attribute.\n | 44. The method of claim 40, wherein the switching from the first connectivity state to the second connectivity state is based on: \ncomparing the first signal attribute with a first threshold; \ncomparing the second signal attribute with a second threshold; and \ncomparing the third signal attribute with a third threshold. \n | 45. The method of claim 44, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.\n | 46. The method of claim 44 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first threshold, the second threshold and the third threshold.\n | 47. The method of claim 46, wherein the configuration parameters are received from the radio access network (RAN) infrastructure node and via the first interface.\n | 48. The method of claim 47, wherein receiving the configuration parameters includes receiving one or more radio resource control (RRC) messages.\n | 49. The method of claim 40, wherein the switching between the first connectivity state and the second connectivity state is further based on quality of service (QoS) requirements of data packets transmitted or received by the first UE.\n | 50. 
The method of claim 49, wherein: \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and \nthe first UE switches from the first connectivity state to the second connectivity state based on a determination that a latency associated with communications associated with the third interface matches a latency requirement of the data packets. \n | 51. The method of claim 49 further comprising determining, by the first user equipment (UE), the quality of service (QoS) requirements of the data packets based on priority tags of the data packets.\n | 52. The method of claim 51, wherein the priority tags of the data packets indicate that the data packets are associated with one or more of: \nsafety related information; \nfirefighters or government agencies; \nautonomous driving; and \ntraffic information. \n | 53. The method of claim 40 further comprising synchronizing the connectivity state of the first user equipment (UE) with a Global Navigation Satellite System (GNSS).\n | 54. The method of claim 40, wherein: \nthe maintaining the first interface, the second interface and the third interface comprises registering, by the first user equipment (UE), with a Core Network node, \nwherein a context of the first UE associated with the first interface, the second interface and the third interface is established in the Core Network. \n | 55. 
The method of claim 40, wherein the switching from the first connectivity state to the second connectivity state is further based on one or more of: \na first duration for which the first signal attribute maintains at least one of a first signal strength or signal quality requirement; \na second duration for which the second signal attribute maintains at least one of a second signal strength or signal quality requirement; and \na third duration for which the third signal attribute maintains at least one of a third signal strength or signal quality requirement. \n | 56. The method of claim 55, wherein: \nthe first user equipment (UE) communicates in accordance with the first interface and the second interface when the first UE is determined to be in the first connectivity state; \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state; and \nthe first UE switches from the first connectivity state to the second connectivity state when the first duration is smaller than a first duration threshold or the second duration is smaller than a second duration threshold, and the third duration is larger than a third duration threshold. \n | 57. The method of claim 56 further comprising receiving, by the first user equipment (UE), configuration parameters indicating the first duration threshold, the second duration threshold and the third duration threshold.\n | 58. 
A first user equipment (UE), comprising a controller configured to execute processes of: \nmaintaining, by a first user equipment (UE), a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node; \nswitching, by the first UE, between a first connectivity state and a second connectivity state based on at least one of: \na first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface; \na second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and \na third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface; \nwherein: \nthe first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state. \n | 59. The first UE of claim 58, wherein the switching from the first connectivity state to the second connectivity state is based on: \ncomparing the first signal attribute with a first threshold; \ncomparing the second signal attribute with a second threshold; and \ncomparing the third signal attribute with a third threshold. \n | 60. 
The first UE of claim 59, wherein the switching between the first connectivity state and the second connectivity state is further based on one or more trigger events.\n | 61. The first UE of claim 59 further comprising receiving, by the first UE, configuration parameters indicating the first threshold, the second threshold and the third threshold.\n | 62. The first UE of claim 61, wherein the configuration parameters are received from the radio access network (RAN) infrastructure node and via the first interface.\n | 63. The first UE of claim 62, wherein receiving the configuration parameters includes receiving one or more radio resource control (RRC) messages.\n | 64. The first UE of claim 58 further comprising synchronizing the connectivity state of the first user equipment (UE) with a Global Navigation Satellite System (GNSS).\n | 65. A method of wireless communications, comprising: \nsending parameters to a first user equipment (UE) for maintaining, by the first user equipment, a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node; \nsending a message to command the first UE to switch between a first connectivity state and a second connectivity state based on at least one of: \na first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface; \na second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and \na third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal 
strength or a signal quality associated with the third interface; \nwherein: \na base station receives the first signal attribute, the second signal attribute and the third signal attribute from the first UE; \nthe first UE communicates in accordance with at least one of the first interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state. \n | 66. A base station comprising a controller configured to execute processes of: \nsending parameters to a first user equipment (UE) for maintaining, by the first user equipment, a plurality of interfaces including a first interface between the first UE and a radio access network (RAN) infrastructure node, a second interface between the first UE and a second UE and a third interface between the first UE and a satellite node or an airborne node; \nsending a message to command the first UE to switch between a first connectivity state and a second connectivity state based on at least one of: \na first signal attribute associated with the first interface, the first signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the first interface; \na second signal attribute associated with the second interface, the second signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the second interface; and \na third signal attribute associated with the third interface, the third signal attribute determined based on a measurement of at least a signal strength or a signal quality associated with the third interface; \nwherein: \nthe base station receives the first signal attribute, the second signal attribute and the third signal attribute from the first UE; \nthe first UE communicates in accordance with at least one of the first 
interface and the second interface when the first UE is determined to be in the first connectivity state; and \nthe first UE communicates in accordance with the third interface when the first UE is determined to be in the second connectivity state.", "output": "The method involves maintaining a set of interfaces including a first interface between a user equipment (UE) (125A) and a radio access network (RAN) infrastructure node, a second interface between the UE and another UE and a third interface between the UE and a satellite node or airborne node. The UE is switched between a first connectivity state and a second connectivity state based on at least one of a first signal attribute associated with the first interface, a second signal attribute associated with the second interface and a third signal attribute associated with the third interface; the UE communicates in accordance with at least one of the first and second interfaces when the UE is determined to be in the first state, and communicates in accordance with the third interface when the UE is in the second state. Method for realizing wireless communications of a mobile communication system between a user equipment and a base station (all claimed) by a wireless communications system operator to provide services in residential, commercial or industrial settings such as IoT and industrial IoT (IIoT). Uses include but are not limited to a mobile network operator (MNO), a private network operator, a multiple system operator (MSO), an internet-of-things (IoT) network operator, voice, data, messaging, vehicular communications services such as vehicle-to-everything (V2X) communications services, safety services and mission-critical services, smartphones, tablets, laptops, computers, and wireless transmission and/or reception units in vehicles. The method enables computing devices to use the wireless communication network to interact with other devices that can access the network, or to interact, through the network, with devices utilizing other communication networks in an efficient manner. 
The drawing shows a schematic view of a mobile communication system.100Mobile communications 105Next generation RAN 115gNB 120ng-evolved Node B (ng-eNB) 125AUser equipment ", "instruction": "Please summarize the input" }, { "input": "AUTOMATED VALET PARKING SYSTEM AND CONTROL METHOD OF AUTOMATED VALET PARKING SYSTEMIt is an automatic parking system having a parking control server that parks an autonomous vehicle in a target parking space in a parking lot, wherein the parking control server includes a vehicle information acquisition unit for acquiring position information of the autonomous vehicle in the parking lot and position information of a general vehicle manually driven by a driver in the parking lot, a notification target vehicle detection unit that detects a first notification target vehicle, which is an autonomous driving vehicle that is a notification target for a server communication capable vehicle, and a notification unit for notifying the server communication capable vehicle of the existence of the first notification target vehicle when the first notification target vehicle is detected by the notification target vehicle detection unit.|1. 
An automatic parking system having a parking control server that parks a self-driving vehicle in a target parking space in a parking lot by instructing the self-driving vehicle in the parking lot, wherein the parking control server includes: a vehicle information acquisition unit for acquiring location information of the self-driving vehicle in the parking lot and location information of a general vehicle manually driven by a driver in the parking lot; a communication availability determination unit for determining whether the general vehicle is a server communication capable vehicle capable of communicating with the parking control server; a notification target vehicle detection unit configured to detect a first notification target vehicle, which is the self-driving vehicle that is a notification target for the server communication capable vehicle, based on the location information of the self-driving vehicle and the location information of the server communication capable vehicle; and a notification unit that, when the first notification target vehicle is detected by the notification target vehicle detection unit, notifies the server communication capable vehicle of the existence of the first notification target vehicle.\n | 2. The automatic parking system according to claim 1, wherein the notification target vehicle detection unit detects a second notification target vehicle, which is the self-driving vehicle that is a notification target for a server communication incapable vehicle, based on the location information of the self-driving vehicle and the location information of the server communication incapable vehicle, the server communication incapable vehicle being the general vehicle determined not to be a server communication capable vehicle by the communication availability determination unit, and when vehicle-to-vehicle communication between the server communication incapable vehicle and the second notification target vehicle is possible, the system connects vehicle-to-vehicle communication between the second notification target vehicle and the server communication incapable vehicle.\n | 3. The automatic parking system according to claim 2, wherein, for the second notification target vehicle connected to the server communication incapable vehicle by vehicle-to-vehicle communication, the notification unit instructs an approach notification reservation to notify the server communication incapable vehicle, through vehicle-to-vehicle communication, of an approach of the second notification target vehicle when an inter-vehicle distance between the server communication incapable vehicle and the second notification target vehicle falls below a distance threshold value.\n | 4. The automatic parking system according to any one of claims 1 to 3, wherein the parking control server further includes a stop instructing unit for stopping the self-driving vehicle until the general vehicle passes by, when the general vehicle is approaching from the rear of the self-driving vehicle or when the general vehicle is crossing in front of the self-driving vehicle.\n | 5. 
A control method of an automatic parking system having a parking control server for parking a self-driving vehicle in a target parking space in a parking lot by instructing the self-driving vehicle in the parking lot, the control method comprising: a vehicle information acquisition step of acquiring location information of the self-driving vehicle in the parking lot and location information of a general vehicle manually driven by a driver in the parking lot; a communication availability determination step of determining whether the general vehicle is a server communication capable vehicle capable of communicating with the parking control server; a first notification target vehicle detection step of detecting a first notification target vehicle, which is the self-driving vehicle that is a notification target for the server communication capable vehicle, based on the location information of the self-driving vehicle and the location information of the server communication capable vehicle; and a notification step of notifying the server communication capable vehicle of the existence of the first notification target vehicle when the first notification target vehicle is detected in the first notification target vehicle detection step.", "output": "The system (1) has a vehicle information acquisition unit that acquires positional information of the autonomous driving vehicle (2) in the parking place and positional information of a general vehicle (3) manually driven by a driver in the parking place. A communication availability determination unit determines whether or not the general vehicle is a server communicable vehicle that is able to communicate with the parking place control server. 
A notification target vehicle detection unit detects a first notification target vehicle that is the autonomous driving vehicle as a notification target for the server communicable vehicle based on the positional information of the autonomous driving vehicle. A notification unit is configured to notify the server communicable vehicle of presence of the first notification target vehicle when the notification target vehicle detection unit detects the first notification target vehicle. An INDEPENDENT CLAIM is included for a method for controlling automated valet parking system. Automated valet parking system. The system prevents the server communicable vehicle in the parking place from erroneously recognizing the autonomous driving vehicle as the general vehicle. The drawing shows the schematic diagram of an automated valet parking system.1Automated valet parking system 2Autonomous driving vehicle 3General vehicle 4Parking place sensor 10Communication unit ", "instruction": "Please summarize the input" }, { "input": "MODIFYING A VEHICULAR RADIO BASED ON A SCHEDULE OF POINT-TO-POINT VEHICULAR COMMUNICATIONSThe disclosure includes embodiments for modifying a vehicle-to-everything (V2X) radio of an ego vehicle that is a connected vehicle. In some embodiments, a method includes analyzing, by a machine learning module executed by a processor, a local dynamic map generated by the ego vehicle to determine schedule data describing a schedule for the ego vehicle to transmit a millimeter wave (mmWave) message to a remote vehicle. The method includes transmitting a V2X message including the schedule data for receipt by the remote vehicle so that the remote vehicle has access to the schedule. The method includes modifying an operation of the V2X radio of the ego vehicle based on the schedule so that the V2X radio transmits the mmWave message to the remote vehicle in compliance with the schedule. 
The method includes transmitting the mmWave message to the remote vehicle in compliance with the schedule.What is claimed is: \n | 1. A method for an ego vehicle, comprising: \ndetecting an intention of a first endpoint to exchange a millimeter wave (mmWave) message with a second endpoint; \ndetermining scenario data describing a scenario of one or more of the first endpoint and the second endpoint; \nrequesting a recommended beam alignment setting from a server based on the scenario data; \nreceiving feedback data describing the recommended beam alignment setting from the server; and \nmodifying an operation of a vehicle-to-everything (V2X) radio of the first endpoint based on the recommended beam alignment setting so that the V2X radio of the first endpoint exchanges the mmWave message with the second endpoint using the recommended beam alignment setting. \n | 2. The method of claim 1, wherein detecting the intention of the first endpoint to exchange the mmWave message with the second endpoint includes receiving a command from an autonomous driving system of the first endpoint to transmit the mmWave message to the second endpoint.\n | 3. The method of claim 1, wherein the scenario data describing the scenario is based on sensor data describing measurements of a physical environment proximate to the first endpoint.\n | 4. The method of claim 1, further comprising: \ngenerating a beam request message including the scenario data; and \ntransmitting the beam request message to the server via a V2X network; \nwherein the feedback data is based on the beam request message. \n | 5. The method of claim 4, wherein the beam request message causes the server to query a beam alignment database based on the scenario data and to generate the feedback data describing the recommended beam alignment setting as a query result.\n | 6. 
The method of claim 1, further comprising: \nmodifying an operation of the V2X radio of the second endpoint based on the recommended beam alignment setting to cause a beam of the V2X radio of the first endpoint to be aligned with a beam of the V2X radio of the second endpoint so that the V2X radio of the first endpoint and the V2X radio of the second endpoint exchange the mmWave message using the recommended beam alignment setting. \n | 7. The method of claim 1, further comprising: \ngenerating mmWave performance data related to an exchange of the mmWave message using the recommended beam alignment setting; and \nuploading the mmWave performance data to the server. \n | 8. A system comprising: \nan onboard vehicle computer system of an ego vehicle including a non-transitory memory storing computer code which, when executed by the onboard vehicle computer system, causes the onboard vehicle computer system to: \ndetect an intention of a first endpoint to exchange a millimeter wave (mmWave) message with a second endpoint; \ndetermine scenario data describing a scenario of one or more of the first endpoint and the second endpoint; \nrequest a recommended beam alignment setting from a server based on the scenario data; \nreceive feedback data describing the recommended beam alignment setting from the server; and \nmodify an operation of a vehicle-to-everything (V2X) radio of the first endpoint based on the recommended beam alignment setting so that the V2X radio of the first endpoint exchanges the mmWave message with the second endpoint using the recommended beam alignment setting. \n | 9. The system of claim 8, wherein detecting the intention of the first endpoint to exchange the mmWave message with the second endpoint includes receiving a command from an autonomous driving system of the first endpoint to transmit the mmWave message to the second endpoint.\n | 10. 
The system of claim 8, wherein the scenario data describing the scenario is based on sensor data describing measurements of a physical environment proximate to the first endpoint.\n | 11. The system of claim 8, wherein the computer code further causes the onboard vehicle computer system to: \ngenerate a beam request message including the scenario data; and \ntransmit the beam request message to the server via a V2X network; \nwherein the feedback data is based on the beam request message. \n | 12. The system of claim 11, wherein the beam request message causes the server to query a beam alignment database based on the scenario data and to generate the feedback data describing the recommended beam alignment setting as a query result.\n | 13. The system of claim 8, wherein the computer code further causes the onboard vehicle computer system to: \nmodify an operation of the V2X radio of the second endpoint based on the recommended beam alignment setting to cause a beam of the V2X radio of the first endpoint to be aligned with a beam of the V2X radio of the second endpoint so that the V2X radio of the first endpoint and the V2X radio of the second endpoint exchange the mmWave message using the recommended beam alignment setting. \n | 14. The system of claim 8, wherein the computer code further causes the onboard vehicle computer system to: \ngenerate mmWave performance data related to an exchange of the mmWave message using the recommended beam alignment setting; and \nupload the mmWave performance data to the server. \n | 15. 
A computer program product comprising a non-transitory memory of an onboard vehicle computer system of an ego vehicle storing computer-executable code that, when executed by a processor, causes the processor to: \ndetect an intention of a first endpoint to exchange a millimeter wave (mmWave) message with a second endpoint; \ndetermine scenario data describing a scenario of one or more of the first endpoint and the second endpoint; \nrequest a recommended beam alignment setting from a server based on the scenario data; \nreceive feedback data describing the recommended beam alignment setting from the server; and \nmodify an operation of a vehicle-to-everything (V2X) radio of the first endpoint based on the recommended beam alignment setting so that the V2X radio of the first endpoint exchanges the mmWave message with the second endpoint using the recommended beam alignment setting. \n | 16. The computer program product of claim 15, wherein detecting the intention of the first endpoint to exchange the mmWave message with the second endpoint includes receiving a command from an autonomous driving system of the first endpoint to transmit the mmWave message to the second endpoint.\n | 17. The computer program product of claim 15, wherein the scenario data describing the scenario is based on sensor data describing measurements of a physical environment proximate to the first endpoint.\n | 18. The computer program product of claim 15, wherein the computer-executable code further causes the processor to: \ngenerate a beam request message including the scenario data; and \ntransmit the beam request message to the server via a V2X network; \nwherein the feedback data is based on the beam request message. \n | 19. The computer program product of claim 18, wherein the beam request message causes the server to query a beam alignment database based on the scenario data and to generate the feedback data describing the recommended beam alignment setting as a query result.\n | 20. 
The computer program product of claim 15, wherein the computer-executable code further causes the processor to: \nmodify an operation of the V2X radio of the second endpoint based on the recommended beam alignment setting to cause a beam of the V2X radio of the first endpoint to be aligned with a beam of the V2X radio of the second endpoint so that the V2X radio of the first endpoint and the V2X radio of the second endpoint exchange the mmWave message using the recommended beam alignment setting.", "output": "The method involves detecting (501) intention of a first endpoint to exchange a millimeter wave message with a second endpoint. Scenario data describing scenario of the endpoints is determined (503). A recommended beam alignment setting is requested from a server based on the scenario data. Feedback data describing the recommended beam alignment setting requested from the server is received. Operation of vehicle-to-everything (V2X) radio of the first endpoint is modified based on recommended beam such that the V2Xradio exchanges the millimeter wave message with the second endpoint by using the recommended beam alignment setting. INDEPENDENT CLAIMS are included for:(1) a system for modifying a V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles; and(2) a computer program product comprising a non-transitory memory for storing computer-executable code to execute a method for modifying a V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles. Method for modifying a V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles. The method enables facilitating use of high data rate millimeter wave communications for application by optimizing scheduling of millimeter wave communication in environment in which vehicles share data received from sensors. 
The method allows a feedback loop to determine success of the millimeter wave message and update schedule based on success rate so that likelihood of target success of target millimeter wave message is increased. The drawing shows a sequential diagram illustrating a method for modifying V2X radio for millimeter wave communications based on schedule of point-to-point communications between vehicles.501Step for detecting intention of first endpoint to exchange millimeter wave message with second endpoint 503Step for determining scenario data describing scenario of endpoints 507Step for generating beam report message including scenario data and beam 508Step for transmitting beam report message to server via V2X network 509Step for receiving beam report message from ego vehicle ", "instruction": "Please summarize the input" }, { "input": "Ego-vehicles, systems, and methods for monitoring target objectsAn ego-vehicle for displaying a behavior of a target object in a spatio-temporal manner may include one or more processors. One or more memory modules are communicatively coupled to the one or more processors. A display is communicatively coupled to the one or more processors. One or more sensors are communicatively coupled to the one or more processors. Machine readable instructions are stored in the one or more memory modules and cause the one or more processors to display on the display an object indicator associated with a position of a target object relative to the ego-vehicle, wherein the object indicator depicts a spatio-temporal patterning indicating the behavior of the target object.What is claimed is: \n | 1. 
An ego-vehicle for displaying a behavior of a target object in a spatio-temporal manner, the ego-vehicle comprising: \none or more processors; \none or more memory modules communicatively coupled to the one or more processors; \na display communicatively coupled to the one or more processors; \none or more sensors communicatively coupled to the one or more processors; and \nmachine readable instructions stored in the one or more memory modules that cause the one or more processors to, based on a signal from the one or more sensors, display on the display an object indicator associated with a position of the target object relative to the ego-vehicle, wherein: \nthe object indicator depicts a spatio-temporal patterning indicating the behavior of the target object; and \nthe spatio-temporal patterning of the object indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to a speed of the target object. \n | 2. The ego-vehicle of claim 1, wherein the machine readable instructions cause the one or more processors to adjust the spatio-temporal patterning of the object indicator in both time and space to indicate the behavior of the target object at any given time.\n | 3. The ego-vehicle of claim 1, wherein: \nthe spatio-temporal patterning of the object indicator further indicates a direction the target object is traveling; and \nthe machine readable instructions further cause the one or more processors to move the object indicator across the display to correspond with a movement of the target object within a vicinity of the ego-vehicle. \n | 4. The ego-vehicle of claim 1, wherein the spatio-temporal patterning of the object indicator indicates the speed and direction of travel of the target object relative to a speed and direction of travel of the ego-vehicle.\n | 5. 
The ego-vehicle of claim 1, wherein the spatio-temporal patterning of the object indicator indicates an absolute speed and direction of travel of the target object.\n | 6. The ego-vehicle of claim 5, wherein: \nthe machine readable instructions further cause the one or more processors to display an ego-vehicle indicator on the display, wherein the ego-vehicle indicator depicts a spatio-temporal patterning indicating at least the absolute speed of the ego-vehicle; and \nthe object indicator is displayed adjacent to the ego-vehicle indicator at a position on the display corresponding to the position of the target object within a vicinity of the ego-vehicle. \n | 7. The ego-vehicle of claim 6, wherein: \nthe spatio-temporal patterning of the ego-vehicle indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to the absolute speed of the ego-vehicle and a direction of flow indicative of a direction of travel of the ego-vehicle; and \nthe at least one of the light pattern and waveform of the spatio-temporal patterning of the object indicator has a frequency proportional to the absolute speed of the target object and a direction of flow indicative of the direction of travel of the target object. \n | 8. The ego-vehicle of claim 1, wherein the light pattern and waveform of the spatio-temporal patterning of the object indicator has a direction of flow indicative of a direction of travel of the target object relative to a direction of travel of the ego-vehicle.\n | 9. The ego-vehicle of claim 1, wherein the frequency of the spatio-temporal patterning of the object indicator adjusts to correspond to changes in the speed of the target object.\n | 10. The ego-vehicle of claim 1, wherein the machine readable instructions further cause the one or more processors to identify a target object type, wherein the object indicator indicates the target object type.\n | 11. 
The ego-vehicle of claim 1, wherein the ego-vehicle is an autonomous vehicle.\n | 12. The ego-vehicle of claim 1, wherein the machine readable instructions further cause the one or more processors to: \nidentify road parameters, wherein the road parameters include at least a lane of a road; and \ndisplay the road parameters on the display, wherein the object indicator is displayed in the lane corresponding to the lane the target object is in. \n | 13. The ego-vehicle of claim 1, wherein the display is at least one of a heads-up display, an instrument cluster display, a navigation display, and a mobile device display.\n | 14. A system for displaying a behavior of a target object in a spatio-temporal manner relative to the system, the system comprising: \none or more processors; \none or more memory modules communicatively coupled to the one or more processors; \na display communicatively coupled to the one or more processors; \none or more sensors communicatively coupled to the one or more processors; and \nmachine readable instructions stored in the one or more memory modules that cause the one or more processors to, based on a signal from the one or more sensors, display on the display an object indicator associated with a position of the target object relative to the system, wherein: \nthe object indicator depicts a spatio-temporal patterning indicating the behavior of the target object relative to the system; and \nthe spatio-temporal patterning of the object indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to a speed of the target object. \n | 15. 
The system of claim 14, wherein: \nthe at least one of the light pattern and waveform of the spatio-temporal patterning of the object indicator has a direction of flow indicative of a direction of travel of the target object relative to a direction of travel of the system; and \nthe frequency of the spatio-temporal patterning adjusts to correspond to the changes in the speed and the direction of travel of the target object relative to the system. \n | 16. The system of claim 14, wherein the one or more sensors include at least one of a camera, LiDAR, RADAR, and vehicle-to-vehicle communication.\n | 17. The system of claim 14, wherein the machine readable instructions further cause the one or more processors to move the object indicator across the display to correspond with a movement of the target object within a vicinity of the system.\n | 18. A method of displaying a behavior of a target object in a spatio-temporal manner relative to an ego-vehicle comprising: \ndetecting, with one or more sensors, the target object in a vicinity of the ego-vehicle; \nmonitoring a speed, direction of travel, and position of the target object relative to the ego-vehicle; \ndisplaying, with one or more processors, an object indicator associated with the position of the target object relative to the ego-vehicle on a display, wherein: \nthe object indicator depicts a spatio-temporal patterning indicating the behavior of the target object; and \nthe spatio-temporal patterning of the object indicator comprises at least one of a light pattern and waveform, the at least one of the light pattern and waveform having a frequency proportional to a speed of the target object; and \nadjusting the spatio-temporal patterning of the object indicator on the display to correspond with the behavior of the target object within the vicinity of the ego-vehicle. \n | 19. 
The method of claim 18, wherein: \nthe at least one of the light pattern and waveform of the spatio-temporal patterning of the object indicator has a direction of flow indicative of a direction of travel of the target object relative to a direction of travel of the ego-vehicle. \n | 20. The method of claim 18, further comprising: \nidentifying road parameters, wherein road parameters include at least a lane of a road; and \ndisplaying virtual road parameters on the display, wherein: \nthe virtual road parameters include at least a virtual lane corresponding to the lane of the road; and \nthe object indicator of the target object is displayed in the virtual lane corresponding to the lane the target object is in.", "output": "The ego-vehicle (100) has processors, memory modules (106) communicatively coupled to processors, display (108) communicatively coupled to processors, sensors (120) communicatively coupled to processors, and machine readable instructions stored in memory modules that cause the processors to display an object indicator associated with position of target object (200) relative to ego-vehicle on the display, such that the object indicator depicts spatio-temporal patterning indicating the behavior of target object. The machine-readable instructions cause the processors to adjust spatio-temporal patterning of object indicator in both time and space to indicate behavior of the target object at any given time. The spatio-temporal patterning of object indicator indicates speed and direction of travel of target object relative to speed and direction of travel of ego-vehicle. The display is a heads-up display, an instrument cluster display, a navigation display, or a mobile device display. INDEPENDENT CLAIMS are also included for the following: a system for displaying behavior of target object in spatio-temporal manner relative to system; and a method of displaying behavior of target object in spatio-temporal manner relative to ego-vehicle. 
Ego-vehicle, such as autonomous vehicle, for displaying behavior of target object in spatio-temporal manner. Helps drivers of autonomous vehicles, who might only check road conditions periodically, to quickly and efficiently understand the motions of objects outside of their vehicle. The drawing shows a schematic diagram of the ego-vehicle illustrating a communication path. 100Ego-vehicle106Memory modules108Display120Sensors200Target object", "instruction": "Please summarize the input" }, { "input": "Vehicle systems and methods for presenting penetration metric information on a routeSystems for presenting dedicated short range communication (DSRC) penetration metric information on a route are provided. A system for presenting DSRC penetration metric information on a route includes a screen, one or more processors, one or more memory modules communicatively coupled to the one or more processors, and machine readable instructions stored in the one or more memory modules. When executed by the one or more processors, the machine readable instructions may cause the system to generate at least one route between a start location and a destination, receive, from a cloud server, DSRC penetration metric information related to the at least one route, select one route among the at least one route based on the DSRC penetration metric information, and display the selected route along with the DSRC penetration metric information for the selected route on the screen.What is claimed is: \n | 1. 
A vehicle system for presenting dedicated short range communication (DSRC) penetration metric information on routes, the system comprising: \na screen; \none or more processors; \none or more memory modules communicatively coupled to the one or more processors; and \nmachine readable instructions stored in the one or more memory modules that cause the vehicle system to perform at least the following when executed by the one or more processors: \ngenerate two or more routes between a start location and a destination; \nreceive, from a cloud server, DSRC penetration metric information related to each of the two or more routes, the DSRC penetration metric information including a penetration rate for each of the two or more routes; \nselect one route among the two or more routes based on the DSRC penetration metric information; and \ndisplay the selected route along with the DSRC penetration metric information for the selected route on the screen. \n | 2. The vehicle system of claim 1, wherein selecting one route among the two or more routes based on the DSRC penetration metric information comprises: \ncalculating a penetration score for each of the two or more routes based on the DSRC penetration metric information related to each of the two or more routes; and \nselecting a route with a highest penetration score among the two or more routes. \n | 3. The vehicle system of claim 1, wherein displaying the selected route along with the DSRC penetration metric information for the selected route on the screen comprises: \ndividing the selected route into a plurality of segments based on the penetration rate for each of the plurality of segments; and \ndisplaying each of the plurality of segments in a predetermined color based on the penetration rate. \n | 4. 
The vehicle system of claim 1, wherein displaying the selected route along with the DSRC penetration metric information for the selected route on the screen comprises: \ndividing the selected route into a plurality of segments based on the penetration rate for each of the plurality of segments; and \ndisplaying each of the plurality of segments with a predetermined identifier based on the penetration rate. \n | 5. The vehicle system of claim 1, wherein the DSRC penetration metric information comprises an autonomous vehicle rate.\n | 6. The vehicle system of claim 1, wherein the DSRC penetration metric information comprises historical DSRC penetration metric information.\n | 7. The vehicle system of claim 1, wherein the DSRC penetration metric information comprises real-time DSRC penetration metric information.\n | 8. The vehicle system of claim 1, wherein displaying the selected route along with the DSRC penetration metric information for the selected route on the screen comprises: \ndisplaying the selected route along with a penetration rate and an autonomous vehicle rate for the selected route. \n | 9. The vehicle system of claim 1, wherein the penetration rate comprises a percentage of vehicles equipped with DSRC.\n | 10. A cloud server for providing DSRC penetration metric information on a route, the cloud server comprising: \na database storing historical DSRC penetration metric information including a historical penetration rate for the route; \na penetration rate estimator configured to estimate DSRC penetration metric information on the route at least based on the historical DSRC penetration metric information related to the route; \na network interface configured to \nreceive the route from a vehicle; and \nprovide the estimated DSRC penetration metric information on the route to the vehicle, \nwherein the historical penetration rate is a historical percentage of vehicles equipped with DSRC functionality. \n | 11. 
The cloud server of claim 10, wherein the network interface is further configured to receive information from a plurality of vehicles driving on the route in real time, and the penetration rate estimator is configured to estimate DSRC penetration metric information further based on the information from the plurality of vehicles.\n | 12. The cloud server of claim 10, wherein the network interface is further configured to receive traffic information on the route in real time, and the penetration rate estimator is configured to estimate DSRC penetration metric information further based on the traffic information.\n | 13. The cloud server of claim 11, wherein the information from a plurality of vehicles includes information on whether or not the vehicle has vehicle-to-vehicle communication functionality.\n | 14. The cloud server of claim 13, wherein the vehicle-to-vehicle communication functionality comprises DSRC.\n | 15. The cloud server of claim 11, wherein the information from a plurality of vehicles includes information on whether or not the vehicle is driving in an autonomous mode.\n | 16. The cloud server of claim 11, wherein the information from a plurality of vehicles includes location information on the plurality of vehicles.\n | 17. The cloud server of claim 11, wherein the network interface is further configured to receive information from a plurality of vehicles through vehicle-to-infrastructure (V2I) communication.\n | 18. The cloud server of claim 11, wherein the database is updated in real time based on the information from the plurality of vehicles.\n | 19. 
The cloud server of claim 10, wherein the penetration rate estimator is configured to divide the route into one or more segments based on the historical DSRC penetration metric information and estimate DSRC penetration metric information on the one or more segments.", "output": "The system (300) has a screen, multiple processors (302) and multiple memory modules (306) communicatively coupled to the multiple processors. Machine readable instructions stored in the multiple memory modules cause the vehicle system to perform the following when executed by the multiple processors. DSRC penetration metric information related to at least one route is received from a cloud server. One route is selected among the at least one route based on the DSRC penetration metric information. The selected route is displayed along with the DSRC penetration metric information for the selected route on the screen. Vehicle system for presenting dedicated short range communication (DSRC) penetration metric information from a cloud server (claimed), for providing visual output such as maps, navigation, entertainment, and penetration metric information. The penetration rate estimator can utilize the real-time traffic information in estimating penetration metric information. The driver or an autonomous vehicle can drive with less probability of accidents by selecting the route with a high penetration rate because the driver or the autonomous vehicle can have an increased awareness of the environment with a higher DSRC penetration rate. The drawing shows a schematic view of a system. 300Vehicle system302Processor304Communication path306Memory module308Display", "instruction": "Please summarize the input" }, { "input": "AUTONOMOUS-MODE TRAFFIC LANE SELECTION BASED ON TRAFFIC LANE CONGESTION LEVELSA method and device for an autonomous vehicle control unit for traffic lane selection are disclosed. In operation, a present traffic lane in relation to each of a plurality of traffic lanes for a roadway is identified. 
A traffic congestion level is determined for the each of the plurality of traffic lanes, and compared with each other to determine a lowest-congested traffic lane of the plurality of traffic lanes. When the lowest-congested traffic lane is other than the present traffic lane, a traffic lane change command is generated that includes identifier data for an adjacent traffic lane having a lower traffic congestion level. The traffic lane change command is transmitted to effect a traffic lane change of the vehicle from the present traffic lane to an adjacent traffic lane.What is claimed is: \n | 1. A method in an autonomous vehicle control unit for traffic lane selection from a roadway having a plurality of traffic lanes in a common direction of travel, the method comprising: \nidentifying a present traffic lane in relation to each of the plurality of traffic lanes; \ndetermining a traffic congestion level for the each of the plurality of traffic lanes; \ncomparing the traffic congestion level for the each of the plurality of traffic lanes to determine a lowest-congested traffic lane of the plurality of traffic lanes; \nwhen the lowest-congested traffic lane is other than the present traffic lane: \ngenerating a traffic lane change command including identifier data for an adjacent traffic lane having a lower traffic congestion level; and \ntransmitting the traffic lane change command to effect a traffic lane change from the present traffic lane to the adjacent traffic lane. \n | 2. The method of claim 1, wherein the determining the traffic congestion level for the each of the plurality of traffic lanes comprising: \nsensing a vehicle positioned ahead along the common direction of travel; and \ndetermining a distance to the vehicle to produce the traffic congestion level. \n | 3. 
The method of claim 1, wherein the determining the traffic congestion level for the each of the plurality of traffic lanes comprising: \nsensing a vehicle positioned ahead along the common direction of travel; \ndetecting a closing of a longitudinal distance to the vehicle; and \ndetermining a rate of the closing of the longitudinal distance to the vehicle to produce the traffic congestion level. \n | 4. The method of claim 1, wherein the transmitting the traffic lane change command further comprising: \ntransmitting the traffic lane change command to a powertrain control unit; and \nbroadcasting the traffic lane change command. \n | 5. The method of claim 4, wherein the broadcasting the traffic lane change command comprising: \na vehicle-to-vehicle communication; and \na vehicle-to-infrastructure communication. \n | 6. A method in a vehicle control unit for traffic lane selection of a roadway for an autonomous vehicle operation, the method comprising: \ndetermining a traffic congestion condition for the roadway; \nwhen the traffic congestion condition exceeds a threshold, determining whether the roadway includes a plurality of traffic lanes for travel in a uniform travel direction; and \nwhen the roadway includes the plurality of traffic lanes: \nidentifying a present traffic lane in relation to each of the plurality of traffic lanes; \ndetermining a traffic congestion level for the each of the plurality of traffic lanes; \ncomparing the traffic congestion level for the each of the plurality of traffic lanes to determine whether the present traffic lane is a lowest-congested traffic lane of the plurality of traffic lanes; and \nwhen the lowest-congested traffic lane is other than the present traffic lane, traversing the roadway to the lowest-congested traffic lane by: \ngenerating a traffic lane change command identifying an adjacent traffic lane; and \ntransmitting the traffic lane change command to effect a lane change from the present traffic lane to the 
adjacent traffic lane. \n | 7. The method of claim 6, further comprising: \nwhen the lowest-congested traffic lane is other than the adjacent traffic lane, again traversing the roadway to the lowest-congested traffic lane by: \ngenerating another traffic lane change command including identifier data for a next adjacent traffic lane; and \ntransmitting the another traffic lane change command to effect a traffic lane change from the present traffic lane to the next adjacent traffic lane. \n | 8. The method of claim 6, wherein the determining the traffic congestion condition for the roadway comprising: \nretrieving location data; \nrequesting, based on the location data, map layer data including roadway information; \nreceiving, in response, the map layer data indicating a present traffic speed for the roadway relative to a free-flowing traffic speed; and \nprocessing the map layer data to produce the traffic congestion condition for the roadway. \n | 9. The method of claim 6, wherein the threshold indicates less than a free-flowing traffic speed for the roadway.\n | 10. The method of claim 6, wherein the traffic congestion condition for the roadway being based on sensing vehicle-to-vehicle communication levels.\n | 11. The method of claim 10, wherein the sensing the vehicle-to-vehicle communication levels including sensing a volume of vehicle-to-vehicle communication collisions.\n | 12. The method of claim 6, wherein the determining the traffic congestion condition for the roadway comprising: \nreceiving a vehicle-to-infrastructure communication message; \nretrieving message data from the vehicle-to-infrastructure communication message; \ndetermining a congestion value for the message data; and \nassigning the congestion value to the traffic congestion condition. \n | 13. 
The method of claim 6, wherein the determining whether the roadway includes the plurality of traffic lanes in the uniform travel direction comprising: \nretrieving location data; \nrequesting, based on the location data, map layer data including roadway information data; and \nreceiving, in response, the map layer data. \n | 14. The method of claim 13, wherein the map layer data comprises a Route Network Description File indicating an amount of the traffic lanes for the roadway.\n | 15. The method of claim 6, wherein the determining whether the roadway includes the plurality of traffic lanes in the uniform travel direction comprising: \nreceiving vehicle sensor data; \ndetermining roadway features based on the vehicle sensor data; \ninferring more than one traffic lane from the roadway features; and \ngenerating an initial estimate of traffic lane geometry. \n | 16. The method of claim 6, wherein the transmitting the traffic lane change command to effect the lane change from the present traffic lane to the adjacent traffic lane further comprising: \ntransmitting the traffic lane command to a powertrain control unit; and \nbroadcasting the traffic lane command. \n | 17. 
A vehicle control unit for traffic lane selection comprising: \na wireless communication interface to service communication with a vehicle network and user equipment of a vehicle user; \na processor coupled to the wireless communication interface, the processor for controlling operations of the vehicle control unit; and \na memory coupled to the processor, the memory for storing data and program instructions used by the processor, the processor configured to execute instructions stored in the memory to: \nidentify a present traffic lane in relation to each of the plurality of traffic lanes; \ndetermine a traffic congestion level for the each of the plurality of traffic lanes; \ncompare the traffic congestion level for the each of the plurality of traffic lanes to determine a lowest-congested traffic lane of the plurality of traffic lanes; and \nwhen the lowest-congested traffic lane is other than the present traffic lane: \ngenerate a traffic lane change command including identifier data for an adjacent traffic lane having a lower traffic congestion level; and \ntransmit the traffic lane change command to effect a traffic lane change from the present traffic lane to the adjacent traffic lane. \n | 18. The vehicle control unit of claim 17, wherein the processor being further configured to execute further instructions stored in the memory to determine the traffic congestion level for the each of the plurality of traffic lanes by: \nsensing a vehicle positioned ahead along the common direction of travel; and \ndetermining a distance to the vehicle to produce the traffic congestion level. \n | 19. 
The vehicle control unit of claim 17, wherein the processor being further configured to execute further instructions stored in the memory to determine the traffic congestion level for the each of the plurality of traffic lanes by: \nsensing a vehicle positioned ahead along the common direction of travel; \ndetecting a closing of a longitudinal distance to the vehicle; and \ndetermining a rate of the closing of the longitudinal distance to the vehicle to produce the traffic congestion level. \n | 20. The vehicle control unit of claim 17, wherein the processor being further configured to execute further instructions stored in the memory to transmit the traffic lane change command by: \ntransmitting the traffic lane command to a powertrain control unit; and \nbroadcasting the traffic lane command.", "output": "The method involves identifying a present traffic lane in relation to a set of traffic lanes. Traffic congestion level is determined for the traffic lanes. The traffic congestion level for the traffic lanes is compared to determine a lowest-congested traffic lane of the traffic lanes. A traffic lane change command (240) including identifier data is generated for an adjacent traffic lane having a lower traffic congestion level. The traffic lane change command is transmitted to effect a traffic lane change from the present traffic lane to the adjacent traffic lane. An INDEPENDENT CLAIM is also included for a vehicle control unit. Method for facilitating traffic lane selection from a roadway in a control unit (claimed) i.e. audio/visual control unit of an autonomous vehicle using a handheld mobile device. Uses include but are not limited to a passenger car, passenger lorry, semi-lorry, cargo van, emergency or first response vehicle and a transport vehicle, using a smart phone, personal digital assistant (PDA) device, tablet computer and e-reader. 
The method enables providing a vehicular interface for a driver to interact with vehicle systems, interactive displays, audio systems, voice recognition systems, buttons and dials and haptic feedback systems for inputting or outputting information. The method enables allowing vehicle-to-infrastructure communications to broadcast traffic stoppage points and provide advance indication to the autonomous vehicle control unit about oncoming traffic congestion, beacons and vehicle-to-infrastructure devices to gather local traffic information and local traffic congestion, and broadcast the gathered data. The method enables using a light detection and ranging (LIDAR) to determine distance between a sensor input device and an object with a high degree of accuracy due to moving of light at a constant speed. The method enables using a touch screen for providing visual information and detecting presence and location of a tactile input upon a surface of or adjacent to the display. The drawing shows a schematic block diagram of a vehicle control unit in a context of a vehicle network environment. 200Autonomous vehicle control unit201Vehicle network environment202Head unit device240Traffic lane change command248Powertrain Control Unit", "instruction": "Please summarize the input" }, { "input": "METHOD AND APPARATUS FOR WEATHER SUPPORT FOR AN AUTONOMOUS VEHICLEA method and an apparatus for weather support are provided. In an embodiment, a request from a user to check weather information is received, and trip information provided by the request is also identified. The weather information in response to the request is retrieved and output to the user. In another embodiment, user information of the user and the trip information of the trip are captured. The weather information corresponding to the trip is retrieved and output to the user. In addition, trip suggestions are provided to the user. 
In yet another embodiment, the trip information is retrieved and available routes are identified corresponding to the trip. Further, respective weather information, respective traffic information, and respective physical information of each of the available routes are retrieved. Route for the trip is determined based on the respective weather information, the respective physical information and the respective traffic information of each of the available routes.What is claimed is: \n | 1. A method for weather support, comprising: \nreceiving, via interface circuitry of an apparatus installed in a vehicle, a request to check weather information; \nidentifying, by processing circuitry of the apparatus, an expected location and expected time of a trip provided by the request, retrieving the weather information corresponding to the trip, and outputting the weather information and trip suggestions in response to the request; \ncapturing, via the interface circuitry, user information and trip information of the trip, retrieving the weather information corresponding to the trip, outputting the weather information corresponding to the trip, and providing trip suggestions to the user according to the user information, the trip information and the weather information; and \nretrieving, via the interface circuitry, the trip information of the trip, identifying a plurality of available routes associated with the trip, retrieving respective weather information of each of the plurality of available routes, retrieving respective physical information of each of the plurality of available routes, retrieving respective traffic information of each of the plurality of available routes, determining a route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes, and outputting the determined route for the trip. \n | 2. 
The method of claim 1, further comprising: \nreceiving the request from the user through a microphone; \nidentifying the expected location and the expected time of the trip provided by the request through a voice recognition technique; \nretrieving the weather information corresponding to the trip through at least one of a weather website, an application, and a data store; \noutputting the weather information in response to the request through a speaker or a display device; and \nproviding the trip suggestions according to the weather information through the speaker or the display device. \n | 3. The method of claim 2, wherein the retrieving the weather information corresponding to the trip comprises: \nretrieving weather information on a specific day or at specific time; \nretrieving weather information for a duration of days or a period of time; \nretrieving weather information for a specific location; and \nretrieving current weather information at current location. \n | 4. The method of claim 2, wherein the providing the trip suggestions comprises: \nproviding suggestions on trip supplies; and \nproviding suggestions on trip safety. \n | 5. 
The method of claim 1, further comprising: \ncapturing the user information of the user through a camera, the user information including trip supplies that the user prepares for the trip and dress that the user wears; \ncapturing the trip information through at least one of a navigation system installed in the vehicle, a portable communication device of the user, and the request input via a microphone by the user, the trip information including the expected location and the expected time of the trip; \nretrieving the weather information corresponding to the trip through at least one of a weather website, an application, and a data store; \noutputting the weather information corresponding to the trip through a speaker or a display device; and \nproviding the trip suggestions to the user on the trip supplies and trip safety through the speaker or the display device according to the captured user information, the captured trip information and the retrieved weather information. \n | 6. The method of claim 5, wherein the capturing the user information of the user is operated through at least one of image recognition, pattern recognition, feature recognition, and signal recognition.\n | 7. The method of claim 5, further comprising: \ntraining a machine learning algorithm based on the captured user information, and deploying the trained machine learning algorithm to identify similar user information in a future event. \n | 8. 
The method of claim 1, further comprising: \nretrieving the trip information of the trip through at least one of a navigation system installed in the vehicle, a portable communication device of the user, and the request input via a microphone by the user; \nidentifying the plurality of available routes associated with the trip through a map database; \nretrieving the respective weather information of each of the plurality of available routes through at least one of a weather website, an application, and/or a data store; \nretrieving the respective physical information of each of the plurality of available routes through at least one of a vehicle to infrastructure system, a cloud-based system, an application, and/or a data store; \nretrieving the respective traffic information of each of the plurality of available routes through at least one of a traffic system, a cloud-based system, an application, and a data store; \ndetermining the route for the trip from the available routes based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes; and \noutputting the route for the trip to at least one of a speaker, a display device, and a navigation system. \n | 9. 
The method of claim 8, wherein the determining the route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the available routes comprising: \nselecting routes having no weather hazard or least weather hazard from the plurality of available routes based on the respective weather information of each of the plurality of available routes; \nselecting routes having no traffic issues or fewest traffic issues from the routes that have no weather hazard or least weather hazards based on the respective traffic information of each of the plurality of available routes; \nselecting routes having no physical issues or fewest physical issues from the routes that have no traffic issues or fewest traffic issues based on the respective physical information of each of the plurality of available routes; and \ndetermining the route for the trip from the routes that have no physical issues or the fewest physical issues based on at least one of total driving time, driving costs, or driving distance. \n | 10. The method of claim 8, further comprising: \noutputting the determined route for the trip to the navigation system and automatically controlling driving of the vehicle through a control unit according to the determined route that is output to the navigation system. \n | 11. 
An apparatus for weather support, comprising: \ninterface circuitry configured to transmit messages within the apparatus, and between the apparatus and external devices; and \nprocessing circuitry configured to \nreceive, via the interface circuitry, a request from a user to check weather information, identify an expected location and expected time of a trip provided by the request, retrieve the weather information corresponding to the trip, and output the weather information and trip suggestions in response to the request; \ncapture, via the interface circuitry, user information and trip information of the trip, retrieve the weather information corresponding to the trip, output the weather information corresponding to the trip, and provide trip suggestions to the user according to the user information, the trip information and the weather information; and \nretrieve, via interface circuitry, the trip information of the trip, identify a plurality of available routes associated with the trip, retrieve respective weather information of each of the plurality of available routes, retrieve respective physical information of each of the plurality of available routes, retrieve respective traffic information of each of the plurality of available routes, determine a route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes, and output the determined route for the trip. \n | 12. 
The apparatus of claim 11, wherein the processing circuitry is further configured to: \nreceive the request from the user through a microphone; \nidentify the expected location and the expected time of the trip provided by the request through a voice recognition technique; \nretrieve the weather information corresponding to the trip through at least one of a weather web site, an application, and/or a data store; \noutput the weather information in response to the request through a speaker or a display device; \nprovide the trip suggestions according to the weather information through the speaker or the display device. \n | 13. The apparatus of claim 12, wherein the processing circuitry is further configured to: \nretrieve the weather information on a specific day or at specific time; \nretrieve the weather information for a duration of days or a period of time; \nretrieve the weather information for a specific location; and \nretrieve current weather information at a current location. \n | 14. The apparatus of claim 12, wherein the processing circuitry is further configured to: \nprovide suggestions on trip supplies; and \nprovide suggestions on trip safety. \n | 15. 
The apparatus of claim 11, wherein the processing circuitry is further configured to: \ncapture the user information of the user through a camera, the user information including trip supplies that the user prepares for the trip and dress that the user wears; \ncapture the trip information through at least one of a navigation system installed in the vehicle, a portable communication device of the user, or the request input via a microphone by the user, the trip information including the expected location and the expected time of the trip; \nretrieve the weather information corresponding to the trip through at least one of a weather web site, an application, and a data store; \noutput the weather information corresponding to the trip through a speaker or a display device; and \nprovide the trip suggestions to the user on the trip supplies and trip safety through the speaker or the display device according to the captured user information, the captured trip information and the retrieved weather information. \n | 16. The apparatus of claim 15, wherein the processing circuitry is further configured to \ntrain a machine learning algorithm based on the captured user information, and deploy the trained machine learning algorithm to identify similar user information in a future event. \n | 17. 
The apparatus of claim 11, wherein the processing circuitry is further configured to \nretrieve the trip information of the trip through at least one of a navigation system installed in the vehicle, a portable communication device of the user, or the request input via a microphone by the user; \nidentify the plurality of available routes associated with the trip through a map database; \nretrieve the respective weather information of each of the plurality of available routes through at least one of a weather web site, an application, and a data store; \nretrieve the respective physical information of each of the plurality of available routes through at least one of a vehicle to infrastructure system, a cloud-based system, an application, and a data store; \nretrieve the respective traffic information of each of the plurality of available routes through at least one of a traffic system, a cloud-based system, an application, and a data store; \ndetermine the route for the trip from the plurality of available routes based on the respective weather information, the respective physical information and the respective traffic information of each of the available routes; and \noutput the route for the trip to at least one of a speaker, a display device, and a navigation system. \n | 18. 
The apparatus of claim 17, wherein the processing circuitry is further configured to: \nselect routes having no weather hazard or least weather hazards from the plurality of available routes based on the respective weather information of each of the plurality of available routes; \nselect routes having no traffic issues or fewest traffic issues from the routes that have no weather hazard or least weather hazards based on the respective traffic information of each of the plurality of available routes; \nselect routes having no physical issues or fewest physical issues from the routes that have no traffic issues or fewest traffic issues based on the respective physical information of each of the plurality of available routes; and \ndetermine the route for the trip from the routes that have no physical issues or fewest physical issues based on at least one of total driving time, driving costs, or driving distance. \n | 19. The apparatus of claim 17, wherein the processing circuitry is further configured to: \noutput the determined route for the trip to the navigation system and automatically control driving of the vehicle through a control unit according to the determined route that is output to the navigation system. \n | 20. 
A non-transitory computer readable storage medium having instructions stored thereon that when executed by processing circuitry causes the processing circuitry to perform operations, the operations comprising: \nreceiving a request from a user to check weather information, identifying an expected location and expected time of a trip provided by the request, retrieving the weather information corresponding to the trip, and outputting the weather information and trip suggestions in response to the request; \ncapturing user information of the user and trip information of the trip, retrieving the weather information corresponding to the trip, outputting the weather information corresponding to the trip, and providing trip suggestions to the user according to the user information, the trip information and the weather information; and \nretrieving the trip information of the trip, identifying a plurality of available routes associated with the trip, retrieving respective weather information of each of the plurality of available routes, retrieving respective physical information of each of the plurality of available routes, retrieving respective traffic information of each of the plurality of available routes, determining a route for the trip based on the respective weather information, the respective physical information and the respective traffic information of each of the plurality of available routes, and outputting the determined route for the trip.", "output": "The method involves receiving a request to check the weather information by an interface circuitry of an apparatus, which is installed in a vehicle. The expected location and the expected time of a trip provided by the request are identified by a processing circuitry of the apparatus. The weather information corresponding to the trip is retrieved, where the weather information and the trip suggestions in response to the request are outputted. 
The user information and the trip information of the trip are captured by the interface circuitry, where the weather information corresponding to the trip is retrieved. The weather information corresponding to the trip is outputted, where the trip suggestions are provided to a user according to the user information. INDEPENDENT CLAIMS are included for the following: an apparatus with a processing circuitry; and a non-transitory computer readable storage medium for storing the instructions executed by a processing circuitry to perform the operations. Method for providing the weather support. The request to check the weather information is received by an interface circuitry of an apparatus, which is installed in a vehicle, and hence ensures providing the weather information to a user of a vehicle in an effective manner and provides the route selections based on the weather information, and also avoids the weather hazard, thus identifies the similar user information in a future event accurately and promptly. The drawing shows a schematic view of an apparatus for the weather support. 100Weather support apparatus100AInterface group100BProcessing group102Camera104Audio input device106Audio output device108Communication device", "instruction": "Please summarize the input" }, { "input": "Route modification to continue fully-autonomous drivingA system and method for route modification to continue fully-autonomous driving is provided. 
The method includes operating a vehicle in a Level 3 autonomous driving mode according to a determined route; collecting data in real time concerning the route ahead of the vehicle; based on the collected data, identifying areas of the route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode; modifying the route based on the identified areas to continue operating in the Level 3 autonomous driving mode; and operating the vehicle in a Level 3 autonomous driving mode according to the modified route.What is claimed is: \n | 1. A vehicle comprising: \na processor; and \na non-transitory machine-readable storage medium encoded with instructions executable by the processor, the machine-readable storage medium comprising instructions to cause the processor to perform a method comprising: \noperating the vehicle in a Level 3 autonomous driving mode according to a first route; \ncollecting data in real time concerning the first route ahead of the vehicle; \nbased on the collected data, identifying areas of the first route ahead of the vehicle that would cause the vehicle to leave the Level 3 autonomous driving mode, the identified areas comprising a portion of a first lane of the first route from which a safe automatic stop of the vehicle cannot be performed; \nmodifying the first route to a second route by a system of the vehicle, wherein the second route is based on selecting a second lane to avoid the first lane comprising the portion from which the safe automatic stop of the vehicle cannot be performed, and wherein the second route allows the vehicle to continue operating in the Level 3 autonomous driving mode along the second route; and \noperating the vehicle in a Level 3 autonomous driving mode according to the second route. \n | 2. 
The vehicle of claim 1, wherein identifying areas of the first route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode comprises: \nidentifying areas of the first route ahead of the vehicle where a safe automatic stop of the vehicle cannot be performed. \n | 3. The vehicle of claim 1, wherein the method further comprises: \nmodifying the first route based on the identified areas to continue operating in the Level 3 autonomous driving mode according to a constraint, wherein the constraint comprises at least one of a maximum trip time, a maximum trip mileage, a maximum increase in trip drive time, a maximum increase in trip mileage, a maximum percentage increase in trip drive time, a maximum percentage increase in trip mileage, and a desired time of arrival at a destination. \n | 4. The vehicle of claim 1, wherein the method further comprises: \npresenting the second route to an occupant of the vehicle; and \nmodifying the first route to the second route only after receiving a confirmation of the second route from the occupant. \n | 5. The vehicle of claim 1, wherein the method further comprises: \ncollecting the data in real time using at least one of sensors on the vehicle, communication with other vehicles, a map database, and a position of the vehicle. \n | 6. The vehicle of claim 1, wherein the portion of the first lane of the first route from which the safe automatic stop of the vehicle cannot be performed comprises a travel lane with no shoulder.\n | 7. The vehicle of claim 1, wherein the portion of the first lane of the first route from which the safe automatic stop of the vehicle cannot be performed comprises a construction zone.\n | 8. The vehicle of claim 1, wherein the data collected in real time is received from a pre-mapped database.\n | 9. The vehicle of claim 1, wherein the data collected in real time is received from vehicle-to-vehicle communication.\n | 10. 
The vehicle of claim 1, wherein the data collected in real time is real-time traffic information.\n | 11. A non-transitory machine-readable storage medium encoded with instructions executable by a hardware processor of a computing component of a vehicle, the machine-readable storage medium comprising instructions to cause the hardware processor to perform a method comprising: \noperating the vehicle in a Level 3 autonomous driving mode according to a first route; \ncollecting data in real time concerning the first route ahead of the vehicle; \nbased on the collected data, identifying areas of the route ahead of the vehicle that would cause the vehicle to leave the Level 3 autonomous driving mode, the identified areas comprising a portion of a first lane of the first route from which a safe automatic stop of the vehicle cannot be performed; \nmodifying the first route to a second route by a system of the vehicle, wherein the second route is based on selecting a second lane to avoid the first lane comprising the portion from which the safe automatic stop of the vehicle cannot be performed, and wherein the second route allows the vehicle to continue operating in the Level 3 autonomous driving mode along the second route; and \noperating the vehicle in a Level 3 autonomous driving mode according to the second route. \n | 12. The medium of claim 11, wherein identifying areas of the first route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode comprises: \nidentifying areas of the first route ahead of the vehicle where a safe automatic stop of the vehicle cannot be performed. \n | 13. 
The medium of claim 11, wherein the method further comprises: \nmodifying the first route based on the identified areas to continue operating in the Level 3 autonomous driving mode according to a constraint, wherein the constraint comprises at least one of a maximum trip time, a maximum trip mileage, a maximum increase in trip drive time, a maximum increase in trip mileage, a maximum percentage increase in trip drive time, a maximum percentage increase in trip mileage, and a desired time of arrival at a destination. \n | 14. The medium of claim 11, wherein the method further comprises: \npresenting the second route to an occupant of the vehicle; and \nmodifying the first route to the second route only after receiving a confirmation of the second route from the occupant. \n | 15. The medium of claim 11, wherein the method further comprises: \ncollecting the data in real time using at least one of sensors on the vehicle, communication with other vehicles, a map database, and a position of the vehicle. \n | 16. 
A method for operating a vehicle, the method comprising: \noperating the vehicle in a Level 3 autonomous driving mode according to a first route; \ncollecting data in real time concerning the first route ahead of the vehicle; \nbased on the collected data, identifying areas of the first route ahead of the vehicle that would cause the vehicle to leave the Level 3 autonomous driving mode, the identified areas comprising a portion of a first lane of the first route from which a safe automatic stop of the vehicle cannot be performed; \nmodifying the first route to a second route by a system of the vehicle, wherein the second route is based on selecting a second lane to avoid the first lane comprising the portion from which the safe automatic stop of the vehicle cannot be performed, and wherein the second route allows the vehicle to continue operating in the Level 3 autonomous driving mode along the second route; and \noperating the vehicle in a Level 3 autonomous driving mode according to the second route. \n | 17. The method of claim 16, wherein identifying areas of the first route ahead of the vehicle that would require the vehicle to leave the Level 3 autonomous driving mode comprises: \nidentifying areas of the first route ahead of the vehicle where a safe automatic stop of the vehicle cannot be performed. \n | 18. The method of claim 16, further comprising: \nmodifying the first route based on the identified areas to continue operating in the Level 3 autonomous driving mode according to a constraint, wherein the constraint comprises at least one of a maximum trip time, a maximum trip mileage, a maximum increase in trip drive time, a maximum increase in trip mileage, a maximum percentage increase in trip drive time, a maximum percentage increase in trip mileage, and a desired time of arrival at a destination. \n | 19. 
The method of claim 16, further comprising: \npresenting the second route to an occupant of the vehicle; and \nmodifying the first route to the second route only after receiving a confirmation of the second route from the occupant.", "output": "The vehicle (102) comprises a processor, and a non-transitory machine-readable storage medium is encoded with instructions executable by the processor, and the vehicle is operated in a level-3 autonomous driving mode according to a determined route. The data is collected in real time concerning the route ahead of the vehicle. Areas of the route ahead of the vehicle that would require the vehicle to leave the level-3 autonomous driving mode are identified based on the collected data. The route is modified based on the identified areas to continue operating in the level-3 autonomous driving mode. The vehicle is operated in a level-3 autonomous driving mode according to the modified route. INDEPENDENT CLAIMS are included for the following: a non-transitory machine-readable storage medium having stored instructions for implementing the method for operating a vehicle; and a method for operating a vehicle. Vehicle, such as autonomous vehicle. Driver can be ready to take full control of the vehicle as in manual mode. Route control circuit can receive information from multiple vehicle sensors to determine whether the route control mode should be activated. The drawing shows a block diagram of vehicle. 28Differential gear device30Axles32Crankshaft34Wheels102Vehicle", "instruction": "Please summarize the input" }, { "input": "Parking assistance control for vehicle with autonomous operation capabilityProvided is a parking assistance method and device for a vehicle capable of autonomous operation. The embodiment herein operates to receive a parking zone, which is based on a destination location and a user-defined parking parameter, and includes a plurality of parking locations. 
When on approach to the parking zone under an autonomous vehicle operation, the parking assistance determines whether the parking zone includes at least one parking location that is physically available for parking the vehicle. When the parking zone does not, the parking assistance prompts the vehicle to engage in a holding pattern for a predetermined period of time. While in the holding pattern under the autonomous operation, the parking assistance periodically determines whether the at least one parking location becomes available. When the predetermined period of time lapses, parking status of the vehicle is transmitted to a vehicle user.What is claimed is: \n | 1. A method in a parking assistance control unit for a vehicle capable of autonomous operation, the method comprising: \nreceiving a parking zone based on a destination location and a user-defined parking parameter, the parking zone including a plurality of parking locations; \nwhen on approach to the parking zone under the autonomous operation, determining whether the parking zone includes at least one parking location of the plurality of parking locations that is physically available for parking the vehicle; \nwhen the parking zone does not include the at least one parking location that is physically available for parking the vehicle: \nprompting the vehicle to engage in a holding pattern for a predetermined period of time, wherein the holding pattern includes at least one of a dynamic holding pattern indicative of placing the vehicle in motion and a stationary holding pattern indicative of placing the vehicle in a stopped state; \nwhile in the holding pattern under the autonomous operation, periodically determining whether the at least one parking location becomes available; \nwhen the predetermined period of time lapses, transmitting a parking status of the vehicle based on a result of the periodically determining whether the at least one parking location becomes available. \n | 2. 
The method of claim 1, wherein the determining whether the parking zone includes at least one parking location that is physically available for parking the vehicle further comprises at least one of: \nreceiving a vehicle-to-vehicle communications relating to the at least one parking location; \nreceiving a vehicle-to-infrastructure communications relating to the at least one parking location; and \nsensing through a plurality of sensor devices at a periphery of the vehicle relating to the at least one parking location. \n | 3. The method of claim 1, wherein the holding pattern comprises: \na combinational stationary and dynamic holding pattern. \n | 4. The method of claim 1, wherein the holding pattern is based on traffic trends related to the parking zone.\n | 5. The method of claim 4, wherein the traffic trends include at least one of: \na traffic density assessment; and \na local venue activity assessment. \n | 6. The method of claim 1, wherein the holding pattern is based on a vehicle restriction, the vehicle restriction including at least one of: \na vehicle range limitation; and \na maneuverability limitation of the vehicle. \n | 7. 
The method of claim 1, further comprising: \ndetermining whether the predetermined period of time has elapsed; and \nwhen the period of time has elapsed: \nreceiving a secondary parking zone based on the destination location and the user-defined parking parameter; \nwhen on approach to the secondary parking zone under the autonomous operation, determining whether the secondary parking zone includes at least one parking location that is physically available for parking the vehicle; and \nwhen the secondary parking zone does not include the at least one parking location: \nprompting the vehicle to engage in another holding pattern for another period of time; \nwhile in the holding pattern under the autonomous operation, periodically determining whether the secondary parking zone includes the at least one parking location or another at least one parking location; and \ntransmitting another parking status of the vehicle. \n | 8. The method of claim 1, wherein the parking zone is defined by a graphic user interface of the vehicle.\n | 9. 
A parking assistance control unit for a vehicle comprising: \na wireless communication interface to service communication with user equipment of a vehicle user; \na plurality of sensor devices disposable about the vehicle; \none or more processors coupled to the wireless communication interface and in communication with the plurality of sensor devices, the one or more processors for controlling operations of the parking assistance control unit; \na memory coupled to the one or more processors, the memory for storing data and program instructions used by the one or more processors, wherein the one or more processors are configured to execute instructions stored in the memory to: \nreceive a parking zone based on a destination location and a user-defined parking parameter, the parking zone including a plurality of parking locations; \nwhen on approach to the parking zone under the autonomous operation, determine whether the parking zone includes at least one parking location being physically available for parking the vehicle via sensor data of the plurality of sensor devices; and \nwhen the parking zone does not include the at least one parking location that is physically available for parking the vehicle: \nprompt the vehicle to engage in a holding pattern for a predetermined period of time, wherein the holding pattern includes at least one of a dynamic holding pattern indicative of placing the vehicle in motion and a stationary holding pattern indicative of placing the vehicle in a stopped state; \nwhile in the holding pattern under the autonomous operation, periodically determining whether the at least one parking location becomes available; and \nwhen the predetermined period of time lapses, transmit via the wireless communication interface to the user equipment of the vehicle user a parking status of the vehicle based on a result of the periodically determining whether the at least one parking location becomes available. \n | 10. 
The parking assistance control unit of claim 9, wherein the one or more processors are further configured to execute further instructions stored in the memory to determine whether the parking zone includes at least one parking location that is physically available for parking the vehicle further comprising: \nvehicle-to-vehicle communication relating to the at least one parking location; and \nvehicle-to-infrastructure communication relating to the at least one parking location. \n | 11. The parking assistance control unit of claim 10, wherein the holding pattern comprises: \na combinational stationary and dynamic holding pattern. \n | 12. The parking assistance control unit of claim 10, wherein the holding pattern is based on traffic trends related to the parking zone.\n | 13. The parking assistance control unit of claim 12, wherein the traffic trends include at least one of: \na traffic density assessment; and \na local venue activity assessment. \n | 14. The parking assistance control unit of claim 10, wherein the holding pattern is based on a vehicle restriction, the vehicle restriction including at least one of: \na vehicle range limitation; and \na maneuverability limitation of the vehicle. \n | 15. 
The parking assistance control unit of claim 10, wherein the one or more processors are further configured to execute further instructions stored in the memory to: \ndetermine whether the predetermined period of time has elapsed; and \nwhen the predetermined period of time has elapsed: \nreceive a secondary parking zone based on the destination location and the user-defined parking parameter; \nwhen on approach to the secondary parking zone under the autonomous operation, determine whether the secondary parking zone includes at least one parking location that is physically available for parking the vehicle; and \nwhen the secondary parking zone does not include the at least one parking location: \nprompt the vehicle to engage in another holding pattern for another period of time; \nwhile in the holding pattern under the autonomous operation, periodically determine whether the secondary parking zone includes the at least one parking location or another at least one parking location; and \ntransmit another parking status of the vehicle. \n | 16. The parking assistance control unit of claim 9, wherein the parking zone is defined by a graphic user interface of the vehicle.\n | 17. 
A parking assistance device comprising: \none or more sensor devices disposable about a vehicle, the one or more sensor devices configured to monitor surroundings relative to the vehicle; \na wireless communication interface operable to service communications; and \na computing device coupled to the wireless communication interface and in communication with the one or more sensor devices, the computing device including: \none or more processors, the one or more processors for controlling operations of the parking assistance device; \na memory coupled to the one or more processors, the memory for storing data and program instructions used by the one or more processors, wherein the one or more processors are configured to execute instructions stored in the memory to: \nreceive a parking zone based on a destination location and a user-defined parking parameter, the parking zone including a plurality of parking locations; \nwhen on approach to the parking zone under the autonomous operation, determine whether the parking zone includes at least one parking location that is physically available for parking the vehicle via sensor data of the one or more sensor devices; and \nwhen the parking zone does not include the at least one parking location that is physically available for parking the vehicle: \nprompt the vehicle to engage in a holding pattern for a predetermined period of time, wherein the holding pattern includes at least one of a dynamic holding pattern indicative of placing the vehicle in motion and a stationary holding pattern indicative of placing the vehicle in a stopped state; \nwhile in the holding pattern under the autonomous operation, periodically determining whether the at least one parking location becomes available; and \nwhen the predetermined period of time lapses, transmit via the wireless communication interface, a parking status of the vehicle based on a result of the periodically determining whether the at least one parking location becomes 
available. \n | 18. The parking assistance device of claim 17, wherein the holding pattern comprises: \na combinational stationary and dynamic holding pattern. \n | 19. The parking assistance device of claim 17, wherein the holding pattern is based on traffic trends related to the parking zone.\n | 20. The parking assistance device of claim 19, wherein the traffic trends include at least one of: \na traffic density assessment; and \na local venue activity assessment.", "output": "The method involves receiving a parking zone based on a destination location and a user-defined parking parameter, where the parking zone is provided with multiple parking locations. Whether the parking zone includes a parking location, from the multiple parking locations, that is physically available for parking a vehicle (100) is determined on approach to the parking zone under an autonomous operation. The vehicle is prompted to engage in a holding pattern for a predetermined period of time when the parking zone is not provided with the parking location. INDEPENDENT CLAIMS are also included for the following: a parking assistance control unit with a wireless communication interface; and a parking assistance device with a computing device. Method for a parking assistance control unit (Claimed) for a vehicle for an autonomous operation. The parking zone is received based on a destination location and a user-defined parking parameter, where the parking zone is provided with multiple parking locations, and hence ensures improving the positional accuracy with the autonomous parking features and permits the flexibility to a vehicle user in an event in an easy manner. The drawing shows a schematic view of a vehicle with a parking assistance control unit. 
100Vehicle102,104Sensor devices106a,106bVideo sensor devices200Parking assistance control unit220Antenna", "instruction": "Please summarize the input" }, { "input": "SYSTEMS AND METHODS FOR DYNAMIC ROAD SIGN PERSONALIZATIONSystems and methods are provided for presenting personalized information to one or more vehicle occupants as a vehicle approaches and/or passes a road sign. Characteristics of a vehicle, e.g., operating conditions, characteristics of the one or more vehicle occupants, e.g., age, demographic, driving record, purchase history, etc. and/or road conditions, e.g., weather-related road conditions, current traffic conditions, etc., may be obtained. Information relevant to the one or more vehicle occupants based on one or more of these characteristics/conditions may presented to the one or more vehicle occupants. In scenarios where the presentation of personalized information is unwanted/unwarranted, more generalized information and/or safety warnings or recommendations can be presented.What is claimed is: \n | 1. A method comprising: \ndetecting proximity of a vehicle to a road sign; \nobtaining data regarding at least one of operating characteristics of the vehicle and characteristics of one or more occupants of the vehicle; \ndetermining current traffic conditions proximate to the road sign; \nupon a determination that personalized information should not be presented via the road sign based on the current traffic conditions, presenting generalized information on the road sign; and \nupon a determination that personalized information should be presented via the road sign, presenting the personalized information on the road sign. \n | 2. 
The method of claim 1, wherein obtaining data regarding the operating characteristics of the vehicle comprises communicating with at least one of an electronic control unit and one or more sensors of the vehicle over a vehicle-to-infrastructure communications channel to obtain the operating characteristics of the vehicle.\n | 3. The method of claim 1, wherein the operating characteristics of the vehicle comprise at least one of vehicle speed, vehicle acceleration, travel origin, travel destination, lane change history, and continuous operation time.\n | 4. The method of claim 1, wherein obtaining data regarding characteristics of one or more occupants of the vehicle comprises receiving at least one of biometric sensor signals from biometric sensors associated with the vehicle, obtaining vehicle associated account information regarding the one or more occupants of the vehicle, and obtaining information suggestive of at least one of the one or more occupants' travel preferences, travel history, purchase preferences, purchase history, and demographic information associated with the one or more occupants of the vehicle.\n | 5. The method of claim 1, wherein the current traffic conditions upon which the determination that personalized information should not be presented comprise traffic conditions that result in non-intended recipients of the personalized information receiving the personalized information.\n | 6. The method of claim 1, wherein the generalized information is based upon at least one of operating characteristics of a group of vehicles proximate to the road sign, common characteristics of one or more occupants of each vehicle in the group of vehicles, and road conditions data applicable to the group of vehicles.\n | 7. 
The method of claim 1, further comprising prior to presenting the personalized information on the road sign, determining to present the personalized information on another road sign, based upon a change in at least one of visibility of the road sign, speed of travel of the vehicle, direction of travel of the vehicle, and lane change.\n | 8. A method comprising: \ndetecting proximity of a vehicle to a digital billboard; \nobtaining operating characteristics of the vehicle; \nobtaining characteristics of at least one occupant of the vehicle; \ndetermining at least a current speed of travel of the vehicle; \nestimating whether first, targeted media for presentation on the digital billboard selected based upon the operating characteristics of the vehicle and the characteristics of the at least one occupant of the vehicle could be generated and viewed by the at least one occupant based upon the current speed of travel of the vehicle; and if the first, targeted media for presentation on the digital billboard cannot be generated for and viewed by the at least one occupant, presenting second, generalized media on the digital billboard based upon operational characteristics of the vehicle. \n | 9. The method of claim 8, wherein the at least one occupant is a driver of the vehicle.\n | 10. The method of claim 9, wherein the second, generalized media comprises at least one of a safety warning and a safe driving recommendation based upon the characteristics of the driver.\n | 11. 
The method of claim 8, further comprising: \ndetecting proximity of at least one other vehicle to the digital billboard; \nobtaining operating characteristics of the at least one other vehicle; \nobtaining characteristics of at least one additional occupant of the at least one other vehicle; \ndetermining at least a current speed of travel of the at least one other vehicle; \nestimating whether the first, targeted media for presentation on the digital billboard could be generated and viewed by the at least one occupant of each of the vehicles proximate to the digital billboard based upon the current speeds of travel of each of the vehicles; and if the first, targeted media for presentation on the digital billboard cannot be generated for and viewed by the at least one occupant of each of the vehicles proximate to the digital billboard, presenting second, generalized media on the digital billboard based upon aggregate operational characteristics of each of the vehicles proximate to the digital billboard. \n | 12. The method of claim 8, further comprising determining whether the vehicle has declined targeted media presentation, and upon a determination that the vehicle has declined targeted media presentation, controlling the digital billboard to generate at least one of a blank and non-informational presentation.\n | 13. The method of claim 8, further comprising determining whether the obtained operating characteristics of the vehicle comprise a vehicle setting opting out of targeted media for presentation on the digital billboard.\n | 14. 
A system, comprising: \nat least one processor; and \nat least one memory unit operatively connected to the processor, the at least one memory unit having stored thereon, at least one computer program comprising computer code causing the at least one processor to perform the following: \nobtain operating characteristics of a vehicle detected as being proximate to a digital billboard; \nobtain characteristics of at least one occupant of the vehicle; \ndetermine at least a current speed of travel of the vehicle; \ntransmit the obtained operating characteristics of the vehicle, the characteristics of the at least one occupant, and the current speed of travel of the vehicle to a controller controlling operation of the digital billboard, the controller adapted to estimate whether first, targeted media for presentation on the digital billboard selected based upon the characteristics of the at least one occupant and the operating characteristics of the vehicle could be generated and viewed by the at least one occupant based upon the current speed of travel of the vehicle; and if the first, targeted media for presentation on the digital billboard cannot be generated for and viewed by the at least one occupant, controlling the digital billboard to present second, generalized media on the digital billboard based upon operational characteristics of the vehicle. \n | 15. The system of claim 14, wherein at least one of the operating characteristics of the vehicle and the characteristics of the at least one occupant of the vehicle are obtained by a vehicle-to-infrastructure (V2I) capable roadside unit.\n | 16. 
The system of claim 15, wherein the V2I capable roadside unit obtains the at least one of the operating characteristics of the vehicle and the characteristics of the at least one occupant of the vehicle from at least one of an electronic control unit of the vehicle, one or more sensors implemented in or associated with the vehicle, and a database in which at least one aspect of the characteristics of the at least one occupant are stored.\n | 17. The system of claim 14, wherein the controller controlling operation of the digital billboard is at least one of remotely located from the digital billboard and co-located with the digital billboard.\n | 18. The system of claim 14, wherein the vehicle comprises an autonomous vehicle, and wherein the controller controlling operation of the digital billboard initiates transmission of a speed override command to the autonomous vehicle to reduce its speed to a speed allowing the targeted media to be viewed by the at least one occupant.", "output": "The method involves detecting proximity of a vehicle (110) to a road sign (104). Data regarding one of operating characteristics of the vehicle and characteristics of occupants of the vehicle is obtained. Current traffic conditions proximate to the road sign are determined. Generalized information is presented on the road sign upon a determination that personalized information should not be presented through the road sign based on the current traffic conditions. The personalized information is presented on the road sign upon a determination that personalized information should be presented through the road sign. An INDEPENDENT CLAIM is also included for a system for obtaining operating characteristics of vehicles and/or occupants proximate to a road sign. Method for obtaining operating characteristics of vehicles and/or occupants proximate to a road sign i.e. digital billboard. 
Uses include but are not limited to operating conditions such as weather-related road conditions and current traffic conditions and/or age, demographic, driving record and purchase history. The method enables providing redundancy and/or multiple sources of information that is compared or used as a way to verify a validity of received information, thus increasing accuracy of information. The drawing shows a schematic view of a driving scenario. 100Roadway102Roadside unit104Road sign106Roadside unit110,114Vehicles", "instruction": "Please summarize the input" }, { "input": "RESOLVING VEHICLE APPLICATION VERSION DIFFERENCESThe disclosure includes embodiments for resolving vehicle application version differences for connected vehicles. A method includes determining a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying version differences in a common vehicle application installed in both the ego vehicle and the remote connected vehicle. The method includes forming a vehicular micro cloud. The method includes determining a maximum possible functionality of the common vehicle application. The method includes determining a set of tasks to be completed to achieve the maximum possible functionality. The method includes assigning a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle. The method includes using the vehicular micro cloud to cause the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset.What is claimed is: \n | 1. 
A method executed by a processor of an ego vehicle, the method comprising: \ndetermining a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying a difference in a first version of the common vehicle application installed in the remote connected vehicle and a second version of the common vehicle application installed in the ego vehicle; \nforming a vehicular micro cloud including the ego vehicle and the remote connected vehicle responsive to the difference being identified; \ndetermining, by a processor of the ego vehicle, a maximum possible functionality of the common vehicle application; \ndetermining a set of tasks to be completed to achieve the maximum possible functionality; \nassigning a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle, wherein collectively the first subset and the second subset include each of the tasks in the set of tasks; and \nusing the vehicular micro cloud to cause the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset so that the maximum possible functionality of the common application is achieved. \n | 2. The method of claim 1, wherein the common vehicle application includes a vehicle control system.\n | 3. The method of claim 1, wherein the common vehicle application includes an autonomous driving system.\n | 4. The method of claim 1, wherein the common vehicle application includes an Advanced Driver Assistance System.\n | 5. The method of claim 1, wherein the maximum possible functionality includes the functionality of a most recently released version of the common vehicle application.\n | 6. The method of claim 1, wherein the maximum possible functionality includes less functionality than a most recently released version of the common vehicle application.\n | 7. 
The method of claim 1, wherein the vehicular micro cloud provides functionality that benefits each member of the vehicular micro cloud.\n | 8. The method of claim 1, further comprising parsing V2X data from a wireless message received from the ego vehicle, wherein the V2X data includes remote application data that describes a first set of applications installed in the remote connected vehicle and first version information for the first set of applications, wherein ego application data describes a second set of applications installed in the ego vehicle and second version information for the second set of applications.\n | 9. The method of claim 8, wherein the common vehicle application is determined based on comparison of the remote application data and the ego application data.\n | 10. The method of claim 8, wherein the V2X data includes member data describing a first hardware capability of the remote vehicle.\n | 11. The method of claim 10, wherein the maximum possible functionality of the common vehicle application is determined by the processor of the ego vehicle based on one or more of: the first hardware capability of the remote vehicle; a second hardware capability of the ego vehicle; the first version information; and the second version information.\n | 12. The method of claim 1, wherein the vehicular micro cloud includes each member of the vehicular micro cloud sharing their unused computing resources with one another to complete the set of tasks for a benefit of each member of the vehicular micro cloud.\n | 13. The method of claim 12, wherein the benefit includes reducing a risk of a collision by resolving vehicle application version differences among members of the vehicular micro cloud.\n | 14. 
A system of an ego vehicle comprising: \na communication unit; \na non-transitory memory; \nand a processor communicatively coupled to the communication unit and the non-transitory memory, wherein the non-transitory memory stores computer readable code that is operable, when executed by the processor, to cause the processor to execute steps including: \ndetermining a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying a difference in a first version of the common vehicle application installed in the remote connected vehicle and a second version of the common vehicle application installed in the ego vehicle; \nforming a vehicular micro cloud including the ego vehicle and the remote connected vehicle responsive to the difference being identified; \ndetermining, by a processor of the ego vehicle, a maximum possible functionality of the common vehicle application; \ndetermining a set of tasks to be completed to achieve the maximum possible functionality; \nassigning a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle, wherein collectively the first subset and the second subset include each of the tasks in the set of tasks; and \ncausing the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset so that the maximum possible functionality of the common application is achieved. \n | 15. The system of claim 14, further comprising parsing V2X data from a wireless message received from the ego vehicle, wherein the V2X data includes remote application data that describes a first set of applications installed in the remote connected vehicle and first version information for the first set of applications, wherein ego application data describes a second set of applications installed in the ego vehicle and second version information for the second set of applications.\n | 16. 
The system of claim 15, wherein the common vehicle application is determined based on comparison of the remote application data and the ego application data.\n | 17. The system of claim 15, wherein the V2X data includes member data describing a first hardware capability of the remote vehicle.\n | 18. The system of claim 17, wherein the maximum possible functionality of the common vehicle application is determined by the processor of the ego vehicle based on one or more of: the first hardware capability of the remote vehicle; a second hardware capability of the ego vehicle; the first version information; and the second version information.\n | 19. A computer program product of an ego vehicle including computer code stored on a non-transitory memory that is operable, when executed by an onboard vehicle computer of the ego vehicle, to cause the onboard vehicle computer to execute operations including: \ndetermine a common vehicle application that is installed in both a remote connected vehicle and an ego vehicle and identifying a difference in a first version of the common vehicle application installed in the remote connected vehicle and a second version of the common vehicle application installed in the ego vehicle; \nform a vehicular micro cloud including the ego vehicle and the remote connected vehicle responsive to the difference being identified; \ndetermine a maximum possible functionality of the common vehicle application; \ndetermine a set of tasks to be completed to achieve the maximum possible functionality; \nassign a first subset of the set of tasks to the ego vehicle and a second subset of the set of tasks to the remote connected vehicle, wherein collectively the first subset and the second subset include each of the tasks in the set of tasks; and \ncause the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset so that the maximum possible functionality of the common application is achieved. \n | 20. 
The computer program product of claim 19, wherein the non-transitory memory stores additional computer code that is operable, when executed by the onboard vehicle computer, to cause the onboard vehicle computer to execute additional operations including: \nparse V2X data from a wireless message received from the ego vehicle, wherein the V2X data includes remote application data that describes a first set of applications installed in the remote connected vehicle and first version information for the first set of applications, wherein ego application data describes a second set of applications installed in the ego vehicle and second version information for the second set of applications; and \ncompare the remote application data and the ego application data to identify the common vehicle application based on the comparison.", "output": "The method involves determining a common vehicle application that is installed in both a remote connected vehicle (124) and an ego vehicle. A vehicular micro cloud (194) including the ego vehicle and the remote vehicle is formed responsive to a version difference being identified. A maximum possible functionality of the application is determined. A set of tasks to be completed is determined to achieve the functionality. Two subsets of the tasks are assigned to the vehicles, respectively. The micro cloud is used to cause the vehicles to complete the subsets, so that the functionality is achieved by a processor of the vehicle. INDEPENDENT CLAIMS are included for: (1) A system of an ego vehicle comprising a communication unit. (2) A computer program product of an ego vehicle including computer code stored on a non-transitory memory that is operable to cause the onboard vehicle computer to execute operations for resolving vehicle application version differences for connected vehicles. Method for resolving vehicle application version differences for connected vehicles. 
The product enables resolving vehicle application version differences for connected vehicles in an effective manner. The product allows the ego vehicle to complete the first subset and the remote connected vehicle to complete the second subset, so that the maximum possible functionality of the common application is achieved in an efficient manner. The drawing shows a schematic block diagram of an operating environment for a resolver system.103Server 124Remote connected vehicle 150Standard-compliant GPS unit 194Vehicular micro cloud 199Resolver system ", "instruction": "Please summarize the input" }, { "input": "SYSTEMS AND METHODS FOR VEHICULAR-NETWORK-ASSISTED FEDERATED MACHINE LEARNINGSystems and methods for vehicular-network-assisted federated machine learning are disclosed herein. One embodiment transmits first metadata from a connected vehicle to at least one other connected vehicle; receives, at the connected vehicle, second metadata from the at least one other connected vehicle; receives, at the connected vehicle based on analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in the current training phase of a federated machine learning process; receives, at the connected vehicle, instructions to prepare the connected vehicle for the next training phase; trains a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and submits the locally trained machine learning model for aggregation with at least one other locally trained machine learning model to produce an aggregated locally trained machine learning model.What is claimed is: \n | 1. 
A system for vehicular-network-assisted federated machine learning, the system comprising: \none or more processors; and \na memory communicably coupled to the one or more processors and storing: \na federated learning module including instructions that when executed by the one or more processors cause the one or more processors to: \ntransmit, after initiation of a federated machine learning process, first metadata from a connected vehicle to at least one other connected vehicle, wherein the connected vehicle and the at least one other connected vehicle are members of a vehicular micro cloud of networked vehicles and the first metadata includes information regarding sensor capabilities of the connected vehicle; \nreceive, at the connected vehicle, second metadata from the at least one other connected vehicle, the second metadata including information regarding sensor capabilities of the at least one other connected vehicle; \nreceive, at the connected vehicle based on an analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in a current training phase of the federated machine learning process; and \nreceive, at the connected vehicle, instructions to prepare the connected vehicle for a next training phase of the federated machine learning process; \na training module including instructions that when executed by the one or more processors cause the one or more processors to train a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and \nan aggregation module including instructions that when executed by the one or more processors cause the one or more processors to submit the locally trained machine learning model for aggregation with at least one other locally trained machine learning model produced by at least one other elected vehicle in the vehicular micro cloud to produce an aggregated locally 
trained machine learning model. \n | 2. The system of claim 1, wherein the first and second metadata include information regarding at least one of availability of particular types of sensors, capabilities of specific available sensors, quality of the specific available sensors, and an itinerary that includes scheduled stops.\n | 3. The system of claim 1, wherein the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process include one or more of a speed advisory, a lane-change request, and a rerouting request.\n | 4. The system of claim 1, wherein the connected vehicle receives the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process through coordination among a cloud leader of the vehicular micro cloud and at least one other cloud leader of a different vehicular micro cloud within a same vehicular macro cloud.\n | 5. The system of claim 1, wherein the analysis of the first and second metadata is performed at one of a cloud server and an edge server, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.\n | 6. The system of claim 1, wherein the analysis of the first and second metadata is performed at the connected vehicle and the at least one other connected vehicle in a distributed fashion, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.\n | 7. The system of claim 1, wherein the connected vehicle and the at least one other connected vehicle communicate via one or more vehicle-to-vehicle (V2V) communication links.\n | 8. 
The system of claim 1, wherein the aggregated locally trained machine learning model is further aggregated with at least one other aggregated locally trained machine learning model from another vehicular micro cloud at one of a cloud server and an edge server in preparation for the next training phase of the federated machine learning process.\n | 9. The system of claim 1, wherein the connected vehicle is an autonomous vehicle.\n | 10. A non-transitory computer-readable medium for vehicular-network-assisted federated machine learning and storing instructions that when executed by one or more processors cause the one or more processors to: \ntransmit, after initiation of a federated machine learning process, first metadata from a connected vehicle to at least one other connected vehicle, wherein the connected vehicle and the at least one other connected vehicle are members of a vehicular micro cloud of networked vehicles and the first metadata includes information regarding sensor capabilities of the connected vehicle; \nreceive, at the connected vehicle, second metadata from the at least one other connected vehicle, the second metadata including information regarding sensor capabilities of the at least one other connected vehicle; \nreceive, at the connected vehicle based on an analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in a current training phase of the federated machine learning process; \nreceive, at the connected vehicle, instructions to prepare the connected vehicle for a next training phase of the federated machine learning process; \ntrain a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and \nsubmit the locally trained machine learning model for aggregation with at least one other locally trained machine learning model produced by at least one other elected vehicle in the 
vehicular micro cloud to produce an aggregated locally trained machine learning model. \n | 11. The non-transitory computer-readable medium of claim 10, wherein the first and second metadata include information regarding at least one of availability of particular types of sensors, capabilities of specific available sensors, quality of the specific available sensors, and an itinerary that includes scheduled stops.\n | 12. The non-transitory computer-readable medium of claim 10, wherein the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process include one or more of a speed advisory, a lane-change request, and a rerouting request.\n | 13. A method of vehicular-network-assisted federated machine learning, the method comprising: \ntransmitting, after initiation of a federated machine learning process, first metadata from a connected vehicle to at least one other connected vehicle, wherein the connected vehicle and the at least one other connected vehicle are members of a vehicular micro cloud of networked vehicles and the first metadata includes information regarding sensor capabilities of the connected vehicle; \nreceiving, at the connected vehicle, second metadata from the at least one other connected vehicle, the second metadata including information regarding sensor capabilities of the at least one other connected vehicle; \nreceiving, at the connected vehicle based on an analysis of the first and second metadata, a notification that the connected vehicle has been elected to participate in a current training phase of the federated machine learning process; \nreceiving, at the connected vehicle, instructions to prepare the connected vehicle for a next training phase of the federated machine learning process; \ntraining a machine learning model to perform a task at the connected vehicle during the current training phase to produce a locally trained machine learning model; and \nsubmitting the locally 
trained machine learning model for aggregation with at least one other locally trained machine learning model produced by at least one other elected vehicle in the vehicular micro cloud to produce an aggregated locally trained machine learning model. \n | 14. The method of claim 13, wherein the first and second metadata include information regarding at least one of availability of particular types of sensors, capabilities of specific available sensors, quality of the specific available sensors, and an itinerary that includes scheduled stops.\n | 15. The method of claim 13, wherein the instructions to prepare the connected vehicle for the next training phase of the federated machine learning process include one or more of a speed advisory, a lane-change request, and a rerouting request.\n | 16. The method of claim 13, wherein the analysis of the first and second metadata is performed at one of a cloud server and an edge server, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.\n | 17. The method of claim 13, wherein the analysis of the first and second metadata is performed at the connected vehicle and the at least one other connected vehicle in a distributed fashion, the analysis of the first and second metadata resulting in the connected vehicle being elected to participate in the current training phase of the federated machine learning process.\n | 18. The method of claim 13, wherein the connected vehicle and the at least one other connected vehicle communicate via one or more vehicle-to-vehicle (V2V) communication links.\n | 19. The method of claim 13, wherein the task is one of object detection, object recognition, and trajectory prediction.\n | 20. 
The method of claim 13, wherein the aggregated locally trained machine learning model is further aggregated with at least one other aggregated locally trained machine learning model from another vehicular micro cloud at one of a cloud server and an edge server in preparation for the next training phase of the federated machine learning process.", "output": "The system has a memory communicably coupled to processors (110) and storing a federated learning module (170) including instructions that when executed by the processors cause the processors to transmit metadata from a connected vehicle (180) i.e. autonomous vehicle, to another connected vehicle after initiation of a federated machine learning process. The processors receive second metadata from the latter connected vehicle at the former connected vehicle, where the second metadata includes information regarding sensor capabilities of the latter vehicle. An aggregation module submits a locally trained machine learning model for aggregation with another locally trained machine learning model produced by the latter elected vehicle in a vehicular micro cloud to produce an aggregated locally trained machine learning model. INDEPENDENT CLAIMS are included for the following: Non-transitory computer-readable medium for vehicular-network-assisted federated machine learning and storing instructions Method of vehicular-network-assisted federated machine learning System for vehicular-network-assisted federated machine learning used in autonomous vehicles and intelligent driving-assistance systems for recognizing objects from red, green, blue (RGB) images or predicting the trajectory of external road agents such as vehicles, cyclists, pedestrians, etc. The method enables training a machine-learning model to optimize parameters to maximize accuracy of predictions. 
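As a concrete illustration of the training-round flow in the claims above (metadata exchange, election based on sensor capabilities, local training, then aggregation), consider the sketch below. It is illustrative only, not the patented implementation: the names (`Vehicle`, `elect_participants`, `federated_average`) and the element-wise averaging rule (FedAvg-style) are assumptions, since the claims do not prescribe a concrete aggregation algorithm.

```python
# Hypothetical sketch of one federated round inside a vehicular micro cloud.
from dataclasses import dataclass, field

@dataclass
class Vehicle:
    vid: str
    sensors: set                                      # exchanged metadata: available sensor types
    local_model: list = field(default_factory=list)   # locally trained model weights

def elect_participants(vehicles, required_sensors):
    """Analyze the exchanged metadata; elect vehicles whose sensors cover the task."""
    return [v for v in vehicles if required_sensors <= v.sensors]

def federated_average(models):
    """Aggregate locally trained models by element-wise averaging (FedAvg-style)."""
    n = len(models)
    return [sum(ws) / n for ws in zip(*models)]

micro_cloud = [
    Vehicle("car-a", {"camera", "lidar"}, [1.0, 2.0]),
    Vehicle("car-b", {"camera"},          [9.0, 9.0]),  # lacks lidar: not elected
    Vehicle("car-c", {"camera", "lidar"}, [3.0, 4.0]),
]
elected = elect_participants(micro_cloud, {"camera", "lidar"})
aggregated = federated_average([v.local_model for v in elected])
print([v.vid for v in elected], aggregated)  # ['car-a', 'car-c'] [2.0, 3.0]
```

Per claim 20, the resulting `aggregated` model could then itself be averaged with aggregates from other micro clouds at a cloud or edge server before the next round.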
The method allows a cloud computing environment to offer infrastructure, platforms and/or software as services for which a cloud consumer does not need to maintain resources on a local computing device. The drawing shows a block diagram of the connected vehicle.110Processors 170Federated learning module 180Connected vehicle 185Server 190Network ", "instruction": "Please summarize the input" }, { "input": "V2V CHARGE MAPPINGA system for electrical charging of a first vehicle by a second vehicle includes a network access device to communicate with a first source that includes at least one of the first vehicle or a mobile device associated with a user of the first vehicle. The system further includes a processor coupled to the network access device that is designed to receive a charge request from the first source via the network access device, the charge request requesting access to a source of electrical energy for charging the first vehicle. The processor is further designed to identify an available vehicle that is available to be used as the source of electrical energy for charging the first vehicle. The processor is further designed to control the network access device to transmit available vehicle information corresponding to the available vehicle to the first source in response to receiving the charge request.What is claimed is: \n | 1. 
A system for electrical charging of a first vehicle by a second vehicle, the system comprising: \na network access device configured to communicate with a first source that includes at least one of the first vehicle or a mobile device associated with a user of the first vehicle; and \na processor coupled to the network access device and configured to: \nreceive a charge request from the first source via the network access device, the charge request requesting access to a source of electrical energy for charging the first vehicle, \nidentify an available vehicle that is available to be used as the source of electrical energy for charging the first vehicle, and \ncontrol the network access device to transmit available vehicle information corresponding to the available vehicle to the first source in response to receiving the charge request. \n | 2. The system of claim 1 wherein the available vehicle information includes a location of the available vehicle.\n | 3. The system of claim 1 wherein the processor is further configured to: \ndetermine digital key information corresponding to the available vehicle; and \ntransmit the digital key information to the first source to allow access to a charge port of the available vehicle. \n | 4. The system of claim 1 wherein the processor is further configured to: \ndetermine digital key information corresponding to the first vehicle; and \ntransmit digital key information to the available vehicle or to a second mobile device associated with a second user of the available vehicle to allow access to a charge port of the first vehicle. \n | 5. The system of claim 1 wherein the processor is further configured to: \nidentify a plurality of available vehicles including the available vehicle; \ncontrol the network access device to transmit additional available vehicle data corresponding to each of the plurality of available vehicles; and \nreceive a selection of the available vehicle from the first source. \n | 6. 
The system of claim 1 wherein the processor is further configured to: \nidentify a plurality of available vehicles including the available vehicle; and \nselect the available vehicle from the plurality of available vehicles based on at least one of: \na current location of the first vehicle and current locations of each of the plurality of available vehicles; \na state of charge (SOC) of each of the plurality of available vehicles; or \na current route of the first vehicle. \n | 7. The system of claim 1 wherein: \nthe available vehicle is autonomous or semi-autonomous; and \nthe processor is further configured to control the network access device to at least one of provide navigation instructions from a current location of the available vehicle to a current location of the first vehicle, or provide the current location of the first vehicle to the available vehicle such that the second vehicle can autonomously travel to the current location of the first vehicle to charge the first vehicle. \n | 8. The system of claim 1 wherein at least one of the first vehicle or the available vehicle is autonomous or semi-autonomous, and includes at least one of a wireless charging port configured to wirelessly transmit or receive the electrical energy, or an actuator configured to actuate a charging contact to make contact with a corresponding charging contact of the other of the first vehicle or the available vehicle.\n | 9. 
The system of claim 1 further comprising a memory configured to store preferences for each of a plurality of available vehicles including the available vehicle, the preferences including at least one of a minimum state of charge (SOC) corresponding to a minimum SOC for each of the plurality of available vehicles, geographical area information corresponding to a geographical area to which each of the plurality of available vehicles is available to travel to charge the first vehicle, time information corresponding to times at which each of the plurality of available vehicles is available to charge the first vehicle, or cost information corresponding to an amount of payment that each of the plurality of available vehicles will pay for each unit of the electrical energy, wherein the processor is further configured to identify the available vehicle from the plurality of available vehicles based on the stored preferences.\n | 10. The system of claim 1 further comprising a memory configured to store account information including at least one of payment information corresponding to the user of the first vehicle or loyalty points corresponding to the user of the first vehicle, wherein the processor is further configured to receive, via the network access device, quantity information corresponding to a quantity of the electrical energy provided from the available vehicle to the first vehicle, and to collect payment for the quantity of the electrical energy using the account information.\n | 11. The system of claim 1 wherein the available vehicle is an autonomous vehicle and includes at least one of a fuel cell circuit configured to generate the electrical energy via a chemical reaction or a generator configured to convert fuel into the electrical energy such that the available vehicle is capable of providing a greater quantity of electrical energy than is stored in an available vehicle battery of the available vehicle.\n | 12. 
A system for electrical charging of a first vehicle, the system comprising: \na battery having a state of charge (SOC) and configured to receive electrical energy; \na network access device configured to communicate with a remote server; and \nan electronic control unit (ECU) coupled to the battery and the network access device and configured to transmit a charge request to the remote server requesting access to a source of electrical energy for charging the battery, and to receive available vehicle information corresponding to an available vehicle that is available to be used as the source of electrical energy for charging the battery. \n | 13. The system of claim 12 wherein the available vehicle information includes a location of the available vehicle along with digital key information corresponding to the available vehicle and usable to provide access to a charge port of the available vehicle.\n | 14. The system of claim 12 wherein the ECU is further configured to: \nreceive additional available vehicle data corresponding to each of a plurality of available vehicles via the network access device; \nidentify a preferred available vehicle based on at least one of: \na current location of the first vehicle and current locations of each of the plurality of available vehicles; \na state of charge (SOC) of each of the plurality of available vehicles; or \na current route of the first vehicle; and \ncontrol the network access device to transmit an identifier of the preferred available vehicle to the remote server. \n | 15. The system of claim 12 wherein the available vehicle information includes at least one of a location of the available vehicle or navigation instructions from a current location of the first vehicle to the location of the available vehicle, wherein the ECU is further configured to autonomously control the first vehicle to travel to the location of the available vehicle based on the available vehicle information.\n | 16. 
The system of claim 15 further comprising at least one of a wireless charge port configured to receive a wireless electrical energy signal, or a charging contact and a robot arm coupled to the charging contact and configured to actuate the charging contact to make contact with an external charging contact, wherein the ECU is further configured to control the at least one of the wireless charge port or the robot arm to be positioned in a way to receive the electrical energy from the available vehicle.\n | 17. A method for electrical charging of a first vehicle by a second vehicle, the method comprising: \nreceiving, by a network access device and from a first source that includes at least one of the first vehicle or a mobile device associated with a user of the first vehicle, a charge request requesting access to a source of electrical energy for charging the first vehicle; \nidentifying, by a processor, an available vehicle that is available to be used as the source of electrical energy for charging the first vehicle; and \ncontrolling, by the processor, the network access device to transmit available vehicle information corresponding to the available vehicle to the first source in response to receiving the charge request. \n | 18. The method of claim 17 further comprising: \ndetermining, by the processor, digital key information corresponding to the available vehicle; and \ntransmitting, by the processor, the digital key information to the first source to allow access to a charge port of the available vehicle. \n | 19. The method of claim 17 further comprising: \nidentifying, by the processor, a plurality of available vehicles including the available vehicle; \ncontrolling, by the processor, the network access device to transmit additional available vehicle data corresponding to each of the plurality of available vehicles; and \nreceiving, by the processor, a selection of the available vehicle from the first source. \n | 20. 
The method of claim 17 further comprising: \nidentifying, by the processor, a plurality of available vehicles including the available vehicle; and \nselecting, by the processor, the available vehicle from the plurality of available vehicles based on at least one of: \na current location of the first vehicle and current locations of each of the plurality of available vehicles; \na state of charge (SOC) of each of the plurality of available vehicles; or \na current route of the first vehicle.", "output": "The system has a network access device (8) for communicating with a source that includes a first vehicle (10) or a mobile device associated with a user of the first vehicle. A processor (4) is connected to the network access device to receive a charge request (18) from the source through the network access device, identify an available vehicle that is available to be used as the source of energy, and control the network access device to transmit available vehicle information corresponding to the available vehicle to the source in response to receiving the charge request, where the charge request requesting access to the source of electrical energy for charging the first vehicle. An INDEPENDENT CLAIM is included for a method for electrical charging of first vehicle by second vehicle. System for electrical charging of a first vehicle e.g. car, a bus, a motorcycle, a boat, an aircraft, by a second vehicle. The system increases availability of sources of electricity usable to charge electric vehicles in an effective manner. The system allows a user to select the available vehicle to be used as the source of electrical energy for charging the electric vehicle in an efficient manner. 
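The selection criteria in claims 6 and 20 (current locations and state of charge of each candidate vehicle) suggest a simple matching rule. The sketch below is a hedged illustration only: the nearest-eligible-donor policy, the `min_soc` threshold, and all names are assumptions, since the claims list criteria but do not prescribe an algorithm.

```python
# Hypothetical donor-vehicle selection for V2V charging.
import math

def distance(a, b):
    """Planar distance between two (x, y) locations."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def select_available_vehicle(ev_location, candidates, min_soc=0.5):
    """candidates: list of (vehicle_id, location, soc).
    Return the closest candidate whose SOC meets the minimum, else None."""
    eligible = [c for c in candidates if c[2] >= min_soc]
    if not eligible:
        return None
    return min(eligible, key=lambda c: distance(ev_location, c[1]))

candidates = [
    ("van-1", (0.0, 5.0), 0.9),
    ("van-2", (0.0, 1.0), 0.3),   # closest, but SOC too low to donate charge
    ("van-3", (3.0, 3.0), 0.8),
]
print(select_available_vehicle((0.0, 0.0), candidates))  # ('van-3', (3.0, 3.0), 0.8)
```

A production matcher would also weigh the first vehicle's current route and the stored per-vehicle preferences (claim 9), but the same filter-then-rank shape applies.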
The drawing shows a block diagram of a system for electric charging of a first vehicle by a second vehicle.4Processor 6Memory 8Network access device 10First vehicle 18Charge request ", "instruction": "Please summarize the input" }, { "input": "A DEVICE FOR DETECTION AND PREVENTION OF AN ATTACK ON A VEHICLEA new device for the detection and prevention of an attack on a vehicle, via the vehicle's communication channels, includes sensors, network-based services, navigation applications, electronic control units of vehicles, bus-networks of the vehicle, sub of the vehicle. an input-unit configured to collect real-time and/or offline data from various sources, such as systems, and on board diagnostics (OBD); a database for storing data; a detection-unit in communication with the input-unit; and an action-unit in communication with the detection unit, configured to send an alert over the communication channels and/or prevent an attack by stopping or changing the attacked communication channels. The detection-unit is configured to simultaneously monitor the content of the data, the meta-data, and the physical-data, and detect an attack.|1. 
A device for executing instructions for in-vehicle attack detection and prevention, the device comprising at least one hardware processor, the at least one hardware processor comprising:\n code instructions for collecting real-time data from one or more data sources of the vehicle; \n code instructions for analyzing the real-time data to detect at least two irregularities; and code instructions to perform at least one of sending an alert when the at least two irregularities are detected and preventing at least one attack, wherein the at least two irregularities include: \n irregularities between data received from a vehicle sensor and operation maintained by one of a plurality of subsystems of the vehicle; \n irregularity in the relationship between one operation maintained by one of the plurality of subsystems and another operation maintained by another one of the plurality of subsystems; and an irregular cellular provider signal or irregular alteration of a cellular provider signal of the source of the real-time data; \nwherein the code instructions are stored on a non-transitory computer readable medium. \n | 2. The device of claim 1 , further comprising a database comprising parameters of at least one known attack previously detected as the attack, wherein the detection is performed according to the parameters.\n | 3. The device of claim 2, wherein the device further comprises an interface for communicating with a remote server to perform at least one of forwarding parameters of the known attack and receiving parameters of the known attack.\n | 4. The device of claim 1 , wherein the at least one hardware processor is at least partially embedded within a hardware card of the vehicle.\n | 5. 
The device of claim 2, wherein the at least one hardware processor executes a machine learning engine configured to calculate an unfamiliar behavior of the vehicle based on analysis of the real-time data, wherein the parameters are updated or calculated according to the unfamiliar behavior.\n | 6. The device of claim 1, wherein one of the plurality of subsystems comprises:\n i. tire pressure monitoring; \n ii. stability control; \n iii. cruise control; \n iv. airbag control; \n v. PCM (Powertrain Control Module); \n vi. Transmission Control Module (TCM); \n vii. Brake Control Module (BCM); \n viii. Central Control Module (CCM); \n ix. Central Timing Module (CTM); \n x. General Electronic Module (GEM); \n xi. Body Control Module (BCM); \n xii. Suspension Control Module (SCM); \n xiii. Convenience Control Unit (CCU); \n xiv. ECU (Engine Control Unit); \n xv. Electrical Power Steering Control Unit (PSCU); \n xvi. Human Machine Interface (HMI); \n xvii. seat control unit; \n xviii. speed control unit; \n xix. Telephone Control Unit (TCU); \n xx. Transmission Control Unit (TCU); \n xxi. brake control module (ABS or ESC); \n xxii. crash sensors; \n xxiii. airbags; \n xxiv. seat belts; \n xxv. Tire Pressure Monitoring System (TPMS); \n xxvi. Electronic Stability Control (ESC) systems; \n xxvii. TCS (Traction Control System); \n xxviii. anti-lock braking system (ABS); \n xxix. Electronic Brake Assistance (EBA) systems; \n xxx. Electronic braking force distribution unit; \n xxxi. electronic brake force distribution (EBD) systems; \n xxxii. emergency stop; \n xxxiii. driver notifications and alerts; \n xxxiv. Pedestrian object recognition; \n xxxv. Lane keeping assistance; \n xxxvi. Collision avoidance; \n xxxvii. Adaptive headlamps control; \n xxxviii. Reverse backup sensors; \n xxxix. adaptive cruise control; \n xl. Active Cruise Control (ACC); \n xli. traction control systems; \n xlii. electronic stability control; \n xliii. Automated parking system; \n xliv. 
multimedia; \n xlv. Active noise cancellation (ANC); \n xlvi. radio; \n xlvii. Radio Data System (RDS); \n xlviii. driver information functions; \n xlix. AM/FM or satellite radio; \n l. CD/DVD player; \n li. payment systems; \n lii. in-vehicle Wi-Fi router; \n liii. interior lights; \n liv. climate control; \n lv. seat adjustment; \n lvi. electric windows; \n lvii. mirror adjustment unit; \n lviii. central lock; \n lix. battery management unit; \n lx. charge management unit; \n lxi. vehicle-grid systems; \n lxii. active cruise control (ACC); \n lxiii. remote control keys; \n lxiv. Theft deterrent systems; \n lxv. Immobilizer system; \n lxvi. security systems; \n lxvii. digital cameras; \n lxviii. night vision; \n lxix. lasers; \n lxx. Radar; \n lxxi. RF sensors; \n lxxii. infotainment system; \n lxxiii. robotic gear-shaft; and \n lxxiv. any combination thereof. \n | 7. The device of claim 1, wherein the vehicle sensor comprises:\n a. distance sensor; \n b. speed sensor; \n c. temperature sensor; \n d. satellite transmission sensor; \n e. cellular transmission sensors; \n f. video footage; \n g. air-fuel ratio meters; \n h. blind spot monitor; \n i. crankshaft position sensor; \n j. curb sensors used to warn the driver of curbs; \n k. anomaly detectors used to detect axle and signal problems of passing trains on railroads; \n l. an engine coolant temperature (ECT) sensor used to measure engine temperature; \n m. a Hall effect sensor used to time the speed of the wheels and shafts; \n n. a manifold absolute pressure (MAP) sensor used to regulate fuel metering; \n o. a mass flow sensor or mass airflow (MAF) sensor used to indicate the mass of air entering the engine; \n p. an oxygen sensor used to monitor the amount of oxygen in the exhaust pipe of the vehicle; \n q. parking sensors used to warn the driver of invisible obstacles during parking maneuvers; \n r. a speed meter used to detect the speed of other objects; \n s. 
speedometers used to measure the instantaneous speed of land vehicles; \n t. a speed sensor used to detect the speed of an object; \n u. a throttle position sensor used to monitor the position of a throttle in an internal combustion engine; \n v. a tire pressure monitoring sensor used to monitor the air pressure inside the tires; \n w. a torque sensor or torque transducer or torque meter that measures torque (torsion force) on a rotating system; \n x. a transmission fluid temperature sensor used to measure the temperature of the transmission fluid; \n y. a turbine speed sensor (TSS) or input speed sensor (ISS) used to measure the rotational speed of an input shaft or torque converter; \n z. a variable reluctance sensor used to measure the position and velocity of moving metal components; \n aa. a vehicle speed sensor (VSS) used to measure the speed of the vehicle; \n bb. a moisture sensor, including a fuel moisture sensor used to indicate the presence of moisture in the fuel; \n cc. a wheel speed sensor used to read the wheel rotation speed of the vehicle; \n dd. comfort sensors including seat position, seat row, air condition and occupant position; and \n ee. any combination thereof. \n | 8. The device of claim 1, wherein at least one of an irregular source and an irregular destination of the real-time data comprises an electronic control unit (ECU) selected from the group consisting of:\n a. electronic/engine control module (ECM); \n b. powertrain control module (PCM); \n c. transmission control module (TCM); \n d. brake control module (BCM); \n e. Central Control Module (CCM); \n f. Central Timing Module (CTM); \n g. Generic Electronic Module (GEM); \n h. body control module (BCM); \n i. Suspension Control Module (SCM); \n j. airbag control unit (ACU); \n k. a body control module (BCU) for controlling door locks, automatic windows and interior lights; \n l. Convenience Control Unit (CCU); \n m. door control unit (DCU); \n n. engine control unit; \n o. 
an electric power steering control unit (PSCU) integrated into the electric power steering (EPS) power box; \n p. human machine interface (HMI); \n q. powertrain control module (PCM); \n r. seat control unit; \n s. speed control unit (SCU); \n t. telephone control unit (TCU); \n u. telematic control unit (TCU); \n v. transmission control unit (TCU); \n w. brake control module (BCM); \n x. onboard or integrated ECU handling remote services; and \n y. any combination thereof. \n | 9. The device of claim 1, wherein the at least one hardware processor runs at least one of a commercially available antivirus or malware application, a firewall or other malware database, an at least partially autonomous driving system, a remote control system, or a fully autonomous driving system. \n | 10. The device of claim 1, wherein the vehicle is driven by a robotic platform.\n | 11. The device of claim 1, wherein the vehicle moves over land, water, or air.\n | 12. The device of claim 1, further comprising an evaluation engine configured to prioritize the attack by evaluating a risk level of the attack on the vehicle and its occupants.\n | 13. The device of claim 1, wherein the code instructions for performing at least one of sending the alert and preventing at least one attack include code instructions for changing or blocking one or more communication channels connected to one or more network-based services, the one or more network-based services selected from the group consisting of web, physical cable, Wi-Fi, cellular, Bluetooth, RF, GPS, vehicle-to-vehicle communication, vehicle-to-passenger infrastructure, and environment-to-vehicle infrastructure. \n | 14. 
The device of claim 1, wherein at least one of the irregular source and the irregular destination of the real-time data comprises one or more navigation applications or devices selected from the group consisting of a satellite navigator, a cellular navigator, and an inertial-only navigator.\n | 15. The device of claim 1, wherein the one or more data sources are selected from the group consisting of:\n one or more sensors; one or more network-based services; one or more navigation applications or navigation devices; one or more electronic control units (ECUs) of the vehicle; one or more bus networks of the vehicle; one or more subsystems of the vehicle; and one or more onboard diagnostics (OBD). \n | 16. A computer-implemented method by at least one hardware processor for in-vehicle attack detection and prevention, the method comprising:\n collecting real-time data from one or more data sources of the vehicle; \n analyzing, by the at least one hardware processor, the real-time data to detect at least two irregularities; and \n performing at least one of sending an alert and preventing at least one attack when the at least two irregularities are detected, wherein the at least two irregularities are selected from the group consisting of: \n irregularities between current vehicle operation and operation maintained by one of a plurality of subsystems of the vehicle; \n irregularity in the relationship between one operation maintained by one of the plurality of subsystems and another operation maintained by another one of the plurality of subsystems; \n irregularity of the relationship between the current output of at least one sensor of the vehicle and the real-time data; and \n an irregular cellular provider signal or irregular change in a cellular provider signal of a source of instructions found in the real-time data. \n | 17. 
A computer readable medium recording a computer program for in-vehicle attack detection and prevention, the computer program comprising program instructions executable by a hardware processor of the vehicle to:\n collect real-time data from one or more data sources of the vehicle; \n analyze the real-time data to detect at least two irregularities; and \n perform at least one of sending an alert and preventing at least one attack when the at least two irregularities are detected; \nwherein the at least two irregularities are selected from the group consisting of: \n irregularities between current vehicle operation and operation maintained by one of a plurality of subsystems of the vehicle; \n irregularity in the relationship between one operation maintained by one of the plurality of subsystems and another operation maintained by another one of the plurality of subsystems; \n irregularity of the relationship between the current output of at least one sensor of the vehicle and the real-time data; and \n an irregular cellular provider signal or irregular change in a cellular provider signal of a source of instructions found in the real-time data.", "output": "The device has an input-unit (110) collecting real-time and/or offline data from a source. A database (120) stores the data. A detection-unit (130) is provided in communication with the input-unit. An action-unit (140) is provided in communication with the detection unit, and sends an alert through communication channels and/or prevents an attack on a vehicle by blocking or changing the attacked communication channels. The detection-unit simultaneously monitors content, meta-data and physical-data of the data, and detects the attack. 
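The "at least two irregularities" rule that runs through claims 1, 16, and 17 can be illustrated with toy cross-checks. Everything below is invented for illustration (the thresholds, the check names, the specific sensor/subsystem pairs); the claims only require that an alert or prevention action fire when two or more irregularities are detected.

```python
# Hypothetical sketch of the two-or-more-irregularities attack rule.
def check_irregularities(sensor_speed, cruise_target, brake_engaged,
                         cell_signal_db, prev_cell_signal_db):
    """Return the list of irregularities found across sensors and subsystems."""
    irregular = []
    if brake_engaged and sensor_speed > cruise_target:   # sensor vs subsystem state
        irregular.append("speed-vs-brake")
    if cruise_target > 0 and brake_engaged:              # subsystem vs subsystem
        irregular.append("cruise-vs-brake")
    if abs(cell_signal_db - prev_cell_signal_db) > 30:   # irregular provider signal
        irregular.append("cellular-signal-jump")
    return irregular

def detect_attack(irregularities):
    """Alert/prevent only when at least two independent checks fail."""
    return len(irregularities) >= 2

irr = check_irregularities(80, 60, True, -50, -95)
print(irr, detect_attack(irr))  # three checks fire -> attack detected
```

Requiring two independent irregularities, rather than one, is what lets a detector of this kind cut false positives from a single faulty sensor while still catching coordinated spoofing.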
The source is selected from a group consisting of a vehicle's sensor (111), a network based service (112), a navigation application (113) or navigation device, a vehicle electronic control unit (ECU) (114), a vehicle's bus-network (115), a vehicle's subsystem (116), and a vehicle's on board diagnostic (OBD) (117). The detection-unit detects the attack based on a characteristic that is selected from a group consisting of the data content, irregular source of the data, irregular destination for the data, irrational data content when compared with data received by the source, irrational action of the subsystem when compared with data received by another subsystem, irrational action between the subsystems, irrational action of one of the subsystems when compared with the data received by one of sensors, irrational meta-data, irrational meta-content, jam or blockage of communication channels and/or the network-based services, and sudden change in signal features of the network-based services and/or sensors. The sensor is a distance sensor, velocity sensor, temperature sensor, satellite transmission sensor, cellular transmission sensor, video image, air-fuel ratio meter, blind spot monitor, crankshaft position sensor, curb feeler, defect detector used on railroads to detect axle and signal problems in passing trains, engine coolant temperature (ECT) sensor, hall effect sensor, manifold absolute pressure (MAP) sensor, mass flow sensor or mass airflow (MAF) sensor, oxygen sensor, parking sensors, radar gun, speedometer, speed sensor, throttle position sensor, tire-pressure monitoring sensor, torque sensor or torque transducer or torque-meter, transmission fluid temperature sensor, turbine speed sensor (TSS), or input speed sensor (ISS). 
The subsystem is selected from a group consisting of tire pressure monitoring, stability control, cruise control, airbag control, powertrain control module (PCM), transmission control module (TCM), brake control module (BCM), central control module (CCM), central timing module (CTM), and general electronic module (GEM). The network based services are selected from a group consisting of web, physical cable, Wi-Fi, cellular, Bluetooth, RF, GPS, vehicle to vehicle communication, vehicle to passenger infrastructure, environment to vehicle infrastructure. The navigation applications or devices are selected from a group consisting of satellite navigator, cellular navigator and dedicated inertial navigator. An INDEPENDENT CLAIM is also included for a method for detecting and preventing an attack on a vehicle through communication channels. Device for detection and prevention of an attack e.g. cyber attack and communication attack, on a vehicle i.e. robotic platform (claimed), through communication channels. The device utilizes a cellular device for virus detection followed by a sequence of irrational commands to the vehicle's sub-systems and identification of irrational communication characteristics followed by a sharp turn of the steering wheel while the navigation system recognizes an untracked path, a dangerous path or even a collision. The drawing shows a block diagram of a device for detection and prevention of an attack on a vehicle and basic components. 110Input-unit111Vehicle's sensor112Network based service113Navigation application114Vehicle ECU115Vehicle's bus-network116Vehicle's subsystem117Vehicle's OBD120Database130Detection-unit140Action-unit
A remote server in communication with the EV or an on-board computer or mobile device in the EV may obtain data to determine a location for the EV to meet a charging vehicle. The charging vehicle may be dispatched to meet the EV and deliver power to it, enabling the EV to reach a charging station or other destination. In some examples, the charging vehicle may deliver power to the EV while both vehicles are stationary. In other examples, the charging vehicle may couple to the EV while both vehicles are in motion.What is claimed is: \n | 1. A computer-implemented method for charging an electric vehicle (EV) while the EV is moving, comprising: \nobtaining, by one or more processors, a request to deliver electrical power to the EV; \nidentifying, by the one or more processors, at least one autonomous charging vehicle (ACV) to charge the EV, wherein the ACV is configured to charge the EV while the EV and the ACV are each moving; \nobtaining, by the one or more processors, EV location data and ACV location data; \ndetermining, by the one or more processors, a coupling location based upon the EV location data and the ACV location data; \ndispatching, by the one or more processors, the ACV to the coupling location; and \ncausing, by the one or more processors, the ACV to deliver the electrical power to the EV for a threshold charging period, wherein the ACV and the EV are each moving during at least a portion of the threshold charging period. \n | 2. 
The computer-implemented method of claim 1, further comprising: \nobtaining, by the one or more processors, battery data of the EV, wherein the battery data includes at least a charge level of the EV; \ndetermining, by the one or more processors, a minimum charge level required for the EV to travel to a target location based upon the EV location data and the battery data, wherein the target location is a charging station or a destination; \ndetermining, by the one or more processors, that the charge level of the EV is less than the minimum charge level; and \ndispatching, by the one or more processors, the ACV in response to determining that the charge level of the EV is less than the minimum charge level. \n | 3. The computer-implemented method of claim 2, further comprising: \ncausing, by the one or more processors, a prompt to be displayed to a vehicle occupant requesting input from the vehicle occupant regarding whether to dispatch the ACV. \n | 4. The computer-implemented method of claim 2, wherein: \nthe ACV is dispatched automatically. \n | 5. The computer-implemented method of claim 1, wherein: \nthe ACV delivers the electrical power to the EV wirelessly. \n | 6. The computer-implemented method of claim 5, wherein: \nthe ACV is configured to deliver the electrical power to the EV using a magnet system. \n | 7. The computer-implemented method of claim 1, wherein the one or more processors are included in the ACV and the EV location data is obtained by the one or more processors using vehicle-to-vehicle communication between the EV and the ACV.\n | 8. The computer-implemented method of claim 1, wherein the ACV is a drone.\n | 9. The computer-implemented method of claim 1, wherein: \nthe ACV is a tow truck; and \nthe ACV delivers electrical power to the EV while towing the EV. \n | 10. 
A computer system for charging an electric vehicle (EV), comprising: \none or more processors; \na non-transitory computer-readable memory coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the one or more processors to: \nobtain a request to deliver electrical power to the EV; \nidentify at least one autonomous charging vehicle (ACV) to charge the EV, wherein the ACV is configured to charge the EV while the EV and the ACV are each moving; \nobtain EV location data and ACV location data; \ndetermine a coupling location based upon the EV location data and the ACV location data; \ndispatch the ACV to the coupling location; and \ncause the ACV to deliver the electrical power to the EV for a threshold charging period, wherein the ACV and the EV are each moving during at least a portion of the threshold charging period. \n | 11. The computer system of claim 10, wherein the executable instructions further cause the one or more processors to: \nobtain battery data of the EV, wherein the battery data includes at least a charge level of the EV; \ndetermine a minimum charge level required for the EV to travel to a target location based upon the EV location data and the battery data, wherein the target location is a charging station or a destination; \ndetermine that the charge level of the EV is less than the minimum charge level; and \ndispatch the ACV in response to determining that the charge level of the EV is less than the minimum charge level. \n | 12. The computer system of claim 10, wherein the executable instructions further cause the one or more processors to: \ncause a prompt to be displayed to a vehicle occupant requesting input from the vehicle occupant regarding whether to dispatch the ACV. \n | 13. The computer system of claim 10, wherein: \nthe ACV is dispatched automatically. \n | 14. 
The computer system of claim 10, wherein: \nthe ACV delivers the electrical power to the EV wirelessly. \n | 15. The computer system of claim 10, wherein: \nthe ACV is configured to deliver the electrical power to the EV using a magnet system. \n | 16. A tangible, non-transitory computer-readable medium storing executable instructions for charging an electric vehicle (EV) that, when executed by one or more processors, cause the one or more processors to: \nobtain a request to deliver electrical power to the EV; \nidentify at least one autonomous charging vehicle (ACV) to charge the EV, wherein the ACV is configured to charge the EV while the EV and the ACV are each moving; \nobtain EV location data and ACV location data; \ndetermine a coupling location based upon the EV location data and the ACV location data; \ndispatch the ACV to the coupling location; and \ncause the ACV to deliver the electrical power to the EV for a threshold charging period, wherein the ACV and the EV are each moving during at least a portion of the threshold charging period. \n | 17. The tangible, non-transitory computer-readable medium of claim 16, wherein the executable instructions further cause the one or more processors to: \nobtain battery data of the EV, wherein the battery data includes at least a charge level of the EV; \ndetermine a minimum charge level required for the EV to travel to a target location based upon the EV location data and the battery data, wherein the target location is a charging station or a destination; \ndetermine that the charge level of the EV is less than the minimum charge level; and \ndispatch the ACV in response to determining that the charge level of the EV is less than the minimum charge level. \n | 18. 
The tangible, non-transitory computer-readable medium of claim 16, wherein the one or more processors are included in the ACV and the EV location data is obtained by the one or more processors using vehicle-to-vehicle communication between the EV and the ACV.\n | 19. The tangible, non-transitory computer-readable medium of claim 16, wherein the ACV is a drone.\n | 20. The tangible, non-transitory computer-readable medium of claim 16, wherein: \nthe ACV is a tow truck; and \nthe ACV delivers electrical power to the EV while towing the EV.", "output": "The method (800) involves obtaining a request to deliver electrical power to an electric vehicle (EV) by processors (802). An autonomous charging vehicle (ACV) is identified (804) by the processors to charge the EV, where the ACV is configured to charge the EV while the EV and the ACV are each moving. EV location data and ACV location data are obtained (806) by the processors. Coupling location is determined (808) by using the processors based upon the EV location data and the ACV location data. The ACV is dispatched (810) to the coupling location by the processors. The ACV is caused (812) to deliver electrical power to the EV for threshold charging period by the processors, where the ACV and the EV are each moving during a portion of the threshold charging period. INDEPENDENT CLAIMS are included for: (1) a computer system for charging an electric vehicle EV; (2) a tangible non-transitory computer-readable medium comprising a set of instructions for charging EV. Computer-implemented method for charging an electric vehicle (EV) i.e. autonomous or semi-autonomous vehicle such as solar electric vehicle, during movement by using an ACV e.g. drone and tow lorry. The method enables charging the EV without the use of stationary charging stations in an efficient manner. 
The method enables the ACV to determine the appropriate charge needed for the EV to travel to a charging station along the EV route, i.e. the minimum amount of power necessary for the EV to reach the nearest charging station. The drawing shows a flow diagram illustrating a computer-implemented method for charging an EV.800Computer-implemented method for charging an EV. 802Step for obtaining a request to deliver electrical power to an EV by processors. 804Step for identifying ACV by the processors to charge the EV, where the ACV is configured to charge the EV while the EV and the ACV are each moving. 806Step for obtaining EV location data and ACV location data by the processors. 808Step for determining coupling location by using the processors based upon the EV location data and the ACV location data. 810Step for dispatching ACV to the coupling location by the processors. 812Step for causing ACV to deliver electrical power to the EV for threshold charging period by the processors. ", "instruction": "Please summarize the input" }, { "input": "Precautionary planning for a minimal risk maneuverThe application relates to precautionary planning of a minimal risk maneuver. The invention relates to a method executed by an MRM planning system (1) for supporting the planning of a minimal risk maneuver (MRM) of an automated driving system (ADS) (21) of a vehicle (2). While the dynamic driving task (DDT) of the vehicle is executed in an autonomous driving mode of the ADS, the MRM planning system determines (1001) the remaining distance (3) to an upcoming operational design domain (ODD) exit (4) at which the ODD defined for the autonomous driving mode is to end. When the remaining distance is shorter than a predeterminable distance, the MRM planning system further assesses (1002) data (5) associated with a stretch of road (6) leading up to the ODD exit.|1. 
A method executed by an MRM planning system (1) for supporting the planning of a minimal risk maneuver MRM of an automated driving system ADS (21) of a vehicle (2), the method comprising: while the dynamic driving task DDT of the vehicle (2) is executed in an autonomous driving mode of the ADS (21), determining (1001) the remaining distance (3) to an upcoming operational design domain ODD exit (4) at which the ODD defined for the autonomous driving mode is to end; when the remaining distance (3) is shorter than a predeterminable distance, assessing (1002) data (5) associated with a stretch of road (6) leading up to the ODD exit (4), the data (5) indicating potential temporary and/or emergency stopping areas, shoulder lanes and/or margin space of one or more lanes along the stretch of road (6), and/or potential occupancy and/or obstructions thereof; identifying (1003), based on the assessment of the data (5), an advantageous area (60) along the stretch of road (6) for potentially bringing the vehicle (2) to a stop deemed safe should a potential MRM be triggered; and determining (1004) a time (8) for initiating an autonomous-driving-mode DDT termination procedure, for example for providing a handover request prompting an occupant of the vehicle (2) to take over the DDT at the time (8), the time (8) being calculated to occur a predeterminable duration and/or distance before the vehicle (2) reaches the identified advantageous area (60).\n | 2. The method according to claim 1, wherein identifying (1003) the advantageous area (60) comprises selecting the advantageous area (60) based on a selection criterion that balances the level of safety associated with stopping in a given area against that area's proximity to the ODD exit (4).\n | 3. 
The method according to claim 1 or 2, wherein the assessing (1002) comprises: assessing data (5) comprising map-based data (51) derived from a digital map (22), such as a high-definition HD map, covering the stretch of road (6); and/or assessing data (5) comprising sensor-based data (52) derived from at least a first surroundings-detecting sensor capable of capturing surroundings covering at least a portion of the stretch of road (6).\n | 4. The method according to claim 3, wherein assessing data (5) comprising sensor-based data (52) comprises gathering at least a portion of the sensor-based data (52) from one or more sensors external to the vehicle (2) through at least a first inter-vehicle communication service.\n | 5. The method according to any one of claims 1 to 4, further comprising: communicating (1006) data (600) indicating the identified advantageous area (60) to the ADS (21), for example to a trajectory planner and/or a vehicle motion control device of the ADS (21).\n | 6. The method according to any one of claims 1 to 5, wherein determining (1004) the time for initiating the autonomous-driving-mode DDT termination procedure comprises determining a time for triggering the MRM.\n | 7. 
An MRM planning system (1) for planning a minimal risk maneuver (MRM) of an automated driving system (ADS) (21) of a vehicle (2), the MRM planning system (1) comprising: a remaining-distance determining unit (101) for, while the dynamic driving task (DDT) of the vehicle (2) is executed in an autonomous driving mode of the ADS (21), determining (1001) the remaining distance (3) to an upcoming operational design domain (ODD) exit (4) at which the ODD defined for the autonomous driving mode is to end; a data assessing unit (102) for, when the remaining distance (3) is shorter than a predeterminable distance, assessing (1002) data (5) associated with a stretch of road (6) leading up to the ODD exit (4), the data (5) indicating potential temporary and/or emergency stopping areas, shoulder lanes and/or margin space of one or more lanes along the stretch of road (6), and/or potential occupancy and/or obstructions thereof; an advantageous-area identifying unit (103) for identifying (1003), based on the assessment of the data (5), an advantageous area (60) along the stretch of road (6) for potentially bringing the vehicle (2) to a stop deemed safe should a potential MRM be triggered; and a time determining unit (104) for determining (1004) a time (8) for initiating an autonomous-driving-mode DDT termination procedure, for example for providing a handover request prompting an occupant of the vehicle (2) to take over the DDT at the time (8), the time (8) being calculated to occur a predeterminable duration and/or distance before the vehicle (2) reaches the identified advantageous area (60).\n | 8. The MRM planning system (1) according to claim 7, wherein the advantageous-area identifying unit (103) is adapted to select the advantageous area (60) based on a selection criterion. 
The selection criterion balances the level of safety associated with stopping in a given area against that area's proximity to the ODD exit (4).\n | 9. The MRM planning system (1) according to claim 7 or 8, wherein the data assessing unit (102) is adapted to: assess data (5) comprising map-based data (51) derived from a digital map (22), such as a high-definition (HD) map, covering the stretch of road (6); and/or assess data (5) comprising sensor-based data (52) derived from at least a first surroundings-detecting sensor capable of capturing surroundings covering at least a portion of the stretch of road (6).\n | 10. The MRM planning system (1) according to claim 9, wherein the data assessing unit (102) is adapted to gather at least a portion of the sensor-based data (52) from one or more sensors external to the vehicle (2) through at least a first vehicle-to-vehicle communication service.\n | 11. The MRM planning system (1) according to any one of claims 7 to 10, further comprising: a data communicating unit (106) for communicating (1006) data (600) indicating the identified advantageous area (60) to the ADS (21), for example to a trajectory planner and/or a vehicle motion control device of the ADS (21).\n | 12. The MRM planning system (1) according to any one of claims 7 to 11, wherein the time determining unit (104) is adapted to determine a time for triggering the MRM.\n | 13. A vehicle (2) comprising the MRM planning system (1) according to any one of claims 7 to 12.\n | 14. A computer program product comprising computer program code means arranged to cause a computer or a processor to perform the steps of the method according to any one of claims 1 to 6, the computer program product being stored on a computer-readable medium or carrier.\n | 15. 
A non-volatile computer-readable storage medium having stored thereon the computer program product according to claim 14.", "output": "The method involves determining the remaining distance to an upcoming operational design domain (ODD) exit while the dynamic driving task (DDT) of a vehicle (2) is being executed. Data associated with a stretch of road leading up to the ODD exit is assessed when the remaining distance is shorter than a pre-determinable distance. Based on the assessment of the data, a favorable area (60) is identified along the stretch of road for potentially bringing the vehicle (2) to a stop deemed safe. A handover request is provided to prompt an occupant of the vehicle to take over the DDT. The gathering of sensor-based data is performed with support from a first inter-vehicle communication service. INDEPENDENT CLAIMS are included for: (1) a system for supporting planning of a minimal risk maneuver of an automated driving system of a vehicle; (2) a computer-readable storage medium storing a set of instructions for performing a method for supporting planning of a minimal risk maneuver of an automated driving system of a vehicle. Method for supporting planning of a minimal risk maneuver (MRM) of an automated driving system (ADS) of a vehicle. The method provides precautionary actions so that, upon potential activation of the MRM, the vehicle has a potentially lower-risk maneuver to choose, enabling improved MRM planning. The drawing shows a schematic diagram of a method for supporting planning of a minimal risk maneuver of an automated driving system of a vehicle. 1MRM planning system 2Vehicle 6Road segments 60Favorable area 61Emergency parking area ", "instruction": "Please summarize the input" }, { "input": "Test field for the vehicle-avoidance capability of an autonomous driving vehicleThe utility model relates to the technical field of autonomous driving vehicle performance testing, and in particular to a test field for testing an autonomous driving vehicle's ability to avoid approaching vehicles. 
The test field comprises a test road, a first reference vehicle and a test operation management centre. The test operation management centre controls the first reference vehicle to travel ahead of or behind the vehicle under test while keeping a distance from it, to travel in parallel with the vehicle under test, or to travel in the opposite direction in a lane adjacent to the vehicle under test, and then controls the first reference vehicle to change its speed and/or direction of travel so as to approach the vehicle under test. Compared with the theoretical data of computer-simulated vehicle-operation software, this test field is closer to the actual driving environment, so the test results more accurately reflect the vehicle's ability to avoid an approaching vehicle; it is safer than testing on actual roads; and since it can serve as a standardized test scenario for different autonomous driving vehicles, the test results are more authoritative and reliable.|1. A test field for testing the vehicle-avoidance capability of an autonomous driving vehicle, comprising: a road (1); a first reference vehicle (2) travelling on the road (1), the first reference vehicle (2) being an unmanned vehicle; and a test operation management centre (5) that controls the first reference vehicle (2) to travel ahead of or behind the vehicle under test (3) while keeping a distance from it, or to travel in parallel with the vehicle under test (3), or to travel in the opposite direction in a lane adjacent to the vehicle under test (3), and then controls the first reference vehicle (2) to change its speed and/or direction of travel so as to approach the vehicle under test (3); wherein the test operation management centre (5) and the first reference vehicle (2) each comprise a wireless communication device with a V2N communication protocol, the test operation management centre (5) and the first reference vehicle (2) being communicatively connected with each other through their respective V2N communication protocols so that the test operation management centre (5) controls the running of the first reference vehicle (2).\n | 2. The test field according to claim 1, wherein the wireless communication device of the first reference vehicle (2) further has a V2V communication protocol for communicatively connecting to the vehicle under test (3) and for sending early-warning information to the vehicle under test (3) when the first reference vehicle (2) changes its own speed and/or direction of travel to approach the vehicle under test (3).\n | 3. The test field according to claim 2, further comprising a road-side device (4) arranged on or beside the road (1), wherein the wireless communication device of the first reference vehicle (2) further has a V2I communication protocol for sending early-warning information to the road-side device (4) when the first reference vehicle (2) changes its own speed and/or direction of travel to approach the vehicle under test (3), and the road-side device (4) comprises a wireless communication device with a V2I communication protocol for receiving the early-warning information sent by the first reference vehicle (2) and the early-warning information sent by the vehicle under test (3).\n | 4. 
The test field according to claim 3, wherein the road (1) comprises an intersection (11) and a ramp (12) connected to the intersection (11), and the road-side unit (4) is arranged at the intersection (11).\n | 5. The test field according to claim 1, further comprising auxiliary road-safety facilities (7) arranged on or beside the road (1).\n | 6. The test field according to claim 1, further comprising a second reference vehicle (6) travelling on the road (1), the second reference vehicle (6) being an unmanned vehicle; the second reference vehicle (6) comprises a wireless communication device with a V2N communication protocol, the test operation management centre (5) and the second reference vehicle (6) being communicatively connected with each other through their respective V2N communication protocols so that the test operation management centre (5) controls the second reference vehicle (6) to travel on the road (1) around the vehicle under test (3).\n | 7. The test field according to claim 6, wherein the wireless communication device of the second reference vehicle (6) further has a V2V communication protocol for receiving the early-warning information sent by the vehicle under test (3).\n | 8. The test field according to claim 7, further comprising a collection module interconnected with the second reference vehicle (6) to collect the light information of the second reference vehicle (6), the collection module also being connectable with the vehicle under test (3) to collect the information received and sent by the vehicle under test (3) and its path-planning information.\n | 9. 
The test field according to claim 1, wherein the road (1) comprises a unidirectional lane or bidirectional lanes without an intermediate median.", "output": "The utility model relates to the technical field of autonomous driving vehicle performance testing, and in particular to a test field for testing an autonomous driving vehicle's ability to avoid approaching vehicles. The test field comprises a test road, a first reference vehicle and a test operation management centre. The test operation management centre controls the first reference vehicle to travel ahead of or behind the vehicle under test while keeping a distance from it, to travel in parallel with the vehicle under test, or to travel in the opposite direction in a lane adjacent to the vehicle under test, and then controls the first reference vehicle to change its speed and/or direction of travel so as to approach the vehicle under test. Compared with the theoretical data of computer-simulated vehicle-operation software, this test field is closer to the actual driving environment, so the test results more accurately reflect the vehicle's ability to avoid an approaching vehicle; it is safer than testing on actual roads; and since it can serve as a standardized test scenario for different autonomous driving vehicles, the test results are more authoritative and reliable.", "instruction": "Please summarize the input" }, { "input": "Test field for an autonomous driving vehicle's ability to pass through successive intersectionsThe utility model relates to the technical field of autonomous driving vehicle performance testing, and in particular to a test field for testing an autonomous driving vehicle's ability to pass through successive intersections. 
The test field comprises a road with at least two spaced intersections. Each intersection is provided with a signal-lamp device and a road-side device capable of detecting the road condition and forming first traffic information; each road-side device is communicatively connected with the corresponding signal-lamp device to obtain signal-lamp information, and each road-side device comprises a wireless communication device for sending the first traffic information and the signal-lamp information to the vehicle under test. Compared with the theoretical data of computer-simulated vehicle-operation software, the test field is closer to the actual driving environment, so the test results more accurately show the vehicle under test's ability to pass through successive intersections; it is safer than testing on actual roads; and since it can serve as a standardized scenario for testing different autonomous driving vehicles, the test results are more authoritative and reliable.|1. A test field for testing an autonomous driving vehicle's ability to pass through successive intersections, wherein the test field comprises a road (1), the road (1) comprising at least two intersections (2) arranged at intervals; each intersection (2) is correspondingly equipped with a signal-lamp device and a road-side device (5) capable of detecting the road condition and forming first traffic information; each road-side device (5) is communicatively connected with the corresponding signal-lamp device to obtain signal-lamp information, and each road-side device (5) comprises a wireless communication device with a V2I communication protocol for sending the first traffic information and the signal-lamp information to the vehicle under test (3).\n | 2. 
The test field according to claim 1, further comprising a test operation management centre (7) having a wireless communication device with a V2N communication protocol, the test operation management centre (7) sending, before the vehicle under test (3) passes through each intersection (2), second traffic information corresponding to that intersection to the vehicle under test (3).\n | 3. The test field according to claim 2, wherein each road-side device (5) is connected with the test operation management centre (7) to transmit the signal-lamp information; the wireless communication device of the test operation management centre (7) can receive driving-route information sent by the vehicle under test (3), and the test operation management centre (7) further comprises an analysis module for analysing and processing the signal-lamp information and the second traffic information corresponding to each intersection (2) together with the driving-route information so as to form a first suggested travelling speed at which the vehicle under test can pass through each intersection (2); the test operation management centre (7) is also capable of sending the first suggested travelling speed to the vehicle under test (3) through the wireless communication device, the wireless communication device of the test operation management centre (7) being connected with the analysis module.\n | 4. The test field according to claim 2, further comprising a reference vehicle (8) travelling on the road (1) around the vehicle under test (3).\n | 5. 
The test field according to claim 4, wherein the reference vehicle (8) is an automatic driving vehicle and comprises a wireless communication device with a V2N communication protocol; the wireless communication devices of the operation test management centre (7) and the reference vehicle (8) are communicatively connected through their respective V2N communication protocols, so that the operation test management centre (7) controls the running of the reference vehicle (8).\n | 6. The test field according to claim 5, wherein the reference vehicle (8) is an unmanned vehicle.\n | 7. The test field according to claim 1, wherein at least three intersections (2) are provided, and the distance between two adjacent intersections (2) is greater than or equal to 150 m.\n | 8. The test field according to claim 1, wherein the signal lamp device comprises a signal lamp and a controller (6) controlling the signal lamp, and the road side device is communicatively connected with the controller (6).\n | 9. The test field according to claim 1, further comprising a collection module communicatively connected with the vehicle under test (3) so as to obtain information from the vehicle under test (3).", "output": "This utility model relates to the technical field of automatic-driving-vehicle performance testing, and in particular to a test field for testing the ability of an automatic driving vehicle to pass continuously through intersections.
test field comprises road, at least two corresponding intersection, at each intersection are provided with a signal lamp device and capable of detecting the road condition and form a first traffic information of the road side device, each road side device and the corresponding signal lamp device communication connection to obtain the signal lamp information, each road side device comprises a to-be-tested vehicle sends wireless communication device of the first traffic information and signal lamp information. the test field compared to a computer simulated vehicles operating software the theoretical data more close to the actual running environment, the test result can more accurately show the tested vehicle continuous passing ability of the intersection, compared with the actual road test is more safe, the test field can be used as standardized scene for testing of different automatic driving vehicle, the test result is more authoritative and reliable.", "instruction": "Please summarize the input" }, { "input": "automatic driving vehicle lane ability test fieldThe utility model claims an automatic driving vehicle performance test technical field, especially claims an automatic driving vehicle lane-changing capability of the test field. test field comprising a first reference vehicle with road of the first lane and the second lane, the first lane driving, control first reference vehicle in front of the vehicle to be measured running management centre operation test, wireless communication device running test management centre and first reference vehicle through respective V2N communication protocol with other communication connection to control first reference vehicle running, and running test management centre can send out the changing command or task command to the tested vehicle through the wireless communication device. 
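Claim 3 above describes an analysis module that turns per-intersection signal-lamp and traffic information into a first suggested running speed. A minimal sketch of one way such an advisory speed could be computed, assuming a fixed-cycle signal model; the function and parameter names are illustrative assumptions, not taken from the claims:

```python
# Sketch: pick an advisory speed so the vehicle arrives at the next
# intersection during a green window (fixed-cycle signal model assumed).
def advisory_speed(dist_m, green_start_s, green_end_s, cycle_s,
                   v_min=5.0, v_max=16.7):
    """Return a speed (m/s) within [v_min, v_max] that puts the arrival
    time inside a green window, or None if no feasible speed exists."""
    for k in range(10):  # search the next few signal cycles
        lo = green_start_s + k * cycle_s   # green window opens
        hi = green_end_s + k * cycle_s     # green window closes
        # Speeds that arrive inside [lo, hi]; guard against lo == 0.
        fastest = dist_m / lo if lo > 0 else float("inf")
        slowest = dist_m / hi
        v_hi = min(fastest, v_max)
        v_lo = max(slowest, v_min)
        if v_lo <= v_hi:
            return v_hi  # prefer the fastest feasible speed
    return None
```

For example, 200 m from an intersection whose green phase spans seconds 10–30 of a 60 s cycle, the sketch returns the speed cap of 16.7 m/s, since arriving at about 12 s falls inside the green window.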
The test field is closer to the actual running environment, so the test result more accurately reflects the vehicle's lane-changing ability; compared with an actual road test it is safer; and it is suitable for different automatic driving vehicles, giving a standardized evaluation of the lane-changing capability of the vehicle under test, so the test result is more authoritative and reliable.|1. A test field for testing the lane-changing capability of an automatic driving vehicle, wherein it comprises a road (1) and a first reference vehicle (2); the road (1) comprises an adjacent first lane (11) and second lane (12); the first reference vehicle (2) runs on the first lane (11) and is a non-human-driven vehicle comprising a wireless communication device with a V2N communication protocol; an operation test management centre (4) controls the first reference vehicle (2) to run in front of the vehicle under test (3); the operation test management centre (4) comprises a wireless communication device with a V2N communication protocol, and the wireless communication devices of the operation test management centre (4) and the first reference vehicle (2) are communicatively connected through their respective V2N communication protocols, so that the operation test management centre (4) controls the running of the first reference vehicle (2) and can send a lane-change command or a running-task command to the vehicle under test (3) through the wireless communication device; wherein, when a running-task command is sent, the running-task command comprises the distance from the vehicle under test (3) to a destination and either a required arrival time or a required travel time, and the operation test management centre (4) controls the first reference vehicle (2) to run at a speed smaller than the quotient of that distance and the difference between the required arrival time and the current time, or smaller than the quotient of that distance and the required travel time.\n | 2.
The test field for testing lane-changing capability according to claim 1, further comprising a second reference vehicle (5); the second reference vehicle (5) is a non-human-driven vehicle comprising a wireless communication device with a V2N communication protocol; the wireless communication devices of the operation test management centre (4) and the second reference vehicle (5) are communicatively connected through their respective V2N communication protocols, so that the operation test management centre (4) controls the second reference vehicle (5) to run on the second lane (12) in the same direction as both the vehicle under test (3) and the first reference vehicle (2), while keeping a distance from the vehicle under test (3) that allows a lane change.\n | 3. The test field for testing lane-changing capability according to claim 1, further comprising a third reference vehicle (6); the third reference vehicle (6) is a non-human-driven vehicle comprising a wireless communication device with a V2N communication protocol; the wireless communication devices of the operation test management centre (4) and the third reference vehicle (6) are communicatively connected through their respective V2N communication protocols, so that the operation test management centre (4) controls the third reference vehicle (6) to run on the second lane (12) facing the first reference vehicle (2).\n | 4. The test field for testing lane-changing capability according to claim 1, further comprising a road side device (7); the road side device (7) is set on or beside the road (1) and comprises a wireless communication device with a V2I communication protocol that sends road condition information to the vehicle under test (3) before the vehicle under test (3) changes lanes.\n | 5.
The test field for testing lane-changing capability according to claim 4, wherein the road side device (7) further comprises a detector that automatically detects the road condition and a processing module that receives the road-condition-related information from the detector and analyzes and processes it to form traffic information; the detector is connected with the processing module, and the processing module is communicatively connected with the wireless communication device with the V2I communication protocol.\n | 6. The test field for testing lane-changing capability according to claim 1, wherein the operation test management centre (4) can also send road condition information to the vehicle under test (3) through the wireless communication device before the vehicle under test (3) changes lanes.\n | 7. The test field for testing lane-changing capability according to claim 1, wherein the wireless communication device of the first reference vehicle (2) has a V2V communication protocol, and the first reference vehicle (2) can receive, through the wireless communication device, a lane-change request sent by the vehicle under test (3) and reply with consent information after receiving the request.\n | 8. The test field for testing lane-changing capability according to claim 2, wherein the wireless communication device of the second reference vehicle (5) has a V2V communication protocol, and the second reference vehicle (5) can receive, through the wireless communication device, a lane-change request sent by the vehicle under test (3) and reply with consent information after receiving the request.\n | 9.
The automatic driving vehicle according to claim 3, the said changing capability of the test field, wherein the wireless communication device of the third reference vehicle (6) has a V2V communication protocol, the third reference vehicle (6) can through the wireless communication equipment receives the tested vehicle (3) sends changing request and receiving the changing request and reply the consent information.\n | 10. The automatic driving vehicle according to claim 1, the said changing capability of the test field, wherein it further comprises a vehicle to be tested (3) interconnected by collecting collecting module for receiving and sending information, operation information and route planning information of the vehicle (3).", "output": "The utility model claims an automatic driving vehicle performance testing technology field, especially relates to a testing field of automatic driving vehicle lane changing capability. testing field comprising a first reference vehicle having a first lane and a second lane of the road, running on the first lane, control the first reference vehicle in front of the vehicle to be measured running operation testing management centre; Wireless communication device of operation testing management centre and the first reference vehicle has V2N communication protocol through the respective communication connection with one another to control the first reference vehicle running and operation testing management centre can send command or running the task command to the vehicle through the wireless communication device. 
the testing field is more close to the actual running environment, testing result more accurately vehicle lane changing ability, compared with the actual road testing manner is more safety, and is suitable for different automatic driving vehicle, normalized to the vehicle lane changing ability evaluation, testing result more authoritative and reliable.", "instruction": "Please summarize the input" }, { "input": "an automatic driving vehicle field by testing ability of signal interference regionThe utility model claims an automatic driving vehicle performance test technical field, especially claims an automatic driving vehicle through testing the ability of field signal interference region. test field facilities, road comprises road and signal interference comprises signal interference-free section and the corresponding facility of signal interference on signal interference section, test field further comprises a first road information of the operation test management centre to the detected vehicle running in signal interference section when the vehicle to be measured. the test field compared to no signal transceiver in the laboratory but only by the simulation environment of machine vision and computer simulated vehicles operating software the theoretical data, more close to the actual running environment, so that the test result can accurately represent the tested vehicle through capacity of signal interference region, and relative to the actual road test is safer, the test field can be used for testing different automatic driving vehicle as standard scene, so that the test result is more authoritative and reliable.|1. An automatic driving vehicle through capacity of signal interference region of the test field, wherein it comprises a signal interference device (2) of road (1) and the section of the road (1) is formed on the signal transmission interference. 
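The running-task condition in claim 1 of the lane-changing test field caps the first reference vehicle's speed below the quotient of the remaining distance and the remaining (or required) travel time, so the vehicle under test cannot meet its task without overtaking. A small sketch of that quotient, with illustrative names only:

```python
# Sketch of the claim-1 speed constraint: the lead (first reference)
# vehicle is held strictly below dist / remaining_time so the vehicle
# under test must change lanes to meet its running task on time.
def lead_vehicle_speed_cap(dist_m, now_s=None, arrive_at_s=None,
                           travel_time_s=None):
    """Upper bound (m/s) for the first reference vehicle's speed, given
    either a required arrival time or a required travel time."""
    if arrive_at_s is not None:
        remaining = arrive_at_s - now_s   # required arrival time - now
    else:
        remaining = travel_time_s         # required travel time
    if remaining <= 0:
        raise ValueError("deadline already passed")
    return dist_m / remaining  # lead vehicle must stay below this
```

For instance, with 1000 m to the destination and 100 s until the required arrival time, the lead vehicle is held below 10 m/s.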
The road (1) comprises a signal-interference-free section (11) and a signal interference section (12) corresponding to the signal interference device (2); the test field further comprises an operation test management centre (4) that sends first road information to the vehicle under test (3) while the vehicle under test (3) is driving in the signal interference section (12); the operation test management centre (4) comprises a wireless communication device with a V2N communication protocol.\n | 2. The test field for testing the ability to pass through a signal interference region according to claim 1, wherein a plurality of signal interference devices (2) are arranged along the road (1); the road (1) comprises a plurality of signal interference sections (12) respectively corresponding to the plurality of signal interference devices (2) and a plurality of signal-interference-free sections (11), and the signal interference sections (12) and the signal-interference-free sections (11) are arranged alternately.\n | 3. The test field for testing the ability to pass through a signal interference region according to claim 1 or 2, wherein the signal interference device (2) is a shelter belt set on both sides of the road (1), a tunnel set above the road (1), a bridge set above the road (1), or a meteorological simulation device that simulates weather interfering with signal transmission.\n | 4. The test field for testing the ability to pass through a signal interference region according to claim 3, wherein the length of the shelter belt is greater than 50 m; the length of the tunnel is greater than or equal to 50 m, and the tunnel is made of concrete, or of concrete and reinforcing steel.
The width of the bridge is greater than or equal to 10 m and its length is greater than 20 m, the bridge being made of concrete, or of concrete and reinforcing steel; the meteorological simulation device comprises a meteorological simulation chamber and, inside the chamber, a weather simulator producing simulated weather that interferes with signal transmission; the road (1) passes through the meteorological simulation chamber, and the simulated weather produced by the weather simulator covers a length of the road greater than or equal to 30 m.\n | 5. The test field for testing the ability to pass through a signal interference region according to claim 3, wherein the shelter belt comprises a plurality of segments with different densities, and the length of each segment is greater than or equal to 20 m.\n | 6. The test field for testing the ability to pass through a signal interference region according to claim 1, further comprising a reference vehicle (6) running on the road (1) at the periphery of the vehicle under test (3), wherein the reference vehicle (6) comprises a wireless communication device with a V2V communication protocol.\n | 7. The test field for testing the ability to pass through a signal interference region according to claim 1, further comprising a road side device (5) that sends road condition information to the vehicle under test (3) before the vehicle under test (3) enters the signal interference section (12); the road side device (5) is located on or beside the signal-interference-free section (11) and comprises a wireless communication device with a V2I communication protocol for sending the road condition information to the vehicle under test (3).\n | 8.
The automatic driving vehicle according to claim 7 the field of capability of the signal interference region, wherein the road side device (5) further comprises a detector capable of automatically detecting the road condition and/or input terminal of the manual input condition, and a processing module for processing the information.\n | 9. The automatic driving vehicle according to claim 1 the test signal of the ability of interference area, wherein the operation testing management centre (4) further comprises: controlling the V2N communication protocol of the wireless communication device in the vehicle (3) into the signal interference section (12) before sending the second road information to the tested vehicle (3), and issuing a first road information of the control module to the tested vehicle (3) in the vehicle (3) to be tested in said signal interference section (12) when driving.\n | 10. The automatic driving vehicle according to claim 1 the field of capability of the signal interference region, further comprising: a device for the interconnection of the tested vehicle (3) by collecting module for collecting the information of the vehicle (3).", "output": "The utility model claims an automatic driving vehicle performance testing technical field, especially claims an automatic driving vehicle through testing field capability signal interference region. testing field comprises facilities, road road and signal interference comprises interference without signal section and a signal corresponding to interference of signal interference, test field further comprises emitting the first road information operation testing management centre to the vehicle when said to-be-detected vehicle driving road section in the signal interference. 
the testing field compared to simulated environment laboratory does not have signal transceiving but only by machine vision and compared with theoretical data of computer simulated vehicles operating software, which is more close to the actual running environment, the testing result can accurately represent the vehicle through capacity of signal interference region, and it is more safe to actual road testing, the testing field as standard scene for testing of different automatic driving vehicle, so as to make the test result more authoritative and reliable.", "instruction": "Please summarize the input" }, { "input": "the automatic driving vehicle avoiding dynamic obstruction capability testing method and testing fieldThe invention claims an automatic driving vehicle performance test technical field, especially claims an automatic driving vehicle barrier capability of avoiding dynamic test method and test field. the test field and the test method uses the test field comprises a road and the dynamic obstacle, scene arranged to control dynamic obstacle according to the travel route of the vehicle moving route of movement, and control the dynamic obstacle when the vehicle to be measured does not avoiding action under the condition that will collide with the vehicle position of the travel route and the moving route of the manner. so it is more close to the actual running environment, so that the test result can more accurately represent the vehicle the dynamic obstacle avoidance ability, and, compared with the actual road test is more safe. Furthermore, the test field and the test method can be used for testing different automatic driving vehicle, so as to make the test result more authoritative and reliable.|1. 
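The interference sections above exercise the tested vehicle's behaviour when V2N messages stop arriving. One simple way a vehicle might detect that condition is a receive-timeout watchdog; the sketch below, including its class name, timeout value, and state labels, is an illustrative assumption and not part of the claims:

```python
# Sketch: a watchdog that flags loss of the V2N link inside an
# interference section, so the vehicle can fall back to on-board sensing.
class V2NLinkMonitor:
    def __init__(self, timeout_s=1.0):
        self.timeout_s = timeout_s  # max silence before link is degraded
        self.last_rx = None         # timestamp of last received message

    def on_message(self, t_s):
        """Record the receive time of a V2N message."""
        self.last_rx = t_s

    def state(self, now_s):
        """'CONNECTED' while messages are fresh, else 'DEGRADED'."""
        if self.last_rx is None or now_s - self.last_rx > self.timeout_s:
            return "DEGRADED"
        return "CONNECTED"
```

A vehicle entering a tunnel or shelter-belt section would then see the monitor flip to the degraded state within one timeout period of the last message.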
A method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle, wherein it comprises a test-field scene arrangement and a response of the vehicle under test. The scene arrangement is as follows: the test field comprises a road and a dynamic obstacle, and the vehicle under test is placed on the road; the dynamic obstacle is controlled to move along a moving route according to the travel route of the vehicle under test, and is controlled to move in such a way that, if the vehicle under test takes no avoidance action, the obstacle will collide with the vehicle under test at the position where the travel route and the moving route intersect. The response of the vehicle under test is that it runs along the travel route on the road and then avoids the moving dynamic obstacle so as to avoid or reduce a collision with the dynamic obstacle.\n | 2. The method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle according to claim 1, wherein the scene arrangement further includes: while a distance allowing avoidance remains between it and the vehicle under test, the dynamic obstacle is controlled to appear, within the machine-vision acquisition range of the vehicle under test, at a position intersecting the travel route of the vehicle under test.\n | 3. The method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle according to claim 1 or 2, wherein the scene arrangement further comprises controlling a plurality of dynamic obstacles to move along moving routes according to the travel route of the vehicle under test, and the vehicle response further comprises the vehicle under test completing avoidance of each dynamic obstacle.\n | 4. The method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle according to claim 1 or 2, wherein the scene arrangement further comprises: the dynamic obstacle is a dummy, an animal model or a non-motor vehicle; the dummy,
animal model or non-motor vehicle comprises a device that carries or sends location information and/or motion information, and the vehicle response further comprises the vehicle under test collecting the location information and/or motion information, analyzing and processing it in combination with its own state, and finally completing the avoidance action so as to avoid a collision with the dummy, animal model or non-motor vehicle. Alternatively, the scene arrangement further includes that the dynamic obstacle is a motor vehicle; the motor vehicle comprises a wireless communication device with a V2V communication protocol, through which it sends its travel-route information; the vehicle response further comprises the vehicle under test receiving the travel-route information, analyzing and processing it in combination with its own state, and finally completing the avoidance action so as to avoid or reduce a collision with the motor vehicle.\n | 5. The method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle according to claim 1 or 2, wherein the scene arrangement further comprises: the test field further comprises an operation test management centre that sends road condition information to the vehicle under test, and the vehicle response further comprises the vehicle under test receiving the road condition information, analyzing and processing it in combination with its own state, and finally completing the avoidance action with respect to the dynamic obstacle so as to avoid or reduce a collision with the dynamic obstacle.\n | 6.
The method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle according to claim 1 or 2, wherein the scene arrangement further comprises: the test field further comprises a road side device that, after detecting a road condition containing a dynamic obstacle, sends warning information to the vehicle under test; the vehicle response further comprises the vehicle under test receiving the warning information sent by the road side device, analyzing and processing it in combination with its own state, and finally completing the avoidance action with respect to the dynamic obstacle so as to avoid or reduce a collision.\n | 7. The method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle according to claim 1 or 2, wherein the scene arrangement further comprises: the test field further comprises a signal lamp and a sight barrier; the signal lamp is installed on the road and indicates that the vehicle under test may pass; the sight barrier blocks the vehicle under test from finding the position of the dynamic obstacle through machine vision.\n | 8. The method for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle according to claim 1 or 2, wherein the vehicle response further comprises one or more of the following: a. the vehicle under test reminds following vehicles of the road condition ahead; b. the vehicle under test performs a warning action and/or sends out warning information towards the dynamic obstacle.\n | 9. A test field for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle, wherein it comprises a road and a dynamic obstacle; the moving route of the dynamic obstacle intersects the travel route of the vehicle under test at a crossing position on the road, and the dynamic obstacle can move in such a way that, if the vehicle under test takes no avoidance action, the obstacle will collide with the vehicle under test at the crossing position.\n | 10.
The test field for avoiding dynamic obstruction capability the automatic driving vehicle according to claim 9, wherein said dynamic obstacles can have vehicle capable of avoiding distance, position with the intersecting between the testing vehicle appearing in the mechanical visual acquisition range of the vehicle to be measured, and the dynamic obstacle is dummy, animal model, non-motor vehicle or motor vehicle, wherein, said dummy, animal model or non-motor comprises carrying or sending device location information and/or motion information; the vehicle includes a wireless communication device capable of transmitting travel path information of the vehicle, has V2V communication protocol, the test field further comprises running test management centre, the operation test management centre comprises sending road condition information to the vehicle to be measured. Wireless communication equipment with V2N communication protocol, the test field further comprises set on the road or road side of the device, the road side device comprises an automatic detector for detecting the road condition, the road related information received from the detector of the analytical process to form pre-warning information processing module; and wireless communication equipment with V2I communication protocol, the V2I communication protocol of the wireless communication device is capable of sending the warning information to the vehicle to be measured, the test field further comprises a signal lamp and a sight barrier, the signal lamp is installed on the road and allows the vehicle to be measured, the line-of-sight barrier blocking the vehicles to be detected through machine vision to find the position of the dynamic barrier.", "output": "The method involves placing a vehicle on a road for controlling dynamic obstacle according to a traveling route and a vehicle moving route. The dynamic obstacle is controlled under action avoiding condition. 
A vehicle position is determined by using the traveling route for moving a vehicle. Vehicle running response is obtained according to the traveling route on the road for avoiding dynamic obstacle movement and dynamic obstacle collision. Machine vision acquisition range of the detected vehicle position is determined based on the traveling route. An INDEPENDENT CLAIM is also included for an automatic driving vehicle performance testing device. Automatic driving vehicle performance testing method. The method enables testing different automatic driving vehicle so as to improve result testing reliability and automatic driving vehicle barrier capability. The drawing shows a schematic view of an automatic driving vehicle performance testing method.", "instruction": "Please summarize the input" }, { "input": "Automatic obstacle avoiding dynamic capability test field driving vehicleThe utility model claims an automatic driving vehicle performance test technical field, especially claims an automatic driving vehicle barrier capability of avoiding dynamic test field. test field comprising running route of road and the dynamic obstacle, a movement route of the dynamic obstacle, crossing with the tested vehicle is located on the road, and a dynamic obstacle to the to-be-tested vehicle does not make action under the condition that will collide with the vehicle at a running route and the movement route of the motion. In one aspect, such a test field compared to a computer simulated vehicles operating software the theoretical data, more close to the actual running environment, so that the test result can more accurately represent the ability of obstacle avoiding dynamic measured vehicle, and compared with the actual road of road safer measuring, on the other hand, the test field can be used for testing different automatic driving vehicle as standard scene, so that the test result is more authoritative and reliable.|1. 
A test field for testing the ability of an automatic driving vehicle to avoid a dynamic obstacle, wherein it comprises a road and a dynamic obstacle; the moving route of the dynamic obstacle intersects the travel route of the vehicle under test at a crossing position on the road; the dynamic obstacle can move along a track of its moving route in such a way that, if the vehicle under test takes no avoidance action, the obstacle will collide with the vehicle under test at the crossing position; the test field defines the track of the dynamic obstacle and a drive device that drives the dynamic obstacle along the track.\n | 2. The test field for testing dynamic-obstacle avoidance according to claim 1, wherein the dynamic obstacle can move into the machine-vision acquisition range of the vehicle under test while a distance allowing avoidance remains between the vehicle under test and the crossing position.\n | 3. The test field for testing dynamic-obstacle avoidance according to claim 1, further comprising a road side device that automatically detects the road condition and sends warning information to the vehicle under test when a road condition containing a dynamic obstacle is detected.\n | 4. The test field for testing dynamic-obstacle avoidance according to claim 3, wherein the road side device comprises a detector that automatically detects the road condition, a processing module that receives the road-condition-related information from the detector and analyzes and processes it to form the warning information, and a wireless communication device with a V2I communication protocol; within the road side device, the detector is connected with the processing module, and the processing module is communicatively connected with the wireless communication device for information transmission; the wireless communication device with the V2I communication protocol can send the warning information to the vehicle under test.\n | 5.
The test field for testing dynamic-obstacle avoidance according to claim 1, further comprising an operation test management centre, wherein the operation test management centre comprises a wireless communication device with a V2N communication protocol that sends road condition information to the vehicle under test.\n | 6. The test field for testing dynamic-obstacle avoidance according to claim 5, wherein the dynamic obstacle is a dummy, an animal model or a non-motor vehicle; the dummy, animal model or non-motor vehicle comprises a mobile device carrying or sending location information and/or motion information; the operation test management centre is communicatively connected with the drive device so as to control the drive device to drive the dummy, animal model or non-motor vehicle to move.\n | 7. The test field for testing dynamic-obstacle avoidance according to claim 5, wherein the dynamic obstacle is a motor vehicle and the motor vehicle is an unmanned vehicle; the motor vehicle comprises a wireless communication device, with V2V and V2N communication protocols, that can send the travel-route information of the motor vehicle; the operation test management centre and the motor vehicle are communicatively connected through their respective wireless communication devices, so that the operation test management centre controls the running of the motor vehicle.\n | 8. The test field for testing dynamic-obstacle avoidance according to claim 1, further comprising a signal lamp, wherein the signal lamp is installed on the road and indicates that the vehicle under test may pass.\n | 9. The test field for testing dynamic-obstacle avoidance according to claim 1, further comprising a sight barrier, wherein the sight barrier blocks the vehicle under test from finding the position of the dynamic obstacle through machine vision.\n | 10.
The test field according to claim 1, further comprising a collecting module interconnected with the tested vehicle for collecting information from the tested vehicle.", "output": "The utility model relates to the technical field of automatic driving vehicle performance testing, and in particular to a test field for the dynamic-obstacle avoidance capability of an automatic driving vehicle. The test field comprises a road and a dynamic obstacle whose movement route crosses the running route of the tested vehicle on the road; the dynamic obstacle moves so that it would collide with the tested vehicle at the crossing if the tested vehicle took no avoiding action. On one hand, compared with the theoretical data of computer-simulated vehicle operation software, such a test field is closer to the actual running environment, so the test result more accurately represents the dynamic-obstacle avoidance capability of the tested vehicle, and testing in it is safer than on actual roads; on the other hand, the test field can serve as a standard scene for testing different automatic driving vehicles, making the test result more authoritative and reliable.", "instruction": "Please summarize the input" }, { "input": "Automatic driving vehicle approaching-vehicle avoidance capability testing method and test field. The invention relates to the technical field of automatic driving vehicle performance testing, and in particular to a method and a test field for testing an automatic driving vehicle's capability to avoid an approaching vehicle. 
The test field used by the test method comprises a road and a first reference vehicle. In the scene arrangement, the first reference vehicle is controlled to run in front of or behind the tested vehicle while keeping an avoidance distance, to run side by side with the tested vehicle, or to travel on a lane adjacent to the tested vehicle; the first reference vehicle is then controlled to change speed and/or direction of travel to approach the tested vehicle. This is closer to the actual running environment, so the test result more accurately shows the tested vehicle's capability to avoid an approaching vehicle, and it is safer than actual road testing. Secondly, by matching against the tested vehicle's response, the evaluation of the vehicle's avoidance capability is standardized, making the test result authoritative and reliable.|1. A method for testing the approaching-vehicle avoidance capability of an automatic driving vehicle, comprising a test field scene arrangement and a tested vehicle response. The scene arrangement is: the test field comprises a road and a first reference vehicle, and the tested vehicle is placed on the road; the first reference vehicle is controlled to run in front of or behind the tested vehicle while keeping an avoidance distance, to run side by side with the tested vehicle, or to travel on a lane adjacent to the tested vehicle; the first reference vehicle is then controlled to change speed and/or direction of travel to approach the tested vehicle. The tested vehicle response is: the tested vehicle begins by travelling normally on the road and, when the first reference vehicle approaches, takes avoiding action to avoid or mitigate a collision with the first reference vehicle.\n | 2. 
The approaching-vehicle avoidance capability test method according to claim 1, wherein the scene arrangement further comprises: the first reference vehicle comprises wireless communication equipment with the V2V communication protocol and, while approaching the tested vehicle, sends early-warning information about its change of speed and/or direction of travel to the tested vehicle through the V2V wireless communication device; the tested vehicle response further comprises: the tested vehicle receives the early-warning information sent by the first reference vehicle, analyzes and processes it in combination with its own state, and finally completes the avoidance of the first reference vehicle, avoiding or mitigating a collision with the first reference vehicle.\n | 3. The approaching-vehicle avoidance capability test method according to claim 2, wherein the scene arrangement further comprises: the test field further comprises a road side device set on or beside the road; the road side device comprises a detector for automatically detecting the road condition, a processing module that analyzes the road condition information received from the detector to form early-warning information, and a wireless communication device with the V2I communication protocol that sends the early-warning information formed by the processing module to the tested vehicle; the wireless communication device of the first reference vehicle further supports the V2I communication protocol, and the first reference vehicle sends warning information to the road side device through its wireless communication device. 
The road side device, after receiving the warning information sent by the first reference vehicle, sends early-warning information to the tested vehicle; the tested vehicle response further comprises: the tested vehicle receives the early-warning information sent by the road side device, analyzes and processes it in combination with its own state, and finally completes the avoidance of the first reference vehicle, avoiding or mitigating a collision with the first reference vehicle.\n | 4. The approaching-vehicle avoidance capability test method according to claim 1, wherein the scene arrangement further comprises: the test field further comprises a second reference vehicle controlled to travel on the road around the tested vehicle; the tested vehicle response further comprises: the tested vehicle takes avoiding action against the first reference vehicle while avoiding collision with the second reference vehicle.\n | 5. The approaching-vehicle avoidance capability test method according to claim 4, wherein the scene arrangement further comprises: the second reference vehicle comprises wireless communication equipment with the V2V communication protocol; the tested vehicle response further comprises: the tested vehicle sends early-warning information to the second reference vehicle.\n | 6. The approaching-vehicle avoidance capability test method according to claim 1, wherein the scene arrangement further comprises: the test field further comprises a road safety facility on or beside the road, and the tested vehicle runs on the lane adjacent to the road safety facility; the tested vehicle response further comprises: the tested vehicle takes avoiding action against the first reference vehicle while avoiding collision with the road safety facility.\n | 7. 
The approaching-vehicle avoidance capability test method according to claim 1, wherein the scene arrangement further comprises: the test field further comprises an operation test management centre comprising a wireless communication device with the V2N communication protocol; the tested vehicle response further comprises: the tested vehicle sends road condition information to the operation test management centre.\n | 8. The approaching-vehicle avoidance capability test method according to claim 1, wherein the tested vehicle response further comprises one or several of the following: a. the tested vehicle reminds the vehicle behind of the road condition ahead; b. the tested vehicle keeps itself on the road while taking avoiding action against the first reference vehicle; c. the tested vehicle performs an alarm action.\n | 9. A test field for the approaching-vehicle avoidance capability of an automatic driving vehicle, comprising a road and a first reference vehicle, wherein the first reference vehicle can run in front of or behind the tested vehicle while keeping an avoidance distance, run side by side with the tested vehicle, or run opposite to the tested vehicle on an adjacent lane, and then change speed and/or direction of travel to approach the tested vehicle.\n | 10. The test field for the approaching-vehicle avoidance capability of an automatic driving vehicle according to claim 9, further comprising a second reference vehicle that can travel on the road around the tested vehicle. 
The second reference vehicle comprises wireless communication equipment with the V2V communication protocol. The test field further comprises: a road side device set on or beside the road, the road side device comprising a detector for automatically detecting the road condition, a processing module that analyzes the road condition information received from the detector to form early-warning information, and a wireless communication device with the V2I communication protocol through which the early-warning information formed by the processing module is sent to the tested vehicle; the first reference vehicle comprising a wireless communication device with the V2V and V2I communication protocols, the first reference vehicle sending warning information to the road side device and the tested vehicle through its wireless communication device, and the road side device, after receiving through its wireless communication device the warning information sent by the first reference vehicle, sending early-warning information to the tested vehicle through its wireless communication device; a road safety facility set on or beside the road; and an operation test management centre comprising a wireless communication device with the V2N communication protocol, the wireless communication device of the first reference vehicle further supporting the V2N communication protocol, and the operation test management centre and the first reference vehicle being communicatively connected through their respective V2N wireless communication devices so that the operation test management centre controls the running of the first reference vehicle.", "output": "The method involves placing a main vehicle on a road. A reference vehicle is controlled to run side by side with the main vehicle while maintaining an avoidance distance during driving. 
The reference vehicle may also travel with the main vehicle in an adjacent lane on the road. The reference vehicle is controlled to change vehicle speed and/or direction of travel toward the main vehicle, and the main vehicle must act to avoid a collision between the reference vehicle and itself. The reference vehicle is provided with wireless communication equipment supporting a vehicle-to-vehicle (V2V) communication protocol. An INDEPENDENT CLAIM is also included for a test field for the approaching-vehicle avoidance capability of an automatic driving vehicle. Automatic driving vehicle capability testing method. Controlling the reference vehicle to change speed and/or direction of travel toward the main vehicle, and requiring the main vehicle to avoid the collision, yields an authoritative and reliable test result. The drawing shows a schematic view of an automatic driving vehicle capability testing system.", "instruction": "Please summarize the input" }, { "input": "Automatic driving vehicle lane change capability testing method and test field. The invention relates to the technical field of automatic driving vehicle performance testing, and in particular to a test method and a test field for automatic driving vehicle lane change capability. The test field used by the test method comprises a road, an operation test management centre and a first reference vehicle; the road comprises adjacent first and second lanes; the tested vehicle is placed on the first lane; the first reference vehicle runs on the first lane ahead of the tested vehicle; the operation test management centre then sends a lane change command or a driving task command to the tested vehicle. 
The test method and test field are closer to the actual running environment, so the test result more accurately shows the lane change capability of the tested vehicle; the approach is safer than actual road testing, and because it suits different automatic driving vehicles it standardizes the evaluation of lane change capability, making the test result more authoritative and reliable.|1. A test method for automatic driving vehicle lane change capability, comprising a test field scene arrangement and a tested vehicle response. The scene arrangement is: the test field comprises a road, an operation test management centre and a first reference vehicle; the road comprises adjacent first and second lanes; the tested vehicle is placed on the first lane; the first reference vehicle is controlled to run on the first lane in front of the tested vehicle; the operation test management centre then sends a lane change command or a driving task command to the tested vehicle, wherein the driving task command includes the mileage from the tested vehicle's position to a destination and a required arrival time or a required travel time, and when the driving task command is sent, the running speed of the first reference vehicle is controlled to be less than the mileage divided by the time remaining until the required arrival time, or less than the mileage divided by the required travel time. The tested vehicle response is: the tested vehicle follows the first reference vehicle on the first lane and, after receiving the lane change command or the driving task command, safely enters the second lane.\n | 2. 
The lane change capability test method according to claim 1, wherein the scene arrangement further comprises: when the lane change command is issued, the command requires changing lanes twice between the two adjacent lanes so as to overtake the first reference vehicle; the tested vehicle response further comprises: after safely entering the second lane, the tested vehicle safely re-enters the first lane, ending up ahead of the first reference vehicle.\n | 3. The lane change capability test method according to claim 1 or 2, wherein the scene arrangement further comprises: the test field further comprises a second reference vehicle controlled to run on the second lane around the tested vehicle, in the same direction as the tested vehicle and the first reference vehicle, the second reference vehicle keeping a lane-change distance from the tested vehicle.\n | 4. The lane change capability test method according to claim 3, wherein the scene arrangement further comprises: the first reference vehicle and the second reference vehicle each comprise wireless communication equipment with the V2V communication protocol, through which they receive the lane change request sent by the tested vehicle and reply with consent information after receiving the request; the tested vehicle response further comprises: the tested vehicle sends a lane change request to the first and second reference vehicles and performs the lane change action after receiving the consent information replied by them.\n | 5. The lane change capability test method according to claim 4, wherein the operation test management centre comprises a wireless communication device with the V2N communication protocol. 
The wireless communication device of the first reference vehicle further supports the V2N communication protocol, and the operation test management centre and the first reference vehicle are communicatively connected through their respective V2N wireless communication devices so that the operation test management centre controls the running of the first reference vehicle; the wireless communication device of the second reference vehicle further supports the V2N communication protocol, and the operation test management centre and the second reference vehicle are communicatively connected through their respective V2N wireless communication devices so that the operation test management centre controls the running of the second reference vehicle.\n | 6. The lane change capability test method according to claim 1, wherein the scene arrangement further comprises: the test field further comprises a third reference vehicle controlled to travel on the second lane opposite to the first reference vehicle; the operation test management centre comprises wireless communication equipment with the V2N communication protocol, the third reference vehicle comprises wireless communication equipment with the V2N communication protocol, and the operation test management centre and the third reference vehicle are communicatively connected through their respective V2N wireless communication devices so that the operation test management centre controls the running of the third reference vehicle; the first reference vehicle comprises wireless communication equipment with the V2V communication protocol, and the wireless communication device of the third reference vehicle also supports the V2V communication protocol. 
The first reference vehicle and the third reference vehicle receive, through their wireless communication devices, the lane change request sent by the tested vehicle and reply with consent information after receiving the request; the tested vehicle response further comprises: the tested vehicle sends a lane change request to the first and third reference vehicles and performs the lane change action after receiving the consent information replied by them.\n | 7. The lane change capability test method according to claim 1, wherein the scene arrangement further comprises: the test field further comprises a road side device set on or beside the road, the road side device comprising a detector for automatically detecting the road condition, a processing module that analyzes the road condition information received from the detector to form road condition information, and a wireless communication device with the V2I communication protocol that sends the road condition information formed by the processing module to the tested vehicle before the tested vehicle changes lanes; the tested vehicle response further comprises: the tested vehicle receives the road condition information sent by the road side device, analyzes and processes it in combination with its own state, and finally completes the lane change action.\n | 8. The lane change capability test method according to claim 1, wherein the scene arrangement further comprises: the operation test management centre comprises a wireless communication device with the V2N communication protocol. 
The operation test management centre sends the lane change command or the driving task command to the tested vehicle through the wireless communication device, and further sends road condition information ahead of the tested vehicle to the tested vehicle; the tested vehicle response further comprises: the tested vehicle receives the road condition information sent by the operation test management centre, analyzes and processes it in combination with its own state, finally completes the lane change action, and sends lane change information to the operation test management centre.\n | 9. A test field for automatic driving vehicle lane change capability, comprising: a road comprising adjacent first and second lanes; a first reference vehicle that can run on the first lane in front of the tested vehicle; and an operation test management centre comprising a wireless communication device with the V2N communication protocol, through which the operation test management centre can send a lane change command or a driving task command to the tested vehicle, wherein the driving task command includes the mileage from the tested vehicle's position to a destination and a required arrival time or a required travel time, and when the driving task command is sent, the travel speed of the first reference vehicle is less than the mileage divided by the time remaining until the required arrival time, or less than the mileage divided by the required travel time.\n | 10. 
The lane change capability test field according to claim 9, further comprising: a second reference vehicle or a third reference vehicle, wherein the second reference vehicle can run on the second lane around the tested vehicle, in the same direction as the tested vehicle and the first reference vehicle, keeping a lane-change distance from the tested vehicle, and the third reference vehicle can run on the second lane opposite to the first reference vehicle; a road side device set on or beside the road, the road side device comprising a detector for automatically detecting the road condition, a processing module that analyzes the road condition information received from the detector to form road condition information, and wireless communication equipment with the V2I communication protocol, the road side device being able to send the road condition information formed by the processing module to the tested vehicle through the wireless communication device before the tested vehicle changes lanes, and the operation test management centre further sending road condition information to the tested vehicle before the tested vehicle changes lanes; wherein the first reference vehicle comprises a wireless communication device with the V2V and V2N communication protocols, can receive through it the lane change request sent by the tested vehicle and reply with consent information after receiving the request, and is an unmanned vehicle, the operation test management centre and the first reference vehicle being communicatively connected through their respective V2N wireless communication devices so that the operation test management centre controls the running of the first reference vehicle; wherein the second reference vehicle comprises wireless communication equipment with the V2V and V2N communication protocols, can receive through it the lane change request sent by the tested vehicle and return consent information after receiving the request, and is an unmanned vehicle, the operation test management centre and the second reference vehicle being communicatively connected through their respective V2N wireless communication devices so that the operation test management centre controls the running of the second reference vehicle; and wherein the third reference vehicle comprises wireless communication equipment with the V2V and V2N communication protocols, can receive through it the lane change request sent by the tested vehicle and return consent information after receiving the request, and is an unmanned vehicle, the operation test management centre and the third reference vehicle being communicatively connected through their respective V2N wireless communication devices so that the operation test management centre controls the running of the third reference vehicle.", "output": "The method involves placing a vehicle on a first lane. The vehicle's running on the first lane is controlled. A switching command is transmitted to the vehicle by an operation test management center, where the switching command includes vehicle mileage, required arrival time and running time period. Alternatively a running task command is sent, in which case the running speed of the reference vehicle is kept below the mileage divided by the required arrival time period. The vehicle response is determined. 
The vehicle enters a second lane when the lane change command or the running task command is received. Automatic driving vehicle lane change capability testing method. The method reproduces the actual vehicle running environment for measuring the lane change capability of the vehicle, improving test result accuracy and reliability with high safety. The drawing shows a schematic view of an automatic driving vehicle lane.", "instruction": "Please summarize the input" }, { "input": "Apparatus and system to manage monitored vehicular flow rate. The apparatus and system manage and adjust monitored traffic density and/or speed in relation to spatial locational flow rates through a plurality of mobile and/or stationary encrypted communication device sensors. The system transmits calculated digital navigational directives throughout a network of domains within any infrastructure, in direct secure communication with humans, drivers and/or owners and/or the vehicle transport mechanisms themselves, viewable and/or audible on the iNavX2 virtual interface, providing on-demand dimensional spatial mapping locational services and driver- and/or vehicle-generated requests for certified advertisements, and tracking transmitted navigational maneuvers. The invention claimed is: \n | 1. 
A method, comprising a secure navigational system and apparatuses networked within at least one domain and configured within at least one infrastructure for a manageability of monitored aggregate vehicular traffic density maintaining variable sustained vehicle velocity, and further comprises the following elements: characterized by \nat least one central system server configured to perform calculable equations; transmit navigational directives, suggestions and advisements for purpose of to: track, record, measure and maintain spacial density between each transport at a calculated speed in relationship to phase-change spatial analytics from artifacts calculated from current positional points in relationship to prior at least one future positional phase point in space time location for at least one vehicle, a human, motorcycle, bicycle transmitting current Longlat coordinate location through a plurality of encrypted communication devices to the system; \nwherein system transmits modifiable speed variations consistent with original routes destination time frames, and provide alternate routes based on current computed location with projected congestion artifact and density computed variables, each comprised of weather variations causing reductions in flow rates, speed or velocity of at least one vehicle transports calculations and for each transport, a mass of vehicle transports transmitting calculated adjustments in-from forecasted equations in to one or more vehicles in one or more vector areas within at least one domain, detecting a predicated or predictable eventuality of congestion level change in future time space, responding to driver, or from vehicles navigational device requests, system determines vehicles mechanism anticipated communication interoperability issues in communication time-lag in responses to transmissions or requested data from the system, including ecomm-advice, one or more a plurality of certified complaint encrypted communication devices, a 
vehicles equipped with Qbvipro, pAvics and other similar certified compliant communication device equipped for vehicles transports, and one or more programs configure within system calculates and advises one or more vehicles, drivers, owners, operators or vehicles controller requested alternate routes or system generated based on density factors, constructions areas along with human factors, vehicle capabilities and capacity to navigate: \nwherein the system computes and compares speed density variables in calculated flow rates from change in weather conditions on selected routes or entire areas affected by atmospheric changes to assist minimizing the need to break inertia, control safe density factors and manage flow rates, maximizing constant variable velocity throughout for at least one vehicle transport or a plurality of vehicle transports within the domain that is networked within an infrastructure, and transmits to a single vehicle or at least one vehicle within a mass that is not registered due to communication error and that is registered authenticated link sync'd and paired with a plurality of ecomm-devices and system; \nwherein the system comprises adjustments in calculated configured spatial density occur in proportion with current velocity for at least one transport within a mass or a cluster of transports or a single vehicle transmitting encrypted navigational directives, and further performs calculated adjustments from spatial phase-change density in future time in proportional to spatial velocity continuously, including cross-layer intersections, modifying speed to one or more transports calculated from optimized overlay models allowing vehicular traffic to flow safely; and \nfurther the secure navigational system and each encrypted communication device apparatus, comprises one or more non-transitory computer-implemented programmed methods, further comprised of a plurality of storage mediums, in communication with and comprised of at least one server 
configured in a localized cloud sync'd enterprise configuration communicating through a sub navigational system transmitting, receiving reactive navigational directives, recording die amount of vehicular traffic proximity density at a time frame (moment) relative to a specific point in time space, compared with the speed of the a calculated comparative clustered mass area chosen and compared with other determined vehicles within the same mass, calculating common variations in predictable calculated overlay models and provide calculable results to manage vehicle flow rates at future points in time space or within a selected vector area, and \nfurther comprising archived data-analytics, real-time data artifacts and other realtime telematic data transmitted to system from a plurality of device sensors synchronize link sync'd with at least one vehicle transport, transmitting engine informatics through a plurality of link sync'd ecomm-device sensors, in further communication with at least one networked server hub each authenticate, sync-d, paired and securely registered within at least one domain networked connected to each in direct communication with each other and at least one central server within at least one domain networked infrastructure; \nwherein system further comprises a plurality of communications and advertisements encrypted for secure transmissions between vehicles, a plurality of ecomm-devices sensors including Object Functionality Points or Proximity Integration on vehicles and servers within a domain networked architecture; and \nwherein system server, connected cloud based synchronized servers and all other ecomm-devices OS, NOS and other certified applications used within the domain networked infrastructure that receives and transmits a plurality of data, information, directives, suggestions and advertisements including certified tribal entities, owners, and drivers viewable device the construct comprises one or more of a plurality of machine 
languages, one or more parts thereof converted to binary, and one or more parts of the machine languages and their converted binary parameters or parts thereof, along with languages converted to binary codes, are encrypted; \nwherein dominion of all data analytics, data facts, data relics and all other telematic data transmitted, received, extracted, responded and acknowledged to and equipped with at least one transmitter, receiver, transponder and other certified compliant telematic ecomm-devices, including a plurality of towers in direct communication with a plurality of vehicle transports, purposely generating income for each city, township and municipality, state and federal DOT; \nwherein the plurality of ecomm-device sensors are each equipped with at least one network interface for each stationary and mobile ecomm-device and other compliant certified devices and servers in communication with one or more of a plurality of remote mobile and stationary encrypted device sensors equipped with Avics, each further constructed of and equipped with a media suitable for storing and processing electronic instructions to maneuver vehicle transports, to increase, decrease flow rates, velocity or speed, instruct vehicle transport or its mechanism and/or a human to turn on vehicle lights, blinkers, transmitting instructions and communications to vehicles, mechanisms or humans from system-generated or one or more ecomm-device or Obvipro detected evasive maneuverable recommendations, directives, instructions and lane change acquisition, and further comprises other navigational maneuvers, to perform a plurality of tasks, procedures, performing analytical calculations, computations and other mathematical equations embedded within, and comprised on at least one iChipSet RFIDGPS equipped with a transponder/receiver/transmitter and entire device comprises a plurality of modules; \nwherein said receiving and transmitting data comprises encapsulated encrypted data
exchange or push transmissions from a plurality of at least one ecomm-device and other certified compliant devices, comprised of a configured plurality of at least one iChipset programmed to perform at least one task or a plurality of procedures, events and calculations to disseminate informatics, either current or artifacts; \nwherein one or more ecomm-device sensors and vehicles communicate through one or more sub navigational systems (subnavsys), computing devices at configurable variable frequencies to transmit, receive and respond to transponder-requested signal data from a plurality of strategically calculated positioned stationary and mobile communication ecomm-device sensors and other compliant certified devices attached to a plurality of vehicle transports or from humans; and \nfurther equipped with at least one Avics iChipset module constructed in and configured as a hardware, software downloadable application, a hardware software device or combination thereof, comprising Obvipro, pAvics and other certified ecomm-devices within any domain networked infrastructure; system transmits dimensional mapping locational services displayed on a virtual interface calculable from positional points transmitted by vehicle devices and a plurality of smart devices locational services; wherein each infrastructure comprises one or more residential domains, shopping center domains, and to be determined by a configured network protocol for each domain that is networked within municipalities, city or townships infrastructure, that may further be comprised of a plurality of modules, additional cameras and a plurality of other certified compliant communication devices and sensors within one or more specific networked domains for additional security measures in certain vector areas, in conjunction with a plurality of hub sensor devices within at least one vehicular traffic domains infrastructure; \nwherein each interface being configured to transmit, receive and to
respond to at least one transponder signal of different frequencies at the same time that may pulsate; \nwherein each transmitted dataset, data analytics, data facts and artifacts from vehicle transports' mechanical, mechanical-electrical and electrical sensors and all other system ecomm-devices and transmitted ecomm-advice each is timestamped for traffic data and other informatics entering and exiting a plurality of encrypted communication devices link sync'd together within a cryptic VPI connection using encrypted data push transmissions, via a VCPI (Virtual Cryptic Private Infrastructure) tunnel to one or more of a plurality of encrypted communication devices, and by other secure means collecting, transmitting, receiving and responding to transponder signals, and responding signal data feeds into a central processing complex of at least one link sync'd cloud based sync'd server, and system responds with at least one encrypted Paired-Key acknowledging ecomm-device, Obvipro, pAvics and other certified compliant devices within the network infrastructure; \nwherein said plurality of each encrypted ecomm-devices are load-networked within a given domain networked area channeling said plurality of communications parts through one or more selected determined configured sub navigational system tracked paths or routes to system server; \nwherein infrastructure transmission devices are sync'd to each other after registration and authenticated and paired with and to at least one other communication sensor device, vehicle, a server or the plurality of ecomm-devices in direct communication with each other and system and link sync'd servers with system server, assisting in threat intelligence analytics and analysis; \nwherein traffic data further comprises: vehicles' phase-change spatial location, positional congestion artifact relationship with past phase-change position to calculated future phase-change spatial positional point in time space, vehicle informatics transmitted to system calculating
forecasted flow rates in future spatial time for at least one vehicle transport within a mass or a single vehicle and other calculated derived intelligence necessary to perform real-time calculable equations to securely navigate vehicular traffic, further received from and by a plurality of mobile ecomm-device sensors deployed in a plurality of vehicles transmitting throughout a plurality of one or more vector-hub class sensor device hubs; \nwherein system further comprises at least one AlphaVectorHub and AlphaHubs, and one or more are combined with other sub hub-class and other certified compliant communication device sensors, nodes and virtual nodes transmitting secure selected advertisement requests from one or more drivers, vehicles, transports, humans and a plurality of mechanisms strategically arranged along one or more roadways, each device and advertisement is displayed in a virtual reality interface configured within iNavX2 for a plurality of transports and same being viewable in and configured to display same in at least one iNavCom center and facilities; \nwherein stored particulars comprise a plurality of strategically located servers, each having at least one non-shared encrypted database, in secure direct communication with at least one central server within each domain networked infrastructure, purposely to perform calculations, to detect approaching and passing vehicles, archive data artifacts and relics and disseminate shared data across tribal entities, and for the purpose of reconstructing anomalies and accidental occurrences from humans, bicycles, motorcycles, and the derived causation of a plurality of system disturbances along with Long/Lat positional time points before and after incident of each, including on private property; and \nfurther supported by one or more data facts comprised of: telematic data archived and indexed, comprised of: time intervals of communications and distance/time data entered/exited a plurality of compliant certified
communication devices and cameras; vehicle travel speed, volume in relationship with topography and climatic conditions, flow rates and density and ecomm-device markers and time intervals and their VarChk Index, including transponder and response signal indicators and all other monitored, recorded, categorized and cataloged informatics for each driver or owner and their associated elements and factors including registered location for each vehicle, transport, motorcycle, bicycle and human recorded in an indexed data book, each analytically compared independently and combined with one or more data artifacts collected, computing precise geographic positional location, and further verified from renderings and overlay map-objects created by each affected vehicle involved in accident or mishap, the computed result of which determines whether the human driver or owner was at fault, or communications interoperability issues with system or ecomm-devices, Obvipro, pAvics or other certified compliant devices or a plurality of vehicles or transports were found, or a mechanism itself was the causation of accident, incident or a mishap; \nwherein a central server, ecomm-devices, Obvipro, pAvics and other certified compliant devices are configured to receive, transmit and respond to transponder signals each of variable frequencies that may pulsate and one or more parts are encrypted, communicating a plurality of traffic data artifacts and informational data to and from a plurality of calculated strategically positioned remote stationary and mobile hub ecomm-devices, sensors, and other certified compliant device sensors throughout a secure sync'd linked network within a virtual private infrastructure; therein \ncreating a channeled telematics network from a plurality of encrypted communication devices and sensors equipped with at least one Avics iChipset, where each Obvipro, pAvics and other certified compliant encrypted communication devices and sensors use UPnP telematic
discovery service; and further transmitting and receiving calculated encrypted digital ecomm-advice directives to and from traffic vector-hub class communication sensor devices updating traffic data in one or more non-shared databases; and \ntransmitting continuous updates for the purpose of calculating traffic density in proportion to rate flow for one or more vehicles traversing along one or more roadways based on updated traffic dynamic data; and \ntransmit timing adjustments for speed, flow rates and spatial density throughout a network of secure devices to one or more transports, traffic lights, tVectorHubs and other similar compliant certified ecomm-devices and sensors to advise and transmit to a plurality of vehicles and transports, along with quantum vector nodes and virtual nodes displaying existing and future contemplated traffic signage in a virtual interface, further providing fuel, eating and other services travelers need, requested autonomously from vehicles' Obvipros, pAvics and system informational recommendations based on onboard vehicle processors' current integrated capabilities with Proximity Integration, formulating XY (Long/Lat) coordinates laced together with spatial third dimension Z-Topography and Climatic Expectations commixed with human factors, vehicle capabilities to navigate, self drive-ability guided by encrypted navigational directives, vehicles' reactive capacity to self navigate with instructions from the system, along with transmissions from other certified compliant vector encrypted ecomm-devices based on optimal traffic flow calculations determined by speed variations in relationship to and from traffic density, human, vehicle capability and capacity factors, continuously creating and updating calculated overlay models transmitting adjustments in vehicular velocity rate flow in proportion with density between at least two transports, a clustered mass of transport vehicles or a single vehicle within the mass; \nwherein
communication links are pre-configured with one or more ecomm-devices when deployed or determined by forecasted forward velocity based on recorded and posted traffic speed, and for security reasons moving data transmission in advance of transports' calculated future positional points; each stationary and mobile ecomm-device only communicates with certified compliant devices link sync'd, registered, paired and authenticated with system, ecomm-devices and other certified compliant devices and sensors, creating a primary line of defense for structured network protocol security established by a configured baseline for channeled telematics; \nwherein comprising system transmits navigational directives to one or more certified compliant vector ecomm-devices, comprised of optimal traffic flow calculations determined by speed variations in relationship to traffic density to minimize the need to break inertia, transmitting encrypted timing adjustments over a network to one or more vehicles via encapsulated digital encrypted voice and displayed in a virtual interface for vehicles and navigational command centers and facilities, and further providing encrypted virtual commands between one or more traffic lights, tVectorHubs, cameras and other compliant certified communication devices for an intersection within a domain networked infrastructure and roadway areas equipped with VectorHub Class ecomm-devices, sensors, and other types of sensor devices; \nwherein tVectorHubs, vector-hubs, sub hub-class sensors, VectorHub Class ecomm-devices one or more are combined with BeaconHubs, SentryBeaconHubs, SentryHubs, SentinelHubs, Sentry Nodes, AlphaHubs, AlphaVectorHubs; b, c, d and xVectorHubs, and each sub hub sensor device configuration performs at least one explicit task, a duty of which is specifically allocated and configured for any given sector within networks infrastructure area; and \nfurther comprised of each device sensor and other certified compliant device sensors,
ecomm-devices are equipped with a configuration of at least one iChipset, the construct comprising at least one non-transitory machine computer-readable medium constructed with a plurality of storage mediums, at least one interface, and further comprising a non-transitory computer device configured for receiving, to transmit, and respond to transponder calls, to mark data upon entering and exiting a plurality of encrypted communication devices, sensors, electrical, mechanical or electrical-mechanical devices; and \nfurther configured to determine vehicle locational positions and to detect, to warn, to advise, suggest, alert, to respond to requests from humans, a transport, or a mechanism configured within and attached to the vehicle transport, comprised of a plurality of certified registered link sync'd ecomm-devices, authenticated and paired within system's infrastructure communication devices and servers, configured with at least one Paired-Key and at least one response match set for each Paired-Key set assigned within the OS's NOS within the certified compliant ecomm-device, a device, or sensors carried by humans, attached to temporarily or permanently, comprised of a smart device installed with pAvics application or a hardware software device Obvipro and other certified ecomm-devices, and attached temporarily or permanently to a plurality of vehicles and transports; \nwherein system comprises encrypted transmitted directives to turn head lights on for autonomous and semi-autonomous vehicles, and transports are audibly or visually instructed with recommended rest time intervals for personal and commercial vehicle drivers, for vehicles equipped with or without technologies monitoring drowsiness, with audible voice statements to assist with aiding driver in keeping awake or when system detects driver attentiveness is declining; and \nfurther transmit visual and audible notifications to vehicle with destination arrival times, alternate routes, maintenance items, declining fuel
notifications recommending fuel node stops in a virtual interface, transport mechanism inspections, insurance, valid tags and vehicle ownership records and verification from at least one database and other associated informatics; said data is stored in at least one iChipset within each Obvipro and other certified compliant devices and securely stored within non-transitory machine computer-readable medium with a plurality of storage mediums and archived in at least one system server's database, and downloadable software applications for humans, bicycles, motorcycles and any other type of certified compliant device, and further comprising a Nuclex operating system providing designated lane isolation or acquisition; and \nfurther transmitting digital directives maintaining distance factors between vehicles based on density flow rates in proportion with weight loads, historical records of vehicles' capabilities, capacity to stop, self navigate and navigate by human factors associated with each driver and their registered recorded vehicle or transport, directives formulated from computed variables associated with each particular vehicle's cataloged informatics, navigational suggestions as requested by driver, owner or system derived directives for autonomous vehicles, each audibly heard or virtually viewable or both audibly heard and viewable at the same time; \nwherein system comprises analyzed dynamic analytical rate flow (DARF) in comparison to and in conjunction with calculated dynamic analytical lane allocations available, to assist in maintaining vehicular spatial distance and dedicated transport positioning, along with dynamic directional flow constraints calculated inputs from network traffic congestion artifacts allowing vehicular traffic to move in variable velocity momentum, managed by a controlled network protocol protecting data's composition integrity flow; \nwherein system further comprises replacing existing traffic control lights and their associated
control mechanisms and stop signs with at least one tVectorHub, VectorHubs combined with Sentry or Sentinel hub sensors or combinations thereof, transmitting encrypted timing adjustments to adjust speed, flow rates and density for at least one vehicle transport within one or more domains link sync'd in communication with a plurality of vehicles, transports, mechanisms and other certified compliant devices maneuvering or traversing on roadways, including on toll roads; and \nfurther comprising each network device sensor is connected to a plurality of ecomm-devices having a direct data transmission com-link from and to a plurality of system servers, system and other mobile and stationary ecomm-device sensors and other certified compliant device sensors; communications comprise encapsulated digital encrypted data, including voice or virtual digital commands or both to one or more traffic lights or tVectorHubs or both at the same time for any given intersection, and other certified ecomm-devices, each system device is characterized including system construct transmitting a plurality of ecomm-advice; and \ncomprises at least one Nuclex operating system, comprised of at least one non-transitory machine computer-readable medium configured with a plurality of storage mediums, and further comprises an OS computer device having at least one non-transitory machine computer-readable medium configured with a plurality of storage mediums to perform at least one: process, task, detection of approaching vehicles and transports and system communications anomalies, intrusions, hacks or other maliciousness that disrupts system, records and transmits to system ecomm-devices' ability to function properly, detecting non-engagement and non-registered vehicles and transports, improper code injections, to compute at least one algorithm, to perform mathematical equations, to transmit, receive from and to system and from and to the ecomm-device and other certified compliant device sensors
generate digital directives, to advise; to initiate 911 service activated by a human from the smart certified compliant ecomm-device, for cyclers, runners and motorcycles, or an ecomm-device activated by vehicle transport, each providing locational services and the location at time of 911 activation, to respond to a transponder signal of a plurality of frequencies and to initiate a transponder signal and ecomm-advice directives transmitted to system; and \nfurther each system, ecomm-devices and other certified compliant device sensors and modules comprise at least one or more programmed events, processes, tasks, procedures, a plurality of decisions, detecting, performing calculable equations, acknowledging a plurality of transmissions, to respond to a transponder or to activate a transponder signal, to initiate or activate 911 calls, and one or more parts thereof are hard coded into at least one iChipSet; \nwherein each OS comprises one or more iChipSets constructed and independent of each other in functionality, purpose, programmed procedures, time stamping data entering and exiting each iChipset and a plurality of devices or sensors, to perform one or more processes, calculations, events, to transmit and receive, to perform one or more tasks, to make or activate a decision, and one or more parts are combined with each other within the certified compliant ecomm-devices; \nwherein system and vehicles' ecomm-devices comprise a configured generated GPSGIS Virtual Telematic Architecture in a 2D, 3D or a 4D toggled interface viewed by touch, voice or both on iNavX2's virtual interface, displaying all existing and contemplated traffic signage, including representations for speed, spatial locations of surrounding vehicles with audible and visual navigational maneuvers including a plurality of emergency notifications and dimensional mapping locational service as requested by driver or a human, from system derived protocols, from the vehicle along with road change display and
visually indicating and recording ecomm-device vehicle is receiving transmissions from or to other vehicles (V2V) and system, infrastructure to vehicle (I2V) or V2I including a human to vehicle (H2V) and V2H. \n | 2. The secure navigational system of claim 1, wherein system comprises one or more of a plurality of remote stationary and mobile communication hubs, devices, nodes or virtual nodes and a plurality of encrypted communication device sensors, each comprised of at least one computer-executable instruction, further comprised of one or more programs hardcoded within a non-transitory machine computer-readable medium with a plurality of storage mediums configured within a plurality of ecomm-devices, each comprised of at least one RFIDGPS transponder/receiver/transmitter iChipset, better known as Avics iChipset; structured as a hardware, a downloadable hardware software application, a software application for smart devices or combination thereof to communicate with system, a plurality of sync'd devices and system servers, for the purpose of transmitting navigational directives and locational i...", "output": "The method involves transmitting/receiving encrypted digital communication-advice directives to/from traffic vectorhub class communication devices. Traffic density is continuously calculated in proportion to rate flow for vehicles (102) traveling along roadways based on dynamic data. Timing adjustments are transmitted over a network (110) to a traffic light, quantum vector nodes and virtual nodes formulating XY coordinates in an overlay model from certified vector communication-devices based on optimal traffic flow calculations determined by speed variations in relationship to the density. The dynamic data is traffic dynamic data. INDEPENDENT CLAIMS are also included for the following: a computer-apparatus structure for managing monitored vehicular aggregate traffic density; a computer-implemented procedure for managing monitored vehicular aggregate traffic density. 
Computer-implemented method for managing monitored vehicular aggregate traffic density. The method enables selectively powering down street lights when traffic is low, thus reducing fossil fuel supply consumption rate by implementing Nxgen traffic system and allowing the vehicles to move as fast as possible without unnecessary idling, while optimizing the energy exhausted in breaking inertia and speed bursts. The method enables integrating object functionality points or proximity integration to determine exact phase-change spatial relationship with each vehicle and allow prompt reactive response interval feeds into onboard vehicle processor, thus allowing each vehicle with certified communication-device to be provided with an ability to encapsulate logistical response times on preventative measures regarding accidental collisions and saving municipalities, states or countries significantly and reducing traffic expenditures, while making a safer traffic landscape, minimizing associated fatalities when accidental contact occurs with traffic signage and decreasing fuel consumption in manufacturing these items including material and labor costs for all signage. The drawing shows a block diagram of a system for managing and monitoring traffic density flow. '(Drawing includes non-English language text)' 102Vehicles104Receiving stations110Network112Computers114Databases", "instruction": "Please summarize the input" }, { "input": "Switching wireless network sites based on vehicle velocityThe disclosed technology proposes a new methodology to include the effect of speed and direction of a UE into the threshold used for determining when to switch between a 4G UL connection and a 5G UL connection. The system can use a lookup table with various speeds mapping to varying thresholds. 
The system can use an accelerometer sensor or digital compass to determine the direction of the vehicle, such as heading away from or toward the 5G site, so the vehicle can switch sooner from 5G-NR to LTE and from LTE to NR, respectively. For C-V2X applications, latency is an important factor because 5G technology provides shorter latency than 4G; thus keeping the link on 5G is preferred when under good coverage. Further, the idea is not limited to UL, 5G and/or vehicle technologies, but can also be applied to DL direction, Wi-Fi and/or drone technologies as well.The invention claimed is: \n | 1. A method comprising: \nobtaining, at a vehicle, signal quality data, from a fifth-generation wireless technology (5G) site and a fourth-generation wireless technology (4G) site of a cellular network, wherein the 5G and 4G sites provide at least a portion of a signal associated with the signal quality data; \nobtaining, at the vehicle, a velocity of the vehicle relative to at least the 5G site; \ncreating a dynamic signal quality threshold by increasing the dynamic signal quality threshold with an increasing velocity of the vehicle, increasing the dynamic signal quality threshold when the vehicle is moving away from the 5G site, and decreasing the dynamic signal quality threshold when the vehicle is moving toward the 5G site; and \nswitching a cellular network connection of the vehicle between the 5G site and the 4G site based on the dynamic signal quality threshold. \n | 2. 
The method of claim 1, comprising: \nobtaining map data for the cellular network, wherein the map data indicates variations in the signal quality data based on geographic location and a future path of the vehicle; \ndetermining a location where the signal quality data associated with the 5G site causes an interruption in a connection between the vehicle and the 5G site based on the map data and the future path of the vehicle; and \nswitching the connection from the 5G site to the 4G site before the vehicle reaches the determined location. \n | 3. The method of claim 1, wherein creating the dynamic signal quality threshold further comprises increasing the dynamic signal quality threshold when the vehicle is in communication with the 4G site and decreasing the dynamic signal quality threshold when the vehicle is in communication with the 5G site.\n | 4. The method of claim 1, wherein creating the dynamic signal quality threshold further comprises obtaining a table correlating a speed of the vehicle and a direction of motion of the vehicle with one of multiple signal quality thresholds.\n | 5. The method of claim 4, comprising determining a function correlating the speed of the vehicle and the one of multiple signal quality thresholds based on the table.\n | 6. The method of claim 1, wherein the signal quality data comprises a signal to interference plus noise ratio (SINR), a reference signal received power (RSRP), a bit error rate, or a packet error rate.\n | 7. The method of claim 1, wherein switching the cellular network connection of the vehicle comprises: \nswitching the cellular network connection of the vehicle between the 5G site and the 4G site when the signal quality data is below the dynamic signal quality threshold. \n | 8. At least one non-transient computer-readable medium, carrying instructions that, when executed by at least one data processor, performs a method comprising: 
obtaining, at a vehicle, signal quality data, from a fifth-generation wireless technology (5G) site and a fourth-generation wireless technology (4G) site of a cellular network, wherein the 5G and 4G sites provide at least a portion of a signal associated with the signal quality data; \nobtaining, at the vehicle, a velocity of the vehicle relative to at least the 5G site; \nproviding a dynamic signal quality threshold by increasing the dynamic signal quality threshold with an increasing velocity of the vehicle, increasing the dynamic signal quality threshold when the vehicle is moving away from the 5G site, and decreasing the dynamic signal quality threshold when the vehicle is moving toward the 5G site; and \nswitching a cellular network connection of the vehicle between the 5G site and the 4G site based on the dynamic signal quality threshold. \n | 9. The non-transient computer-readable medium of claim 8, comprising: \nobtaining map data for the cellular network, wherein the map data indicates variations in the signal quality data based on geographic location and a future path of the vehicle; \ndetermining a location where the signal quality data associated with the 5G site causes an interruption in a connection between the vehicle and the 5G site based on the map data and the future path of the vehicle; and \nswitching the connection from the 5G site to the 4G site before the vehicle reaches the determined location. \n | 10. 
A system comprising: \none or more processors; \nmemory coupled to the one or more processors, wherein the memory includes instructions executable by the one or more processors to: \nobtain, at a vehicle, signal quality data, from a wireless network, wherein a first wireless network site and a second wireless network site provide at least a portion of a signal associated with the signal quality data; \nobtain, at the vehicle, a velocity of the vehicle relative to at least the first wireless network site; \ncreate a dynamic signal quality threshold by increasing the dynamic signal quality threshold when the vehicle is moving away from the first wireless network site, and decreasing the dynamic signal quality threshold when the vehicle is moving toward the first wireless network site; and \nswitch a wireless network connection between the vehicle, the first wireless network site or the second wireless network site based on the dynamic signal quality threshold. \n | 11. The system of claim 10, the vehicle comprising a land vehicle, an aerial vehicle or a water vehicle.\n | 12. The system of claim 10, the vehicle comprising an unmanned vehicle.\n | 13. The system of claim 10, the wireless network comprising a Wi-Fi network, the first wireless network site and the second wireless network site comprising a Wi-Fi access point.\n | 14. The system of claim 10, the wireless network comprising a cellular network, the first wireless network site comprising a fifth-generation wireless technology (5G) site and the second wireless network site comprising a fourth-generation wireless technology (4G) site.\n | 15. 
The system of claim 10, the instructions further comprising instructions to: \nobtain map data for the wireless network, wherein the map data indicates variations in the signal quality data based on geographic location and a future path of the vehicle; \ndetermine a location where the signal quality data associated with the first wireless network site causes an interruption in a connection between the vehicle and the first wireless network site based on the map data and the future path of the vehicle; and \nswitch the connection from the first wireless network site to the second wireless network site before the vehicle reaches the determined location. \n | 16. The system of claim 10, wherein the instructions to create the dynamic signal quality threshold further comprise instructions to increase the dynamic signal quality threshold when the vehicle is in communication with the second wireless network site and decrease the dynamic signal quality threshold when the vehicle is in communication with the first wireless network site.\n | 17. The system of claim 10, wherein the instructions to create the dynamic signal quality threshold further comprise instructions to operate an autonomous vehicle configured to wirelessly communicate with the first and second wireless network site and to measure a signal strength associated with the first and second wireless network site.\n | 18. The system of claim 10, wherein the instructions further comprise instructions to: \nobtain, at the vehicle, the signal quality data from the wireless network more frequently when a speed of the vehicle is above a speed threshold; \ncompare the obtained signal quality data to the dynamic signal quality threshold more frequently when the speed of the vehicle is above the speed threshold; and \nswitch the wireless network connection of the vehicle between the first wireless network site and the second wireless network site based on the comparison. \n | 19. 
The system of claim 10, wherein the signal quality data comprises a signal to interference plus noise ratio (SINR), a reference signal received power (RSRP), a bit error rate, or a packet error rate.\n | 20. The system of claim 10, wherein the instructions to switch the wireless network connection of the vehicle further comprise instructions to: \nswitch the wireless network connection of the vehicle between the first wireless network site and the second wireless network site when the signal quality data matches the dynamic signal quality threshold.", "output": "The switching method involves obtaining (600) signal quality data at a vehicle from a fifth-generation wireless technology (5G) site and a fourth-generation wireless technology (4G) site of a cellular network, where the 5G and 4G sites provide a portion of a signal associated with the signal quality data. A velocity of the vehicle relative to the 5G site is obtained (610), and a dynamic signal quality threshold is created (620) by increasing the dynamic signal quality threshold with an increasing velocity of the vehicle. The dynamic signal quality threshold is increased when the vehicle is moving away from the 5G site, and the dynamic signal quality threshold is decreased when the vehicle is moving toward the 5G site. A cellular network connection of the vehicle is switched (630) between the 5G site and the 4G site based on the dynamic signal quality threshold. INDEPENDENT CLAIMS are included for: (a) a non-transient computer-readable medium for switching wireless network sites based on vehicle velocity; (b) a system for switching wireless network sites based on vehicle velocity. Method for switching wireless network sites based on vehicle velocity. The processor switches the cellular connection of the vehicle between the 5G site and the 4G site when the cellular network signal quality data matches or is less than the dynamic signal quality threshold. 
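The claimed threshold adaptation can be sketched as follows. The claims state only the direction of each adjustment (raise with increasing speed, raise when moving away from the 5G site, lower when moving toward it); the gain `K_SPEED` and offset `K_DIRECTION` below are illustrative assumptions, not values from the patent.

```python
def dynamic_threshold_db(base_db, speed_mps, moving_away):
    """Sketch of the claimed dynamic signal quality threshold.

    K_SPEED and K_DIRECTION are assumed constants; the claims only
    fix the direction of each adjustment, not its magnitude.
    """
    K_SPEED = 0.1      # dB added per m/s of vehicle speed (assumed)
    K_DIRECTION = 2.0  # dB offset for direction of travel (assumed)
    threshold = base_db + K_SPEED * speed_mps
    threshold += K_DIRECTION if moving_away else -K_DIRECTION
    return threshold


def switch_to_4g(measured_sinr_db, threshold_db):
    """Switch when the measured quality matches or falls below the
    dynamic threshold, as the summary above describes."""
    return measured_sinr_db <= threshold_db
```

For example, at 20 m/s moving away from the 5G site with a 10 dB base threshold, the threshold becomes 14 dB, so a measured SINR of 12 dB would trigger a switch to the 4G site.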
The processor compares the obtained signal quality data to the dynamic signal quality threshold more frequently when the speed of the vehicle is above the speed threshold. The drawing shows a flowchart of a switching method. 600 Obtaining signal quality data at a vehicle from a fifth-generation wireless technology site and a fourth-generation wireless technology site of a cellular network; 610 Obtaining a velocity of the vehicle relative to the fifth-generation site; 620 Creating a dynamic signal quality threshold; 630 Switching a cellular network connection of the vehicle between the fifth-generation site and the fourth-generation site based on the dynamic signal quality threshold ", "instruction": "Please summarize the input" }, { "input": "Hybrid mesh of licensed and unlicensed wireless frequency bandsThis disclosure describes techniques for creating a hybrid mesh of unlicensed wireless frequency bands between two or more vehicles communicating using an unlicensed wireless frequency band, and a massive MIMO base station communicating with the two or more vehicles using a licensed wireless frequency band. The hybrid mesh can be used to upload and download data from a vehicle in motion. The hybrid mesh can be formed via V2V connections between the vehicle and nearby vehicles. In other words, if a vehicle moves into a region outside the operating boundary of a 5G-NR massive MIMO base-station node, the vehicle can interact with other vehicles to generate a data pipeline using the unlicensed wireless frequency band from the vehicle to the nearby vehicle, and using the licensed wireless frequency band from the nearby vehicle to the nearest massive MIMO base station.The invention claimed is: \n | 1. 
At least one non-transitory, computer-readable medium carrying instructions, which when executed by a data processor in a vehicle, perform a method to facilitate transmission or reception of a data file, the method comprising: \nreceiving, at the vehicle, a request to transmit the data file; \nanalyzing the data file to determine a bandwidth for transmission; \ndetermining that an available bandwidth associated with a licensed wireless frequency band is less than the determined bandwidth; and \ninitiating a hybrid mesh of an unlicensed wireless frequency band and the licensed wireless frequency band to transmit the data file, \nwherein the hybrid mesh corresponds to at least one Vehicle-to-Vehicle (V2V) communication connection between the vehicle and at least one other vehicle, and with a base station associated with the licensed wireless frequency band; and \ntransmitting the data file via the hybrid mesh. \n | 2. The non-transitory, computer-readable medium of claim 1, further comprising instructions to: \nobtain at least two of: a speed associated with the vehicle, a planned path associated with the vehicle, or a map of base stations capable of providing the determined bandwidth for transmission; \ndetermine a time required to transmit the data file; \ndetermine whether the vehicle has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the speed, the planned path, or the map of base stations; and \nupon determining that the vehicle does not have access to the base station during the time required to transmit the data file, determine that the available bandwidth associated with the licensed wireless frequency band is less than the determined bandwidth. \n | 3. 
The non-transitory, computer-readable medium of claim 1, wherein initiating the hybrid mesh further comprises instructions to: \nobtain at least two of: multiple speeds associated with multiple vehicles, multiple planned paths associated with the multiple vehicles, or a map of base stations capable of offering the determined bandwidth for transmission; \ndetermine a time required to transmit the data file; \ndetermine whether one or more vehicles among the multiple vehicles has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the multiple speeds, the multiple planned paths or the map of base stations; and \nupon determining the one or more vehicles among the multiple vehicles has access to the base station, initiate the hybrid mesh between the vehicle and the one or more vehicles. \n | 4. The non-transitory, computer-readable medium of claim 1, further comprising instructions to: \nobtain a planned path associated with the vehicle; \ndetermine that a first group of vehicles along the planned path does not provide the determined bandwidth for transmission; \ndetermine an alternate path associated with the vehicle that provides the determined bandwidth for transmission; and \nsuggest the alternate path to the vehicle, and \nwherein determining the alternate path further comprises instructions to: \nobtain a map of base stations along the alternate path, an available bandwidth associated with a base station along the alternate path, a second group of vehicles along the alternate path, and a bandwidth required to transmit the data file; \ndetermine at least a portion of the second group of vehicles along the alternate path enabling transmission of the data file based on the available bandwidth, the bandwidth required and a location of the base station; and \nsuggest the alternate path to the vehicle. \n | 5. 
The non-transitory, computer-readable medium of claim 1, further comprising instructions to: \nobtain a planned path associated with the vehicle; \ndetermine that a first group of vehicles along the planned path does not provide the determined bandwidth for transmission; \ndetermine an alternate path associated with the vehicle that provides the determined bandwidth for transmission; and \nsuggest the alternate path to the vehicle. \n | 6. The non-transitory, computer-readable medium of claim 1, further comprising instructions to: \nobtain at least two of: a map of base stations, an available bandwidth associated with a base station in the map of base stations, or a bandwidth required to transmit the data file; \ndetermine a speed of the vehicle for enabling transmission of the data file based on at least two of: the available bandwidth associated with the base station in the map of base stations, the bandwidth required, or a location of the base station; and \nsuggest the speed to the vehicle. \n | 7. The non-transitory, computer-readable medium of claim 1, further comprising instructions to: \nobtain at least two of: a map of base stations proximate to the vehicle, a planned path associated with the vehicle, or multiple planned or expected paths associated with multiple nearby vehicles; \ndetermine that the hybrid mesh can be formed at a later point in time based on at least two of: the map of base stations, the planned path associated with the vehicle and the multiple planned or expected paths associated with the multiple nearby vehicles; and \ndelay transmission of the data file until the later point in time. \n | 8. 
The non-transitory, computer-readable medium of claim 1, the vehicle comprising a low band transceiver to communicate with the at least one other vehicle, and a high band transceiver to communicate with the base station, wherein the vehicle is an autonomous vehicle, and wherein the instructions further comprise instructions to navigate the autonomous vehicle along an alternate path, or at a new speed from a current speed, to provide the determined bandwidth for transmission.\n | 9. The non-transitory, computer-readable medium of claim 1, the unlicensed wireless frequency band comprising an IEEE 802.11s or 802.11ay standard.\n | 10. At least one non-transient, computer-readable medium, carrying instructions that, when executed by at least one data processor, perform a method to facilitate transmission or reception of a data file for a vehicle, the method comprising: \ncreating a hybrid communication mesh to transmit from the vehicle, or receive at the vehicle, the data file, \nwherein the hybrid communication mesh includes communicating with at least one other vehicle using an unlicensed wireless frequency band, and communicating with a massive MIMO base station communicating using a licensed wireless frequency band, \nwherein creating the hybrid communication mesh includes determining that the vehicle is not within range of the massive MIMO base station and then creating the hybrid communication mesh of unlicensed wireless frequency bands between the vehicle and a nearest massive MIMO base station using a vehicle-to-vehicle (V2V) connection with the at least one other vehicle, and \nwherein creating the hybrid communication mesh includes determining whether a communication parameter for the hybrid communication mesh satisfies a threshold; and, \ntransmitting the data file via the created hybrid communication mesh. \n | 11. 
The non-transient, computer-readable medium of claim 10, wherein the communication parameter is an available bandwidth associated with the licensed wireless frequency band, and wherein the method further comprises: \nobtaining at least two of: a speed associated with the vehicle, a planned path associated with the vehicle, or a map of base stations capable of providing a determined bandwidth for transmission; \ndetermining a time required to transmit the data file; \ndetermining whether the vehicle has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the speed, the planned path, or the map of base stations; and \nupon determining that the vehicle does not have access to the base station during the time required to transmit the data file, determining that the available bandwidth associated with the licensed wireless frequency band is less than the determined bandwidth. \n | 12. The non-transient, computer-readable medium of claim 10, wherein initiating the hybrid communication mesh further comprises: \nobtaining at least two of: multiple speeds associated with multiple vehicles, multiple planned paths associated with the multiple vehicles or a map of base stations capable of offering a determined bandwidth for transmission; \ndetermining a time required to transmit the data file; \ndetermining whether one or more vehicles among the multiple vehicles has access to a base station in the map of base stations during the time required to transmit the data file based on at least two of: the multiple speeds, the multiple planned paths or the map of base stations; and \nupon determining the one or more vehicles among the multiple vehicles has access to the base station, initiating the hybrid communication mesh between the vehicle and the one or more vehicles. \n | 13. 
At least one non-transient, computer-readable medium, carrying instructions that, when executed by at least one data processor, perform a method to facilitate transmission or reception of a data file for a vehicle, the method comprising: \nreceiving a request to transmit the data file to or from the vehicle; \nobtaining at least two of: a speed associated with the vehicle, a planned path associated with the vehicle, or a map of base stations capable of providing a selected bandwidth for transmission; and, \ndetermining that a hybrid mesh can be created at a later point-in-time, and \ndeferring the transmission of the data file until the later point-in-time; or, \ndetermining a different route as being more amenable to generating the hybrid mesh versus a current route, and \nrecommending the vehicle detour along the different route. \n | 14. The non-transient, computer-readable medium of claim 13, wherein determining the different route as being more amenable to generating the hybrid mesh includes determining that more vehicles are on the different route that are capable of participating in V2V communications than on the current route.\n | 15. The non-transient, computer-readable medium of claim 13, further comprising: determining a time required to transmit the data file, and determining whether the vehicle has access to a base station in the map of base stations during the time required to transmit the data file based on the speed, the planned path, and the map of base stations.\n | 16. The non-transient, computer-readable medium of claim 13, further comprising determining that an available bandwidth associated with a licensed wireless frequency band is less than the selected bandwidth.\n | 17. 
The non-transient, computer-readable medium of claim 13, further comprising: \nobtaining the planned path associated with the vehicle; \ndetermining that a first group of vehicles along the planned path does not provide the selected bandwidth for transmission; and \ndetermining the different route associated with the vehicle that provides the selected bandwidth for transmission. \n | 18. The non-transient, computer-readable medium of claim 13, further comprising: \nobtaining a map of base stations along the different route, an available bandwidth associated with a base station along the different route, a second group of vehicles along the different route, and a selected bandwidth to transmit the data file; \ndetermining at least a portion of the second group of vehicles along the different route enabling transmission of the data file based on the available bandwidth associated with the base station along the different route, the selected bandwidth required and a location of the base station. \n | 19. The non-transient, computer-readable medium of claim 13, further comprising: \nobtaining a map of base stations proximate to the vehicle, the planned path associated with the vehicle and multiple planned paths associated with multiple nearby vehicles; \ndetermining that the hybrid mesh can be formed at a later point in time based on the map of base stations, the planned path associated with the vehicle and the multiple planned paths associated with the multiple nearby vehicles; and \ndelaying transmission of the data file until the later point in time. \n | 20. The non-transient, computer-readable medium of claim 13, further comprising: \nmeasuring signal strength to the base station using a high band transceiver associated with the vehicle by periodically communicating with the base station; and \ncommunicating the measurement to a nearby vehicle.", "output": "The system has a memory which is coupled to a processor. 
The processor receives a request to transmit a data package at vehicles (110,120,130,140) and analyzes the data package to determine a required bandwidth for transmission. The processor determines that an available bandwidth associated with a licensed wireless frequency band is less than the required bandwidth. The licensed wireless frequency band comprises a millimeter wavelength band. The processor initiates hybrid meshes (135,145) of an unlicensed wireless frequency band and the licensed wireless frequency band to transmit the data package. The hybrid mesh corresponds to multiple vehicle-to-vehicle (V2V) communication connections (115) between the vehicle and other vehicles, and with a base station (100) associated with the licensed wireless frequency band; the data package is transmitted through the hybrid mesh. INDEPENDENT CLAIMS are included for the following: a non-transient computer-readable medium storing program for creating hybrid mesh of unlicensed wireless frequency bands; and a method for creating hybrid mesh of unlicensed wireless frequency bands. System for creating hybrid mesh of unlicensed wireless frequency bands between vehicles e.g. car communicating using unlicensed wireless frequency band. The method enables improving subscriber quality of experience (QoE) and obtaining high-bandwidth communication between a device and a base station for subscriber devices in motion, where movement would otherwise impede the effectiveness of a 5G-NR transmission. The method allows the vehicle to interact with other vehicles to generate a data pipeline using the unlicensed wireless frequency band from a vehicle to a nearby vehicle, and using the licensed wireless band from the nearby vehicle to the nearest massive MIMO base station if the vehicle moves into a region outside the operating boundary of the 5G NR massive-multiple-input multiple-output (MIMO) base-station node. The drawing shows a schematic view of a hybrid mesh corresponding to communication between vehicles and a base station. 
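A minimal sketch of the claimed fallback decision, assuming a simple required-bandwidth rule (file size over the transfer window) that the claims leave unspecified; the function name and the "defer" branch thresholds are hypothetical.

```python
def plan_link(file_size_mb, deadline_s, licensed_mbps, mesh_mbps):
    """Sketch of the claimed fallback from the licensed band to the
    hybrid V2V mesh. The claims require determining a bandwidth for
    transmission and initiating the mesh when the licensed band alone
    is insufficient; the exact rule here is an assumption.
    """
    required_mbps = file_size_mb * 8 / deadline_s  # megabits per second
    if licensed_mbps >= required_mbps:
        return "licensed"         # the licensed mMIMO link suffices
    if licensed_mbps + mesh_mbps >= required_mbps:
        return "hybrid-mesh"      # add V2V hops to build a data pipeline
    return "defer"                # wait for a later point in time
```

For instance, a 1000 MB file due within 10 s needs 800 Mbps; with only 100 Mbps of licensed capacity, an 800 Mbps V2V mesh makes the hybrid path viable, matching the defer-or-mesh choice in claims 7 and 13.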
100 Base station; 110, 120, 130, 140 Vehicles; 115 Connection; 135, 145 Hybrid meshes; 160 Building", "instruction": "Please summarize the input" }, { "input": "UAV supported vehicle-to-vehicle communicationThe use of unmanned aerial vehicle (UAV) communication cells in conjunction with MEC nodes may provide low-latency processing of vehicle movement data to generate vehicle guidance instructions for vehicles. Vehicle movement data of vehicles are received at a base station of a wireless carrier network from a UAV communication cell that is attached to the base station. The base station sends the vehicle movement data to a mobile edge computing (MEC) node that directly communicates with the base station so that the MEC node generates vehicle guidance instructions. The vehicle guidance instructions are then received by the base station from the MEC node. In turn, the base station sends the vehicle guidance instructions to the UAV communication cell for broadcasting to vehicles.What is claimed is: \n | 1. 
A computer-implemented method, comprising: \nreceiving, at a mobile edge computing (MEC) node that is paired with a corresponding base station of a wireless carrier network, a report of a malfunction of a transceiver of the corresponding base station or a communication traffic overload of the corresponding base station from the corresponding base station, wherein the MEC node is deployed in addition to a server of a centralized data center; \ndeploying, by the MEC node that is paired with the corresponding base station and deployed in addition to the server of the centralized data center, an unmanned aerial vehicle (UAV) communication cell to communicate with a plurality of vehicles in response to the malfunction of a transceiver of the corresponding base station or the communication traffic overload of the corresponding base station; \nreceiving vehicle movement data of one or more vehicles at the MEC node that is paired with the corresponding base station from the UAV communication cell that is deployed by the MEC node to communicate with the one or more vehicles; \nprocessing the vehicle movement data at the MEC node that is paired with the corresponding base station to generate vehicle guidance instructions for at least one vehicle; and \nsending the vehicle guidance instructions from the MEC node that is paired with the corresponding base station to the UAV communication cell that is deployed by the MEC node for broadcasting to at least one vehicle. \n | 2. The computer-implemented method of claim 1, wherein the MEC node is directed by a deployment controller of the wireless carrier network to deploy the UAV communication cell.\n | 3. The computer-implemented method of claim 1, wherein the UAV communication cell is attached to the MEC node via a wired connection that provides power to the UAV communication cell and a communication link between the UAV communication cell and the MEC node.\n | 4. 
The computer-implemented method of claim 1, wherein the vehicle movement data is received by the UAV communication cell from a plurality of vehicles or received by the UAV communication cell from an additional UAV communication cell via cellular vehicle-to-everything (CV2X) communication.\n | 5. The computer-implemented method of claim 4, wherein the vehicle movement data is obtained by the additional UAV communication cell from the one or more vehicles via cellular vehicle-to-everything (CV2X) communication.\n | 6. The computer-implemented method of claim 1, wherein the broadcasting is performed by an additional UAV communication cell that receives the vehicle guidance instructions from the UAV communication cell via cellular vehicle-to-everything (CV2X) communication.\n | 7. The computer-implemented method of claim 1, wherein the corresponding base station communicates with the MEC node via a dedicated communication link that is independent of a backhaul that connects the corresponding base station to a core network of the wireless carrier network.\n | 8. The computer-implemented method of claim 1, wherein the vehicle movement data for a vehicle includes a current vehicle location of the vehicle, a direction of travel of the vehicle, a speed of travel for the vehicle, an acceleration rate of the vehicle, a deacceleration rate of the vehicle.\n | 9. The computer-implemented method of claim 1, wherein the vehicle movement data for a vehicle includes one or more of a current vehicle location of the vehicle, a direction of travel of the vehicle, a speed of travel for the vehicle, an acceleration rate of the vehicle, or a deacceleration rate of the vehicle.\n | 10. The computer-implemented method of claim 1, wherein the vehicle guidance instructions are used by a vehicle to perform autonomous driving of the vehicle, perform an automatic driving maneuver, or provide a driving condition to a driver of the vehicle.\n | 11. 
A mobile edge computing (MEC) node, comprising: \none or more processors; and \nmemory having instructions stored therein, the instructions, when executed by the one or more processors, cause the one or more processors to perform acts comprising: \nreceiving, from a neighboring MEC node that is paired with a base station of a wireless carrier network, a command for the MEC node to deploy an unmanned aerial vehicle (UAV) communication cell to communicate with a plurality of vehicles, the MEC node being unpaired with any base station of the wireless carrier network; \ndeploying, by the MEC node, the unmanned aerial vehicle (UAV) communication cell to communicate with a plurality of vehicles; \nreceiving vehicle movement data of one or more vehicles at the MEC node from the UAV communication cell; \nprocessing the vehicle movement data at the MEC node to generate vehicle guidance instructions for at least one vehicle; and \nsending the vehicle guidance instructions to the UAV communication cell for broadcasting to the at least one vehicle. \n | 12. The MEC node of claim 11, wherein each of the one or more vehicles includes a vehicle control module that is authenticated by a core network of the wireless carrier network to communicate with the UAV communication cell, the vehicle control module to provide corresponding vehicle movement data to the UAV communication cell and receive vehicle guidance instructions from the UAV communication cell.\n | 13. The MEC node of claim 11, wherein the UAV communication cell is deployed by the MEC node based on a deployment schedule to receive the vehicle movement data of the one or more vehicles.\n | 14. The MEC node of claim 11, wherein a deployment command is received from the neighboring MEC node via the base station paired to the neighboring MEC node.\n | 15. 
The MEC node of claim 11, wherein the UAV communication cell is deployed in response to a number of user devices that are connected to the base station exceeding a predetermined number threshold, vehicle traffic data provided by a traffic monitoring service indicating that a number of vehicles in a geographical area serviced by the base station exceeding a predetermined number threshold, the base station reporting a service outage, or a predetermined number of user devices serviced by the base station reporting a quality of service (QoS) value has dropped below or exceeded a threshold.\n | 16. The MEC node of claim 11, wherein the vehicle movement data is obtained by an additional UAV communication cell from the one or more vehicles and relayed to the UAV communication cell, and wherein the sending includes sending the vehicle guidance instructions from the UAV communication cell to another communication cell that broadcast the vehicle guidance instructions to the at least one vehicle.\n | 17. The MEC node of claim 11, wherein the receiving includes receiving the vehicle movement data from the UAV communication cell when the UAV communication cell or a distributed computing network of multiple UAV communication cells that include the UAV communication cell is unable to process the vehicle movement data into the vehicle guidance instructions in a predetermined amount of time.\n | 18. 
One or more non-transitory computer-readable media of an unmanned aerial vehicle (UAV) communication cell storing computer-executable instructions that upon execution cause one or more processors to perform acts comprising: \nreceiving vehicle movement data of one or more ground vehicles at the UAV communication cell; \nin response to determining that the UAV communication cell is unable to process the vehicle movement data into vehicle guidance instructions for at least one ground vehicle in a predetermined amount of time, transmitting at least a portion of the vehicle movement data to an additional UAV communication cell; and \nin response to determining that the UAV communication cell is able to process the vehicle movement data into the vehicle guidance instructions in the predetermined amount of time, processing the vehicle movement data at the UAV communication cell into the vehicle guidance instructions for distribution to the at least one ground vehicle, where the vehicle guidance instructions include one or more automatic lane change directives for the at least one ground vehicle, one or more braking directives for the at least one ground vehicle, or one or more vehicle turning commands for the at least one ground vehicle. \n | 19. The one or more non-transitory computer-readable media of claim 18, wherein the UAV communication cell and the additional UAV communication cell are part of a distributed computing network, and wherein the transmitting includes transmitting the vehicle movement data to a plurality of UAV communication cells of the distributed computing network for processing into the vehicle guidance instructions.\n | 20. 
The one or more non-transitory computer-readable media of claim 18, wherein the additional UAV communication cell is connected to a mobile edge computing (MEC) node that processes the at least a portion of the vehicle movement data received by the additional UAV communication cell into the vehicle guidance instructions.", "output": "The method (400) involves receiving (402) vehicle movement data of several vehicles at a base station of a wireless carrier network from a UAV communication cell that is attached to the base station. The vehicle movement data from the base station is sent to a MEC node that directly communicates with the base station for the MEC node to generate vehicle guidance instructions for a vehicle. The vehicle guidance instructions are received (406) from the MEC node that is coupled to the base station. The vehicle guidance instructions are sent (408) to the UAV communication cell for broadcasting to a vehicle. INDEPENDENT CLAIMS are included for the following: a mobile edge computing (MEC) node; and a non-transitory computer-readable medium storing program for using UAV communication cell to support vehicle movement data processing by MEC node. Computer-based method for using unmanned aerial vehicle (UAV) communication cell to support vehicle movement data processing by mobile edge computing (MEC) node (claimed). The generation of the vehicle guidance instructions by a MEC node that is locally paired with a base station reduces communication latency. These capabilities enable the UAV communication cells to act as a mesh network or a distributed computing network to provide redundant communication and/or data processing capabilities. The drawing shows a flowchart illustrating the method for a base station to use a UAV communication cell to support vehicle movement data processing by a MEC node. 
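The offload rule of claim 18 (process the movement data locally only when the predetermined time allows, otherwise hand it to another UAV cell or, per the summary, to the MEC node) can be sketched as follows; the first-available peer selection is a hypothetical choice the claims do not specify.

```python
def route_vehicle_data(est_processing_s, deadline_s, peer_uavs):
    """Sketch of the claim-18 offload rule for a UAV communication cell.

    est_processing_s: estimated local processing time for the data.
    deadline_s: the predetermined amount of time from the claim.
    peer_uavs: other reachable UAV cells in the distributed network.
    """
    if est_processing_s <= deadline_s:
        return "local"  # process into guidance instructions on board
    # otherwise forward the data; peer choice here is an assumption
    return peer_uavs[0] if peer_uavs else "mec-node"
```

A cell that can finish in 50 ms against a 100 ms deadline processes locally; one that cannot forwards to the first peer UAV, falling back to the MEC node when no peer is reachable.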
400 Method for using a UAV communication cell to support vehicle movement data processing by a MEC node for a base station; 402 Step for receiving vehicle movement data of several vehicles at a base station of a wireless carrier network from a UAV communication cell that is attached to the base station; 404 Step for sending the vehicle movement data from the base station to a MEC node that directly communicates with the base station for the MEC node to generate vehicle guidance instructions for a vehicle; 406 Step for receiving the vehicle guidance instructions from the MEC node that is coupled to the base station; 408 Step for sending the vehicle guidance instructions to the UAV communication cell for broadcasting to a vehicle", "instruction": "Please summarize the input" }, { "input": "V2X SUPPORTED UNMANNED AIRCRAFTTechniques for performing traffic and road operations using communication between unmanned aerial vehicles (UAVs) and vehicles with vehicle-to-everything (V2X) support are described below. The UAVs can be used at different locations and can be functionally connected to a cell site, a Mobile Edge Computing (MEC) server and/or a V2X sensor. Upon detection of an incoming vehicle moving from one location toward one or more autonomous vehicles (AVs), the UAVs can communicate directly with the one or more AVs to warn the AVs of the incoming vehicle. If the UAVs cannot communicate with the AVs, the UAVs can forward messages to other UAVs that can communicate with the AVs. The UAVs can also transmit messages to a cell site and/or a V2X sensor to send messages to one or more AVs. Cell sites can also forward the messages to other cell sites, which send the messages to the AVs within their respective coverage areas.|1. 
One or more non-transitory computer-readable media of an unmanned aerial vehicle (UAV) that store computer-executable instructions that, when executed, cause one or more processors to perform actions that include:\nDetecting a presence of a first vehicle at a first position, the first vehicle moving toward a second position; \nGenerating a message for transmission to a second vehicle having a vehicle-to-everything (V2X) capability in an environment of the second location, the message indicating the presence of the first vehicle; and \nif the second vehicle is not in a communication area of the UAV, sending the message to a V2X sensor at the second position, the V2X sensor being configured to forward the message to the second vehicle via V2X communication. \n | 2. The one or more non-transitory computer-readable media of claim 1, wherein the message is transmitted over a cellular air interface.\n | 3. The one or more non-transitory computer-readable media of claim 1, wherein the first vehicle is a train and the second vehicle is an autonomous vehicle (AV).\n | 4. The one or more non-transitory computer-readable media of claim 1, wherein the UAV is operatively connected to a cell site and the actions further include:\nTransmitting the message to the cell site after determining that the second vehicle is not within a second communication area of the V2X sensor. \n | 5. The one or more non-transitory computer-readable media of claim 4, wherein the UAV is connected to the cell site via a wired transport.\n | 6. The one or more non-transitory computer-readable media of claim 1, the actions further comprising:\nSending the message to a second UAV at the second location after determining that the second vehicle is not within range of the communication. \n | 7. The one or more non-transitory computer-readable media of claim 1, wherein the second vehicle is further configured to transmit the message to a third vehicle via vehicle-to-vehicle (V2V) communication.\n | 8. 
A computer-implemented method comprising:\nReceiving a message from a UAV indicating the presence of a first vehicle at a first location, the first vehicle traveling to a second location; and \nTransmitting the message indicating the presence of the first vehicle via a cell station to a second vehicle near the second location, the second vehicle having a vehicle-to-everything (V2X) capability. \n | 9. The computer-implemented method of claim 8, further comprising:\nDetermining a period of time in which the first vehicle and the second vehicle arrive at an intersection at the second location based at least in part on a first speed of the first vehicle and a first distance between the first vehicle and the second location relative to a second speed of the second vehicle and a second distance between the second vehicle and the intersection; and \nGenerating a speed indication for the second vehicle. \n | 10. The computer-implemented method of claim 8, further comprising: \nTransmitting the message to a second cell station when it is determined that the second vehicle is not in a communication area of the cell station, the second cell station being configured to transmit the message to the second vehicle. \n | 11. The computer-implemented method of claim 8, further comprising:\nDetermining that the message is to be transmitted to a third vehicle that is not in a communication area of the cell station, the second vehicle being configured to send the message to the third vehicle via vehicle-to-vehicle (V2V) communication. \n | 12. The computer-implemented method of claim 8, further comprising:\nDetermining that the second vehicle is not in a communication range of the cell station; and \nTransmitting the message to a second UAV, the second UAV being configured to communicate with the second vehicle via a cellular air interface. \n | 13. 
The computer-implemented method of claim 8, further comprising:\nTransmitting the message to a vehicle-to-everything (V2X) sensor at the second location if the second vehicle is not in a communication area of the cell station. \n | 14. A system comprising:\none or more non-transitory storage media of an unmanned aerial vehicle (UAV) configured to provide stored code segments, the one or more non-transitory storage media coupled to one or more processors each configured to execute the code segments and cause the one or more processors to perform actions comprising: \nDetecting an existence of a first vehicle at a first location, the first vehicle traveling to a second location; \nGenerating a message for transmission to a second vehicle near the second location, the message indicating the presence of the first vehicle; and \nif the second vehicle is not within a communication area of the UAV, sending the message to at least one device connected to a server at the second location after determining that the second vehicle is in a second communication area of the at least one device. \n | 15. The system of claim 14, wherein the at least one device is a vehicle-to-everything (V2X) sensor, a cell site, or a second UAV.", "output": "Computer-readable media includes executable instructions to detect (302) presence of a first vehicle at a first location, the vehicle traveling towards a second location. A message is generated to transmit (304) to a second vehicle having a vehicle-to-everything (V2X) capability in the vicinity of the second location. The message indicates that the first vehicle is traveling towards the second location. When the second vehicle is not within a communication range of the UAV, the message is broadcast (306) to a V2X sensor at the second location. The V2X sensor is configured (310) to relay the message to the second vehicle using a V2X communication protocol. INDEPENDENT CLAIMS are also included for the following:a computer-implemented method; anda system. 
Computer-readable media in a system of unmanned aerial vehicles providing stable communications to monitor road conditions and detect oncoming traffic. The message includes information that would enable an AV to make driving decisions. The control station can instruct each Unmanned Aerial Vehicle (UAV) to deploy to a target location to ensure coverage while minimizing overlap. Although cellular-based V2X can provide a higher percentage of successful data packet delivery and a greater communication range than WLAN-based V2X, a vehicle must still be within a communication range of a target entity to enable successful passing of information or data packet delivery in V2X communication. The Mobile Edge Computing (MEC) server can provide computing resources, storage capacity, connectivity, and access to RAN information. The drawing shows a flow-chart of a method for utilizing UAVs and V2X sensors to conduct traffic and road operations. 302Detecting first location of a vehicle304Transmitting a message to second location306Broadcasting message to second location sensors310Communicating message to second vehicle", "instruction": "Please summarize the input" }, { "input": "NETWORK TESTING DETERMINATIONA fifth generation (5G) network can provide testing capabilities by employing a test server to generate requests for testing at remote locations of the 5G network. The test server can be used to initiate tests, determine test conditions for conducting the tests, direct test locations, and receive test data from test tools remote from the server. The test server can initiate, establish, maintain, format, or otherwise determine tests that are usable to improve operation of the 5G network.What is claimed is: \n | 1. 
A method comprising: \nconfiguring, by a server of a network, a message indicating a test location and a test condition associated with the network; \nsending, by the server, the message to a test module included in a vehicle, the test module configured to: \nreceive network data associated with one or more user equipment (UEs), a base station (BS), and a navigation component included in the vehicle; and \ndetermine a capability of the network at the test location in accordance with the test condition based at least in part on the network data; \nsending, by the server, the test location to the navigation component of the vehicle; \ncausing, by the server, the vehicle to navigate to the test location; \nreceiving, by the server and based at least in part on the vehicle navigating to the test location, test data from the test module indicating one or more capabilities of the network at the test location; \nassociating, by the server, metadata with the test data from the test module; \nstoring the test data and the metadata in a database; and \ndetermining, by the server and based at least in part on using the metadata to access the test data from the database, network parameters associated with peak throughput for the one or more UEs accessing the network. \n | 2. The method of claim 1, wherein the test data is first test data and the test module is a first test module, and further comprising: \nreceiving, by the server, second test data from a second test module indicating one or more capabilities of the network at a second test location; and \ndetermining, by the server and based at least in part on the first test data and the second test data, the network parameters, \nwherein the network parameters comprising one or more of: a location, a signal strength, a beam Precoding Matrix Indicator (PMI) number, a frequency, a power, or a signal-to-noise ratio to transmit data between the one or more UEs and the network. \n | 3. 
The method of claim 1, wherein the message indicates multiple test locations and multiple test conditions associated with testing at least one of: a Time Division Duplex (TDD) system, a Frequency Division Duplex (FDD) system, an Unmanned Aerial Vehicle (UAV) system, a Vehicle to Everything (V2X) system, or a multiple user multi-input multi-output (MU-MIMO) system.\n | 4. The method of claim 1, wherein the network is a 5G network and the message is associated with multiple user multi-input multi-output (MU-MIMO) testing in the 5G network.\n | 5. The method of claim 1, further comprising: \nidentifying interference between one or more Unmanned Aerial Vehicles (UAVs) and a base station; and \nbased at least in part on identifying the interference, at least one of: \nmitigating interference between the one or more UAVs and the base station; \nmitigating noise rise associated with the one or more UEs and the base station; or \ncausing the one or more UAVs or UEs to be inoperable. \n | 6. The method of claim 1, further comprising: \ndetermining, by the server, a first test and a second test different from the first test; \nranking, by the server, the first test relative to the second test based at least in part on a comparison between a first location associated with the first test and a second location associated with the second test; and \nsending an indication of the ranking to the navigation component to cause the vehicle to navigate to the first location and the second location in accordance with the ranking. \n | 7. 
The method of claim 1, wherein: \nthe test condition indicates to perform at least one of: an upload test, a download test, a combined upload and download test, a latency test, a jitter test, a packet rate test, a voice quality test, a video quality test, a backhaul test, or a handover test; and \nthe one or more capabilities of the network indicated by the test data comprises one or more of: downlink and uplink throughputs, latency, volume of data, time/location, amount of call attempts, accepts, failures, amount of handovers, mean-opinion-score (MOS), signal to interference plus noise ratio (SINR), modulation and coding scheme (MCS), signal strength, UE transmit power, cell site number, or a frequency. \n | 8. A system comprising: \none or more processors; and \nmemory storing computer-executable instructions that, when executed by the one or more processors, cause the system to perform operations comprising: \nconfiguring, by a server of a network, a message indicating a test location and a test condition associated with the network; \nsending, by the server, the message to a test module included in a vehicle, the test module configured to: \nreceive network data associated with one or more user equipment (UEs), a base station (BS), and a navigation component included in the vehicle; and \ndetermine a capability of the network at the test location in accordance with the test condition based at least in part on the network data; \nsending, by the server, the test location to the navigation component of the vehicle; \ncausing, by the server, the vehicle to navigate to the test location; \nreceiving, by the server and based at least in part on the vehicle navigating to the test location, test data from the test module indicating one or more capabilities of the network at the test location; \nassociating, by the server, metadata with the test data from the test module; \nstoring the test data and the metadata in a database; and \ndetermining, by the server and based 
at least in part on using the metadata to access the test data from the database, network parameters associated with peak throughput for the one or more UEs accessing the network. \n | 9. The system of claim 8, the operations further comprising causing the network to communicate with multiple user equipment (UEs) at a same time based at least in part on the test data.\n | 10. The system of claim 8, wherein: \nthe message is associated with testing separation between two base stations; and \ndetermining the network parameters further comprises determining a horizontal distance or a vertical distance between the two base stations and determining a frequency guard band between the two base stations. \n | 11. The system of claim 8, the operations further comprising configuring the message based at least in part on an occurrence of one or more events, the one or more events comprising a drop call rate, a change in noise, or a pre-determined throughput value.\n | 12. The system of claim 8, the operations further comprising: \nreceiving network data from a remote sensor; and \nconfiguring the message based at least in part on the network data from the remote sensor meeting or exceeding a testing threshold. \n | 13. The system of claim 8, wherein: \nthe navigation component included in the vehicle comprises a navigation device or a Vehicle to Everything (V2X) device; and \nwherein the metadata indicates an upload time or a test time of the test data. \n | 14. The system of claim 8, the operations further comprising: \ndetermining one or more of: a bandwidth, a Time Division Duplex (TDD) ratio configuration, a frequency channel, a transmission power, a beamforming Precoding Matrix Indicator (PMI), or a distance between two test modules to provide peak throughput for the one or more UEs accessing the network. \n | 15. 
The system of claim 8, the operations further comprising identifying, based at least in part on the test data, a Time Division Duplex (TDD) ratio configuration that enables a first operator and a second operator to communicate simultaneously in a TDD system.\n | 16. The system of claim 8, the operations further comprising: \nreceiving an indication to test the network; and \nsending, by the server, the message to the test module included in the vehicle based at least in part on the indication to test the network. \n | 17. A method comprising: \nreceiving, by a server of a network, test data from a test module included in a vehicle, the test data indicating one or more capabilities of the network at a test location and in accordance with a test condition, the test module configured to: \nreceive network data associated with one or more user equipment (UEs), a base station (BS), and a navigation component included in the vehicle; and \ndetermine the one or more capabilities of the network based at least in part on the network data; \ndetermining, by the server, metadata to associate with the test data, the metadata comprising the test location, the test condition, a test time, and first navigation data associated with the navigation component; \nreceiving, by the server, second navigation data from an additional navigation component included in an additional vehicle; \nidentifying, by the server, a sequence of remote tests to be performed at multiple test locations; \nsending, by a server, a first message to the navigation component included in the vehicle indicating a first portion of the sequence of remote tests to perform at a first test location of multiple test locations; and \nsending, by a server, a second message to the additional navigation component included in the additional vehicle indicating a second portion of the sequence of remote tests to perform at a second test location of multiple test locations. \n | 18. 
The method of claim 17, wherein the test data is first test data and further comprising: \ncausing the vehicle to navigate to the first test location; \ncausing the additional vehicle to navigate to the second test location; \nreceiving second test data from at least one of the vehicle or the additional vehicle; and \ndetermining, by the server and based at least in part on the first test data and the second test data, network parameters associated with peak throughput for the one or more UEs accessing the network. \n | 19. The method of claim 17, further comprising: \ndetermining, by the server and based at least in part on the test data, network parameters associated with peak throughput for the one or more UEs accessing the network; and \ncausing the network to communicate with multiple user equipment (UEs) at a same time based at least in part on the test data. \n | 20. The method of claim 17, wherein the vehicle comprises at least one of: an autonomous vehicle or an Unmanned Aerial Vehicle (UAV).", "output": "The method involves configuring a test message (108) indicating test location and test condition associated with a network (104) by a test server (106) of network. The message is sent to a test module included in a vehicle (110) by the server. The test module determines capability of network at the test location in accordance with test condition based at least in part on network data. The server sends test location to navigation component in vehicle, causes vehicle to navigate to test location, receives test data from test module indicating capabilities of network at test location based at least in part on vehicle navigating to test location, associates metadata with test data from test module, stores test data and metadata in a database, and determines network parameters associated with peak throughput for UEs (112) accessing the network based at least in part on using metadata to access test data from database. 
An INDEPENDENT CLAIM is included for a system for testing the integrity or reliability of a telecommunication network. Method for testing network using server to receive test data from remote test tools for testing integrity or reliability of telecommunication network such as fifth generation (5G) telecommunication networks. The method enables providing improved bandwidth and decreased response times to a multitude of devices that are connected to a network. The drawing shows a schematic diagram of a network environment in which devices can connect to a telecommunication network to implement the testing techniques.104Network 106Test server 108Test message 110Vehicle 112UE ", "instruction": "Please summarize the input" }, { "input": "APPARATUS AND METHOD FOR TRANSMITTING VEHICLE INFORMATIONMethods and systems are provided for generating a signal based on the information of the information system and transmitting/broadcasting the signal to the other vehicle. For example, a method includes receiving, by a processor, an indication that an information system of a vehicle is activated and in response to receiving the indication, generating, by the processor, a signal based on the activated information system of the vehicle. The method further includes broadcasting the signal.", "output": "The method involves receiving (501) an indication that an information system of a vehicle is activated by a processor. A signal is generated (503) based on the activated information system of the vehicle by the processor in response to receiving the indication. The signal is broadcasted (505). The information system of the vehicle is activated by a driver of the vehicle or a driving system of the vehicle. The signal includes information associated with an action of the driver of the vehicle or the driving system of the vehicle. 
INDEPENDENT CLAIMS are included for the following:a system for processing information;a vehicle; anda non-transitory computer program product for generating a signal based on information of an information system and for transmitting/broadcasting the signal. Method for generating a signal based on information of an information system and for transmitting/broadcasting the signal to another vehicle (claimed). Information that the vehicle is braking is generated and transmitted by the signal generation and transmission system when a driver or the self-driving system of the vehicle activates the vehicle brake system. The risk posed by autonomous driving is reduced when the visual line of sight of a vehicle is blocked and another vehicle is taking an action that will affect it, and higher reliability is provided by supplying more and better information to the vehicle. The drawing shows a flowchart illustrating the method for generating a signal based on the information of the information system and for transmitting/broadcasting the signal. 
501Step for receiving an indication that an information system of a vehicle is activated by a processor503Step for generating a signal based on the activated information system of the vehicle505Step for broadcasting the signal", "instruction": "Please summarize the input" }, { "input": "Data processing apparatus of V2X and LDM for autonomous vehicle systemThe information transmission device of the vehicle-to-object communication device and the local dynamic map device for an autonomous driving system according to the present invention includes: a V2X message transceiver for transmitting and receiving a V2X message; C-ITS message processing unit for processing C-ITS messages transmitted and received through the V2X message transmission and reception unit; an LDM message transceiver for transmitting and receiving an LDM message; an LDM message processing unit for processing LDM messages transmitted and received through the LDM message transceiving unit; An autonomous driving system UDP message sending/receiving processing unit for sending and receiving an autonomous driving system UDP message through Ethernet communication with the autonomous driving system; An autonomous driving system CAN message transmission/reception processing unit that transmits and receives an autonomous driving system CAN message through CAN communication with the autonomous driving system; A vehicle information collection management unit for collecting and managing vehicle information processed by the autonomous driving system CAN message transmission and reception processing unit; and a real-time information processing unit that processes information in real time; It may be composed of|1. 
V2X message transmitting and receiving unit for transmitting and receiving V2X messages;\na C-ITS message processing unit for processing a C-ITS (Cooperative Intelligent Transport Systems) message transmitted and received through the V2X message transceiver; \nan LDM message transceiver for transmitting and receiving an LDM (local dynamic map device) message; \nan LDM message processing unit for processing an LDM message transmitted and received through the LDM message transceiving unit; \nan autonomous driving system UDP message sending/receiving processing unit for sending and receiving an autonomous driving system UDP message through Ethernet communication with the autonomous driving system; \nan autonomous driving system CAN message transmission/reception processing unit for transmitting and receiving an autonomous driving system CAN message through CAN communication with the autonomous driving system; \na vehicle information collection management unit for collecting and managing vehicle information processed by the autonomous driving system CAN message transmission and reception processing unit; and a real-time information processing unit that processes information processed in the C-ITS message processing unit, the LDM message processing unit, the autonomous driving system UDP message transmission/reception processing unit, and the vehicle information collection and management unit in real time; A vehicle-to-things communication device for an autonomous driving system and an information delivery device for a local dynamic map device, characterized in that it comprises the above units. \n | 2. The device of claim 1, wherein the real-time information processing unit looks up the nearest intersection from the intersection information based on the information processed by the C-ITS message processing unit and the LDM message processing unit, and acquires the traffic light information of the nearest intersection 
to provide the information to the autonomous driving system through the autonomous driving system CAN message transmission/reception processing unit.\n | 3. The device according to claim 1 or 2, wherein the real-time traffic information data processed by the C-ITS message processing unit, the LDM message processing unit and the real-time information processing unit is stored in a real-time traffic information data storage device, and the vehicle information data processed and used by the vehicle information collection and management unit and the vehicle information data of the autonomous driving system are configured to be stored in a vehicle information data storage device.
The drawing shows a schematic block diagram of a V2X and LDM information transmission device for an autonomous driving system (Drawing includes non-English language text).110V2X message transceiver112C-ITS message processing unit120LDM message transceiving unit140AD CAN message transmission and reception processing unit150Real-time information processing unit", "instruction": "Please summarize the input" }, { "input": "Transportation, transportation vehicle and autonomous vehicle, equipment rental, rental system that can be shared or that provides infrastructure services for vehicle owners to rent and share their own vehicles.Infrastructure that can be rented, shared, traded for invention, transportation, transportation vehicle, autonomous vehicle, transport box-cabin, equipment and system, providing infrastructure services for vehicle owners to rent/share their own vehicles and facilities, providing products and services that they can use in their own vehicles. It is about the infrastructure and methods of the rental/sharing system and logistics system, which creates and operates the logistics and transportation system with the infrastructure created.|1. REQUESTS\none. It is a rental system that provides transportation, transportation vehicles and autonomous vehicles, equipment rental, sharing or infrastructure services for vehicle owners to rent and share their own vehicles. 
\n- a cloud or server-based control center (82), containing integration interfaces, Control Center Software and databases (82S), that enables communication among users (67), the user terminal (81), User ID/Payment Tools (65), Smart Hardware (45), the station (79), third party providers (83) and other systems (84), where the entire system is managed automatically or manually by artificial intelligence algorithms, \n- User Identity/Payment Tools (65) that enable users who want to receive service from the system to be defined in the system and that carry out or mediate payment collection transactions (65), \n- the user terminal (81), electronic circuit groups embedded into computing equipment such as mobile phones, tablets, computers, smart glasses, watches, brain or neuro-machine interfaces, autonomous machines, or third party hardware, which also contain user identification information, enable users to communicate with system elements and carry out the work that needs to be carried out in the system, and the user terminal software (81S) running on this hardware, \n- smart hardware (45) and identification hardware (51) or Slot hardware (50), located on transportation vehicles (106), transport boxes (117) or at the station (79), enabling transportation vehicles to receive service and be identified in the system, and enabling station units (79), smart hardware (45B), User ID/Payment Tools (65), the user terminal (81) and other systems (84) to communicate and interoperate with transportation vehicles (106) and transport boxes (117), \n- the station (79), whose infrastructure ensures the safety of transportation vehicles (106) and transport boxes (117), their receiving of the desired services and their presence at the desired location, creates a parking and stock area, and allows users to interact with the system, \n- payment infrastructures and POS devices that can receive offline and online payments and that can communicate or work together with third party providers (83) providing software, services and information that can be used in the system or processing third-party users and systems, smart hardware (45) and identification hardware (51), \n- payment infrastructures (68), \n- third party hardware, systems and applications (85), \n- System software (90S), \n- Artificial Intelligence Decision support systems (82.4) that perform learning, benchmarking, decision making and output production functions according to data and parameters coming from control center services and smart equipment and user terminals in the field, \n- for stations with more than one platform, a station power and signal line (59) whose multiplexed connections, provided independently or within the smart hardware (45), duplicate their ports (31) and switch to other smart equipment. It is characterized in that it includes a link conversion unit (37) comprising a link function unit (36). \n | 2. It is the users (67) mentioned in Claim 1 and its feature is; It is characterized by the standard user (67S) representing the legal entity or acting on its behalf, and autonomous users (670) with autonomous mobility.\n | 3. It is the users (67) mentioned in Claim 2 and its feature is;\n- Undefined user role (67R1) \n- Tenant role (67R2) \n- Role of user, vehicle or vehicle owner receiving service from the system (67R3) \n- Rental vehicle owner role (67R4) \n- System attendant role (67R5) \n- It is characterized by containing the hardware-software user role (67R6). \n | 4. 
This is the user identification information mentioned in Claim 1, characterized by comprising:\n- user information and biometric data (67D), \n- credentials and session keys (83D) obtained from third party identity providers (83), \n- Identity/Payment Instruments information and keys (65D), \n- user terminal identification information (81D), \n- smart hardware/hardware/slot hardware identifiers (45D, 45BD, 51D, 50D). \n | 5. This is the Control Center Software and databases (82S) mentioned in Claim 1, characterized by comprising:\n- System Interfaces (82.1), \n- the integration service (82.2) between systems (90), \n- Communication and Security Services and their structures (82.3), \n- Artificial Intelligence Decision support systems (82.4), \n- Map based Services and Routing Services (82.5), \n- Autonomous Vehicle and Equipment Management Services (82.6), \n- Fleet and Operations Management services (82.7), \n- system hardware, tools and third party hardware and software update services and a software templates database (82.8), \n- Promotion and Marketing Automation Services (82.9), \n- a log database (82.10) prepared according to the system, operation and purpose, \n- business partners and business models services and database (82.11), \n- automatic and manual system management services, system databases, report services and interfaces (82.12), \n- blockchain transaction interfaces, services and integration interfaces (82.13), \n- integration services (82.14). \n | 6. These are the transportation vehicles (106) mentioned in Claim 1, characterized by having at least one of the qualities of manned or unmanned, motorized or non-motorized, autonomous or conventional vehicle, transportation vehicle (106S) and Host vehicle (106K), used to carry a person or an object or to perform any task.\n | 7. 
This is the transportation vehicle (106S) mentioned in Claim 6, characterized by having at least one of the smart hardware (45), identification hardware (51), Slot hardware (50), a third-party locking system or third-party identification units on it.\n | 8. This is the Host vehicle (106K) mentioned in Claim 6, characterized in that it:\n- has at least one or more of the smart hardware (45), identification hardware (51), Slot hardware (50), a wireless identification unit, a third-party identification unit, a third-party locking system, or lights or signals providing simultaneous audio-visual identification, \n- contains at least one of the smart hardware, Slot hardware or third-party locking system hardware that enables it to transport more than one box, object or transport cabinet (117) with one vehicle at the same time, \n- contains at least one of the smart hardware, Slot hardware or third-party locking system hardware that enables it to carry a transport box-object-cabin (117) or interlocked boxes, objects and cabinets (117) simultaneously with another vehicle (106) according to requirements. \n | 9. This is a transport box (117) as mentioned in Claim 1, consisting of boxes, cabins, passenger cabins, objects or machines with or without active features (electronic, software, electromechanical) produced for specific or general use, characterized in that it contains at least one of the smart hardware (45) including box-object transport mode (45S10) software, identification hardware (51), Slot hardware (50) or third-party lock systems, matching the identification hardware, Slot hardware, barcode/datacode/visual marking or third-party lock systems on the host vehicle (106K) or at the stations. \n | 10. 
This is the Slot hardware (50) mentioned in Claim 1, characterized in that it:\n- contains an automatically/manually lockable pin slot (48) into which the tabs of the smart hardware (45) or of a third-party lock system can be inserted, \n- provides the power, signal and peripheral connections (49) bidirectionally with the smart hardware (45) wherever the slot hardware (50) is mounted (vehicle, platform hardware, third party equipment, mounting brackets, carrying apparatus, transport boxes, or any floor or wall), \n- creates the Slot hardware (50) Identification Hardware Memory Area/identification information (50D) with the RFID tag ID on the slot hardware (50), or with the serial number, QR code, visual signals or chips used for identification programmed on the tag, \n- allows the hardware identification ID to be configured in the system as a platform or station so that it can be used by the system as a station identification ID, \n- allows the hardware identification ID to be configured in the system as a vehicle or transport box so that it can be used by the system as a vehicle or transport box identification ID, \n- contains magnets or magnetic fields on its base to facilitate mounting on metal surfaces, \n- contains a memory space/identification information (50D) that stores operation data and allows it to be sent to the control center via the vehicle (106), transport box (117), smart equipment (45), user smart equipment (45), user terminal (81), station computer (89) or other smart hardware (45) at the station (79). \n | 11. This is the identification hardware (51) mentioned in Claim 1, characterized in that it:\n- creates an Identification Hardware Memory Area/identification information (51D) from at least one of the information given by chips providing a unique serial number on it, the identification information obtained from QR codes containing the identification information, the RFID tag serial number, or a programmed serial number, \n- contains the chip and antenna (53) that give the Identification Hardware Memory Area/identification information (51D) to the vehicle, the station and the equipment, \n- contains magnets with N and S poles that allow the surface of the smart hardware (45) to hold on the floor, \n- includes a cut-resistant cable-chain attached to the equipment floor, \n- contains Locking Pin-arms (54) that can be used with the Smart Hardware (45) locking module (22), that are held in a closed position by the weak magnet on its floor, and that provide locking by being pulled towards the module when the locking module (22) approaches, \n- contains the identification hardware memory area/identification information (51D) that stores operation data and allows it to be sent to the control center via the vehicle (106), transport box (117), smart hardware (45), user smart equipment (45), user terminal (81), station computer (89) or other smart hardware (45) at the station (79). \n | 12. This is the station (79) mentioned in Claim 1, characterized in that it:\n- consists of at least one of the fixed mobile station (79A), fixed station (79B), mobile station (79C) and restricted area/point type station (79D) configurations, \n- is included in at least one of the System Station (79S1) or Special Station (79S2) classes. \n | 13. This is the fixed mobile station (79A) mentioned in Claim 12, characterized by:\n- positioning the hardware and software that make up the station on the host vehicle (106K), \n- containing at the same time at least one or more of the station types: with smart hardware (79.1), with identification hardware (79.2), with wireless identification hardware (79.3), with Slot hardware (79.4), with autonomous switching hardware (79.5), and kiosk-equipped stations (79.6) installed together with a kiosk computer and peripherals, \n- being lockable to existing system elements and third-party locking systems via smart hardware running vehicle mode (45S1), autonomous vehicle mode (45S7), object-box transport mode (45S10), third-party hardware mode (45S3) or basic mode software, identification hardware (51) or Slot hardware (50). \n | 14. This is the fixed station (79B) mentioned in Claim 12, characterized by:\n- being fixed to fixed floors and surfaces, whose positions and locations are always defined, \n- including at least one of the station types: with smart hardware (79.1), with identification hardware (79.2), with wireless identification hardware (79.3), with Slot hardware (79.4), with autonomous switching hardware (79.5), and stations (79.6) equipped with a kiosk/platform installed together with the platform and a kiosk computer and peripherals. \n | 15. This is the Mobile station (79C) mentioned in Claim 12, characterized in that it is created by information processing equipment or smart equipment (45, 45B) that can create a coverage area with wireless communication in order to collect the vehicles, to carry out rental or other services and to follow the station signal of the vehicles, where the location of the user terminal or of the vehicle that generates the signal is used as the station location; vehicles and transport boxes within the scope of wireless communication can be identified, and vehicles and transport boxes within RFID reading range of the identification hardware (51), the Slot hardware (50) or the smart hardware (45, 45B) are identified by reading the identification information provided by the RFID tags.\n | 16. 
This is the restricted area/point type station (79D) mentioned in Claim 12, characterized in that it is a station type that covers an area or point whose coordinates are determined from the control center or the user terminal by the vehicle or transport box owners; it includes hardware such as a third party lock (64) integrated into the smart hardware (45) on the vehicle or the transport box that informs the control center of the coordinate where the vehicle or the transport box is located, or the integration interface of the control center (82), so that the location can be controlled and the station functions can be performed. \n | 17. This is a Station (79.6) with or without a Kiosk, in which the stations mentioned in Claims 13 and 14 are installed together with the platform and a kiosk computer and peripherals, characterized in that it:\n- contains station kiosk software and databases (89S), \n- contains smart hardware (45) working in gateway mode (45S4) connected to the station computer, \n- if it contains smart hardware (45) operating in gateway mode (45S4) connected to the station computer, contains on its platforms at least one of the smart hardware (45), identification hardware (51) or Slot hardware (50) operating in platform mode (45S6), \n- if it does not contain smart hardware (45) operating in gateway mode (45S4) connected to the station computer, contains on its platforms at least one of the smart hardware (45), identification hardware (51) or Slot hardware (50) operating in station mode (45S5) that can be directly connected to the station computer, \n- contains in the platform smart equipment (45) a locking system module (22) containing a Hall sensor, or a third party locking system if the hardware does not have a pin locking system module (69). \n | 18. 
This is the station kiosk software and databases (89S) mentioned in Claim 17, characterized by:\n- creating the system interface for users, \n- creating the unique IDs and identification information of the station and of the equipment on the station platforms, \n- enabling users to perform system registration, information updates, data monitoring/reporting, service procurement and payment transactions, \n- carrying out offline or online car rental, leasing, return of the rented car and provision of service, \n- communication and management of the necessary procedures with the station equipment, smart equipment (45), identification equipment (51), and the smart equipment (45), identification hardware (51) and third party lock hardware (64) contained in vehicles (106) and transport boxes (117), \n- execution of station installation and technical support procedures, \n- managing online and offline payment procedures, \n- managing the user, environmental warning and alarm system, \n- keeping a database of station equipment inventory and status, vehicle inventory and status, platform status and reservations, and all transaction information performed at the station, and putting the necessary transactions into effect, \n- ensuring the connection with the control center system interface, synchronizing with the system databases and executing the system operations, \n- execution of control center procedures at the station, at the platforms and on the vehicles connected to the station, \n- making and broadcasting announcements/warnings and advertisements by the system during waiting times when there is no user session on the station computer, \n- using the information coming from city, regional and general transportation systems, third party providers (83) and the control center (82): the route, tariff and location of the host vehicle (106K) used in the system, the arrival time of the host vehicle (106K) at the station (79) or at the nearby/determined location, its capacity, the 
infrastructure in the host vehicle, and the vehicles (106S), transport boxes (117) or services that can be rented, used or purchased, and listing these, planning routes and, if necessary, making reservations, purchases or related transactions, \n- execution and monitoring of all system (90) transactions for which the user (67) is authorized, \n- execution of the communication procedure with the user terminal software and hardware. \n | 19. This is the smart hardware (45) mentioned in Claim 1, characterized by:\n- containing smart firmware (45S) that enables one or more modes to run simultaneously, \n- a kiosk gateway mode (45S4) which, when the smart hardware is directly connected to the station kiosk computer, reads the users' identity/payment tools (65), establishes communication with the user wireless equipment, transmits the alarms, status and transaction information of the station electronic equipment, smart hardware (45) and battery to the station computer and enables the management of this equipment, initiates and manages the communication between the station computer and the smart equipment (45) on the platforms, runs communication protocols such as RS485/RS422 or other serial and parallel communication layers, and, when the communication protocol requires selecting a master/slave mode, allows the gateway smart hardware to work in master mode or slave mode or to stay in listening mode, \n- performing the commands of the station kiosk software by communicating with the smart hardware operating in the gateway mode, enabling the procedures for vehicle identification, user identification and payment information identification, by communicating with the smart hardware (45), identification hardware (51) or third party identification RFID tags and other identification equipment, 
to be fulfilled, performing the locking and unlocking work on the platform by driving the motor system in the smart hardware (45) or in the third party lock system, checking, via the status of the switches, RFID tags and hardware that enable the vehicle to be identified on the platform, in the third party lock system or in the smart hardware, the open or closed position of the lock, whether the vehicle parked on the platform is on the platform, whether there are unauthorized unlocking attempts, whether the rented vehicle has left the platform, and whether the parked vehicle is properly parked according to whether it is defined in the system and whether the return procedure has been completed, controlling and operating the user warning unit (58) (a warning unit that can appeal to all senses, such as buzzer, LED, loudspeaker, visual, auditory, etc.) or the smart hardware (45) on the platform during rental, return and adding/removing vehicles to the system according to the defined function groups, managing the vehicle equipment charging procedure if there is a charging module in the smart hardware (45) or identification equipment (51) on the platform, on the vehicle or in the transport box (117), reading the sensor information from the vehicle if the vehicle or the transport box (117) is equipped with a sensor infrastructure and the necessary sensor and control hardware connection is available in the smart hardware (45), identification equipment (51) or slot hardware (50), making payment and collection transactions if there is payment information or a payment hardware module in the smart hardware (45) or identification equipment (51) on the platform or on the vehicle, accepting the return process offline if the station computer or the smart hardware acting as the gateway cannot be accessed and, when the platform computer and the system are reached, 
transferring the necessary information to the system, routing, if one of the daisy chain, point to point or token ring topologies is used, the information coming from the preceding smart hardware on the station platforms to the next smart hardware, with the necessary changes according to the process or without changing it, applying the protocols determined by the system during installation, update and use, communicating with the smart hardware of technical support personnel in technical support mode, transmitting fault information, making software updates, executing installation and troubleshooting procedures, creating the unique platform ID in the system using the ID information obtained from the smart hardware (45) system processor and circuit group and other hardware- or software-created identifiers, keeping log information of the establishment of the unique platform ID in the system and of the errors and failures in the platform hardware, sending malfunctions and errors to the control center via the station infrastructure, storing the actions taken, the 
updates and parameters sent by the system and the parameters determined during installation and production in the local database and memory, notifying the control center and users of the station location and changes if there is a positioning module, and, if the platform has slot hardware (50) and the platform smart hardware (45) has the necessary module, reading the slot hardware (50) ID, mapping it to the smart hardware (45) ID and activating the power and signal connections; this is the platform mode (45S6), \n- fulfilling the commands of the control center by communicating with the smart hardware (45) or station computer (89) operating in Kiosk Gateway Mode (45S4) or station mode (45S5) and with the control center System Interfaces (82.1) using the communication module/unit (95), performing vehicle identification by communicating with the smart hardware (45), identification equipment (51) or 
RFID tags and other identification equipment that provide third-party identification on the vehicle (106) and transport boxes (117), or by detecting the identification information with the visual data processing module (102), fulfilling the user identification, station identification and payment information identification procedures, carrying out the locking and unlocking job by driving the motor system in the smart hardware (45) or in the third party lock system, checking, via the status of the switches/sensors, RFID tags and hardware in the smart hardware or the third party lock system that enable the vehicle to be identified, the open or closed position of the motor-driven lock, whether the vehicle parked on the platform is on the platform, whether there have been any unauthorized unlocking attempts, whether the leased vehicle or transport box has left the platform, and whether the parked vehicle is properly parked according to whether it is defined in the system and whether the return procedure has been completed, and operating, during rental, return and adding/removing vehicles to the system, the user warning systems (58) (warning systems that can appeal to all sensory organs, such as buzzer, LED, speaker, visual, auditory, etc.) or the user warning systems on the smart hardware (45). 
controlling and operating them according to the function groups, managing the charging procedure of the vehicle equipment if there is a power connection to the smart hardware and a charging module in the smart hardware (45), identification equipment (51) or socket hardware on the vehicle or the carrying box, reading the sensor information from the vehicle or the transport box (117) if there is a sensor infrastructure on them and the necessary sensor and control hardware connection is available in the smart hardware (45) and identification equipment (51), making payment and collection transactions and reporting them to the control center (82) if there is payment information or a payment hardware module in the smart hardware (45), identification hardware (51) or socket hardware (50), accepting offline return if the station computer (89) or the smart hardware (45) running kiosk gateway mode (45S4) or another station mode (45S5) cannot be accessed and transferring the necessary information when the system is reached, routing, if daisy chain, point to point or token ring topologies are used, the information coming from the previous smart hardware on the station platforms to the next smart hardware with or without the necessary changes according to the process, applying the protocols determined by the system during installation, update and use, creating the unique platform ID in the system (90) or station (79) using ID information from the smart hardware (45) system processor and circuit assembly (11) and other hardware, running technical support mode (45S8), communicating with the smart hardware (45) of technical support personnel, transmitting fault information, performing software updates, executing installation procedures, keeping the log information of the transactions and errors that occur in the system and during operations and of the failures in the hardware, sending malfunctions and errors to the 
control center, storing the operations, updates and parameters sent by the system and the parameters determined during installation and production in the local database and memory, communicating with user terminals wirelessly and ensuring the execution of user transactions, rental, return, vehicle addition and service receiving 
procedures, enabling Users (67) to run payment methods for payment/collection transactions from user ID/payment tools (65), smart hardware (45), socket hardware (50), identification hardware (51) or third party hardware, and payment-collection in vehicles (106) and transport boxes (117), determining the station location by utilizing the data from the electronic circuit group and software (99) containing global location and indoor positioning technology, from the visual data processing unit, or from RFID readers and tags, notifying the location and changes to the control center, users, user terminal software (81S), other systems (84) and smart hardware (45), determining with the obtained location information the location of the vehicle (106), transport box or user terminal, performing the tracking and docking functions, and, if there is slot hardware (50) on the platform and the platform smart hardware (45) has the necessary module, reading the ID of the slot hardware (50), matching it with the smart hardware (45) ID and activating the power and signal connections; this is the station mode (45S5), 
\n- a User interface mode (45S9) that connects to smart equipment working in platform mode (45S6) and station mode (45S5), reads user identity/payment tools (65), performs authentication and payment transactions, and enables interaction with the user through the visible Light Channels and Warning Interface (39) or the User Alert Unit (58), \n- a technical support mode (45S8) that communicates with other smart hardware (45), identification equipment (51) and hardware connected by connecting or locking the socket hardware, connects the technical support user (67) to the System Interfaces (82.1) via a wireless or wired port to the terminal (81), the Communication module/unit (95) or a nearby network, accesses the data and systems of the vehicle, station or transport boxes (117) to which the smart equipment (45) is connected, connects to sensors and subsystems, accesses and stores the logs and transaction database on the hardware to which it is connected and 
transfers them to the control center (82), user terminal (81) or computer, executes automatically or manually, with the software and parameters in the memory of the hardware operating in technical support mode or downloaded from the control center, the error/fault correction procedures on the connected hardware and on the vehicle to which the hardware is connected, the installation and disassembly procedures, and the hardware-software update procedures, provides energy to the hardware during error correction if the connected hardware has the necessary modules, connects to the vehicle technical support team, equipment and systems when it serves the hardware in third party vehicles, creates an interface for error/fault correction operations, carries out automatic or manual error removal procedures, and enables the execution of test procedures on the hardware it is connected to by emulating/simulating other hardware modes, \n- identification of the station/platform where the vehicle is parked or taken from the park by communicating with the smart 
hardware (45), identification equipment (51) or RFID tags and other identification equipment that provide third party identification, or by detecting the identification information with the visual data processing unit (102) on the platforms, fulfillment of the procedures for transferring payment information, transferring the driving modes determined and programmed by the manufacturer or the user in the control center interface and terminals, the vehicle behaviors, vehicle management data, routes and other preferences to the rented or owned vehicle by means of smart hardware, connecting to the control center system interfaces (82.1) via the communication infrastructure and with the Communication module/unit (95) on the vehicle smart hardware (45), receiving software and operational information updates, executing troubleshooting procedures, and, in case of rental, if the vehicle has a control center system interface, transferring the accumulated and generated logs and data to the system interfaces (82.1) or third party ser...", "output": "The infrastructure provides services for vehicle owners to rent or share their own vehicles and facilities, and provides products and services used in those vehicles. The rental/sharing system and the logistics system are set up to create and operate the transportation system. The invention covers infrastructure by which transportation vehicles, autonomous vehicles, transport boxes-cabins and equipment are rented, shared and traded. The infrastructure of the rental/sharing system and the logistics system creates and operates the logistics and transportation system.", "instruction": "Please summarize the input" }, { "input": "METHOD AND COMPUTING DEVICE FOR PLANNING AN AUTONOMOUS DRIVING OF A SUBJECT AUTONOMOUS VEHICLE. A method for planning an autonomous driving by using a V2X communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist is provided. 
And the method includes steps of: (a) a computing device [100], corresponding to a subject autonomous vehicle, instructing a planning module [150] to acquire recognition information on surrounding vehicles including (i) first vehicles capable of a V2X communication and (ii) second vehicles incapable of the V2X communication; (b) the computing device [100] instructing the planning module [150] to select an interfering vehicle among the surrounding vehicles; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model, and to modify current optimized route information in order to evade a potential interfering action, to thereby generate updated optimized route information of the subject autonomous vehicle.|1. A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by 
referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (a), the computing device [100] instructs a 1-st neural network included in the image processing module [140] to acquire at least one circumstance image, corresponding to at least one direction from the subject autonomous vehicle, through at least one camera installed to the subject autonomous vehicle, and to apply one or more 1-st neural network operations to the circumstance image, to thereby generate the recognition information and then to thereby transmit the recognition information to the planning module [150], wherein the recognition information includes at least part of (i) vehicle identifier information, (ii) vehicle exterior information, and (iii) vehicle relative location information from the subject autonomous vehicle.\n | 2. 
The method of Claim 1, wherein, at the step of (b), the computing device [100] instructs the planning module [150] to (i) generate a scheduled direction vector by using scheduled direction information on a direction to which the subject autonomous vehicle is planned to move in a threshold time and generate one or more relative location vectors by using the vehicle relative location information corresponding to at least part of the surrounding vehicles, (ii) generate each of similarity scores between the scheduled direction vector and each of the relative location vectors, and (iii) select at least one specific surrounding vehicle, among the surrounding vehicles, as the interfering vehicle, whose specific similarity score is larger than a threshold similarity score.\n | 3. The method of Claim 1, wherein, at the step of (b), the computing device [100] instructs the planning module [150] to select at least one specific surrounding vehicle, whose corresponding partial image is located in a current lane region, corresponding to a current lane of a road including the subject autonomous vehicle, of the circumstance image, as the interfering vehicle, by referring to information on locations, of bounding boxes including the surrounding vehicles, on the circumstance image, which is acquired by using the image processing module [140].\n | 4. 
A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (c), the computing device [100] , if the interfering vehicle belongs to the second 
group, (i) instructs the image processing module (i-1) to acquire a modeling image including the interfering vehicle through at least one camera installed to the subject autonomous vehicle and (i-2) to apply one or more 2-nd neural network operations to the modeling image, to thereby generate acceleration capability information of the interfering vehicle, and then to thereby transmit the acceleration capability information to the planning module, and (ii) instructs the planning module to generate the potential interference prediction model by referring to the acceleration capability information and current velocity information of the interfering vehicle acquired by using at least one of the image processing module [140] and the V2X communication module [130].\n | 5. The method of Claim 4, wherein, at the step of (c), the computing device [100] instructs a 2-nd neural network included in the image processing module [140] to apply one or more (2-1)-st neural network operations, among the 2-nd neural network operations, to the modeling image by additionally referring to a self-vehicle velocity information of the subject autonomous vehicle, to thereby generate (i) relative velocity information of the interfering vehicle in relation to the subject autonomous vehicle, (ii) category information corresponding to a class of the interfering vehicle, and (iii) acceleration variable information corresponding to at least part of a mass and a volume of the interfering vehicle, and instructs the 2-nd neural network to apply one or more (2-2)-nd neural network operations, among the 2-nd neural network operations, to a concatenated vector including the relative velocity information, the category information and the acceleration variable information as its components, to thereby generate the acceleration capability information of the interfering vehicle.\n | 6. 
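The two-stage estimation of Claims 4-5 — a (2-1)-st stage producing relative velocity, category, and acceleration-variable features, concatenated and fed to a (2-2)-nd stage that outputs acceleration capability — can be illustrated with a minimal data-flow sketch. The feature encodings and the single linear scoring unit below are invented placeholders, not the patent's trained 2-nd neural network.

```python
def concat_features(relative_velocity, category_onehot, accel_vars):
    # Stand-ins for the (2-1)-st stage outputs are concatenated into one
    # vector, mirroring the claim's "concatenated vector" that feeds the
    # (2-2)-nd neural network operations.
    return [relative_velocity] + list(category_onehot) + list(accel_vars)

def acceleration_capability(features, weights, bias=0.0):
    # Placeholder (2-2)-nd stage: one linear unit standing in for the
    # patent's neural network; the weights here are arbitrary assumptions.
    return sum(w * x for w, x in zip(weights, features)) + bias

vec = concat_features(relative_velocity=2.5,
                      category_onehot=(1.0, 0.0, 0.0),  # e.g. passenger car class
                      accel_vars=(1500.0, 4.2))         # mass (kg), volume proxy
print(len(vec))  # 6 features feed the second stage
```

The point of the sketch is only the shape of the computation: per-vehicle features are produced first, then jointly mapped to a single capability estimate.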
The method of Claim 5, wherein, at the step of (c), the computing device [100] instructs the 2-nd neural network to apply the (2-2)-nd neural network operations to the concatenated vector, further including current section average velocity information on an average velocity of vehicles in a current section of a road where the surrounding vehicles and the subject autonomous vehicle are driving currently, to thereby generate the acceleration capability information.\n | 7. The method of Claim 4, wherein the computing device [100] instructs the planning module [150] to generate the potential interference prediction model by referring to the acceleration capability information and the current velocity information, generated by referring to velocity ratio information of the interfering vehicle and current section average velocity information, wherein the velocity ratio information has been generated by comparing each of average velocities for each of past sections of a road, where the surrounding vehicles and the subject autonomous vehicle have been driving, with each of velocities of the interfering vehicle in each of the past sections, and transmitted from a center server to the V2X communication module [130], and the current section average velocity information has been generated by calculating an average velocity of vehicles in a current section of a road where the surrounding vehicles and the subject autonomous vehicle are driving currently, and transmitted from the center server to the V2X communication module [130].\n | 8. 
The method of Claim 4, wherein the computing device [100] instructs the 2-nd neural network to apply the 2-nd neural network operations to the modeling image, to thereby generate the current velocity information along with the acceleration capability information and then to thereby transmit the current velocity information and the acceleration capability information to the planning module, and instructs the planning module [150] to generate the potential interference prediction model by referring to the current velocity information and the acceleration capability information.\n | 9. The method of Claim 4, wherein the computing device [100] instructs the planning module [150] to generate estimated velocity range information by referring to (i) a TTC value corresponding to a time for the subject autonomous vehicle to evade the potential interfering action, (ii) the acceleration capability information and (iii) the current velocity information, to thereby generate the potential interference prediction model including the estimated velocity range information.\n | 10. 
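Claim 9's estimated velocity range — derived from a TTC value, the acceleration capability, and the current velocity — can be sketched under one stated assumption: the interfering vehicle changes speed by at most `accel_capability * ttc` in either direction over the TTC window (the claim itself does not give the formula).

```python
def estimated_velocity_range(current_velocity, accel_capability, ttc, floor=0.0):
    # Bound the interfering vehicle's velocity over the TTC window, assuming
    # constant maximum acceleration/deceleration equal to accel_capability.
    low = max(floor, current_velocity - accel_capability * ttc)   # cannot go below standstill
    high = current_velocity + accel_capability * ttc
    return (low, high)

# 20 m/s now, up to 3 m/s^2 capability, 2 s to evade the interfering action.
print(estimated_velocity_range(current_velocity=20.0, accel_capability=3.0, ttc=2.0))
# (14.0, 26.0)
```

The planning module would then treat any velocity inside this interval as reachable when building the potential interference prediction model.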
A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (c), the computing device [100], if the interfering vehicle belongs to the second 
group, instructs the V2X communication module [130] to (i) acquire (i-1) acceleration capability information of the interfering vehicle, (i-2) velocity ratio information of the interfering vehicle generated by comparing each of average velocities for each of past sections of a road, where the surrounding vehicles and the subject autonomous vehicle have been driving, with each of velocities of the interfering vehicle in each of the past sections, and (i-3) current section average velocity information generated by calculating an average velocity of vehicles in the current section, from the center server, (ii) generate current velocity information of the interfering vehicle by referring to the velocity ratio information and the current section average velocity information, and (iii) generate the potential interference prediction model by referring to the current velocity information and the acceleration capability information.\n | 11. The method of Claim 1, wherein, at the step of (c), the computing device [100], if the interfering vehicle belongs to the first group, instructs the V2X communication module [130] to acquire scheduled route information of the interfering vehicle by communicating with the interfering vehicle, and instructs the planning module [150] to generate the potential interference prediction model by referring to the scheduled route information.\n | 12. 
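Claims 7 and 10 estimate the interfering vehicle's current velocity from two server-provided quantities: velocity ratio information (how fast the vehicle drove relative to each past section's average) and the current section average velocity. A small sketch, assuming the per-section ratios are aggregated by a simple mean (the aggregation rule is an assumption; names are illustrative):

```python
def velocity_ratio(past_vehicle_velocities, past_section_averages):
    # Velocity ratio info: compare the vehicle's velocity in each past
    # section with that section's average velocity, then average the ratios.
    ratios = [v / avg for v, avg in zip(past_vehicle_velocities, past_section_averages)]
    return sum(ratios) / len(ratios)

def current_velocity_estimate(ratio, current_section_average):
    # Current velocity inferred by scaling the current section's average
    # velocity by the vehicle's historical ratio.
    return ratio * current_section_average

r = velocity_ratio([30.0, 22.0], [25.0, 20.0])   # drove 10-20% above average
print(round(current_velocity_estimate(r, 24.0), 2))  # 27.6
```

A vehicle that historically ran 15% above the section average is estimated to be running 15% above the current section's average as well.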
A method for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising steps of: (a) a computing device [100] instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (b) the computing device [100] instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (c) the computing device [100] instructing the planning module [150] to generate a potential interference prediction model on the interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the step of (c), the computing device [100] instructs the planning module [150] to acquire lane 
average velocity information of at least one surrounding lane located in at least one of a left side and a right side of a current lane including the subject autonomous vehicle, and to modify the current optimized route information in order to add an evading action, to be executed in correspondence with the surrounding lane in order to evade the potential interfering action, by referring to the lane average velocity information, to thereby generate the updated optimized route information.\n | 13. A computing device [100] for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X communication and vehicles incapable of the V2X communication exist, comprising: at least one memory [115]; and at least one processor [120] configured to perform processes of: (I) instructing a planning module [150] to acquire recognition information on at least part of surrounding vehicles including at least part of (i) a first group including one or more first vehicles which are capable of a V2X communication and are located closer than a threshold distance from the subject autonomous vehicle and (ii) a second group including one or more second vehicles which are incapable of the V2X communication and are located closer than the threshold distance from the subject autonomous vehicle, by interworking with at least one of a V2X communication module [130] and an image processing module [140]; (II) instructing the planning module [150] to select at least one interfering vehicle among the surrounding vehicles, whose probability of obstructing the subject autonomous vehicle is larger than a threshold probability, by referring to a current optimized route information of the subject autonomous vehicle and the recognition information; and (III) instructing the planning module [150] to generate a potential interference prediction model on the 
interfering vehicle by interworking with at least one of the V2X communication module [130] and the image processing module [140], and to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, which is estimated by using the potential interference prediction model, to thereby generate updated optimized route information of the subject autonomous vehicle, wherein, at the process of (I), the processor instructs a 1-st neural network included in the image processing module [140] to acquire at least one circumstance image, corresponding to at least one direction from the subject autonomous vehicle, through at least one camera installed to the subject autonomous vehicle, and to apply one or more 1-st neural network operations to the circumstance image, to thereby generate the recognition information and then to thereby transmit the recognition information to the planning module [150], wherein the recognition information includes at least part of (i) vehicle identifier information, (ii) vehicle exterior information, and (iii) vehicle relative location information from the subject autonomous vehicle.\n | 14. The computing device [100] of Claim 13, wherein, at the process of (II), the processor instructs the planning module [150] to (i) generate a scheduled direction vector by using scheduled direction information on a direction to which the subject autonomous vehicle is planned to move in a threshold time and generate one or more relative location vectors by using the vehicle relative location information corresponding to at least part of the surrounding vehicles, (ii) generate each of similarity scores between the scheduled direction vector and each of the relative location vectors, and (iii) select at least one specific surrounding vehicle, among the surrounding vehicles, as the interfering vehicle, whose specific similarity score is larger than a threshold similarity score.\n | 15. 
The computing device [100] of Claim 13, wherein, at the process of (II), the processor instructs the planning module [150] to select at least one specific surrounding vehicle, whose corresponding partial image is located in a current lane region, corresponding to a current lane of a road including the subject autonomous vehicle, of the circumstance image, as the interfering vehicle, by referring to information on locations, of bounding boxes including the surrounding vehicles, on the circumstance image, which is acquired by using the image processing module [140].\n | 16. A computing device [100] for planning an autonomous driving of a subject autonomous vehicle by using a Vehicle-to-everything (V2X) communication and an image processing under a road circumstance where both vehicles capable of the V2X commu...", "output": "The method involves a computing device (100) instructing a planning module (150) to acquire recognition information on at least part of the surrounding vehicles by interworking with at least one of a V2X communication module (130) and an image processing module (140). The surrounding vehicles include at least part of a first group of one or more first vehicles that are capable of V2X communication and are located closer than a threshold distance from the subject autonomous vehicle corresponding to the computing device, and a second group of one or more second vehicles that are incapable of V2X communication and are located closer than the threshold distance from the subject autonomous vehicle. The computing device instructs the planning module to select at least one interfering vehicle among the surrounding vehicles. An INDEPENDENT CLAIM is included for a computing device for planning an autonomous driving by using a V2X communication and image processing under a road circumstance. Method for planning an autonomous driving by using a V2X communication and an image processing under a road circumstance. 
The method enables the computing device to instruct the planning module to modify the current optimized route information in order to evade a potential interfering action of the interfering vehicle, estimated by using the potential interference prediction model, thereby generating updated optimized route information of the subject autonomous vehicle. The drawing shows a schematic representation of a method for planning an autonomous driving by using a V2X communication and an image processing under a road circumstance. 100Computing device120Processor130V2X communication module140Image processing module150Planning module", "instruction": "Please summarize the input" }, { "input": "METHOD AND DEVICE FOR ATTENTION-DRIVEN RESOURCE ALLOCATION BY USING REINFORCEMENT LEARNING AND V2X COMMUNICATION TO THEREBY ACHIEVE SAFETY OF AUTONOMOUS DRIVINGA method for achieving better autonomous driving performance while saving computing power by using a confidence score representing reliability of object detection, generated in parallel with the object detection process, comprising: (a) a computing device acquiring at least one situation image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle; (b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the context image at least once to generate initial object information and initial confidence information for the context image; and (c) the computing device generating final object information for the context image through V2X communication with at least some of the surrounding objects and support of the reinforcement learning agent, with reference to the initial object information and the initial confidence information. A method comprising the above is provided.|1. 
A method for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: (a) acquiring, by the computing device, at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle;\n(b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the context image at least once to generate initial object information and initial confidence information for the context image; And (c) generating, by the computing device, final object information for the context image with reference to the initial object information and the initial confidence information, through the support of the reinforcement learning agent and V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold; wherein, in step (c), the computing device causes the reinforcement learning agent to: (i) select, from among the surrounding objects, one or more specific surrounding objects corresponding to one or more specific target areas on the situation image to which a complementary operation is to be applied, with reference to the initial confidence information, the basic meta information of the surrounding objects, and the sensor information of the image sensor, (ii) obtain supplementary information from the specific surrounding objects through the V2X communication, and (iii) generate the final object information by adjusting the initial object information using the supplementary information. \n | 2. delete\n | 3. 
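Claim 1's overall pipeline — detect, check confidence, and query nearby V2X-capable objects only where confidence is low — can be sketched at a high level. This is a simplification: the merging policy and all names below are assumptions, not the patent's "adjusting" operation.

```python
def refine_detections(initial_objects, initial_confidence, v2x_lookup, conf_threshold=0.5):
    # Keep confident detections as-is; for low-confidence regions, spend the
    # V2X "attention" budget and substitute the supplementary report from a
    # nearby object, if one is available.
    final = {}
    for region, obj in initial_objects.items():
        if initial_confidence[region] >= conf_threshold:
            final[region] = obj
        elif region in v2x_lookup:            # targeted V2X query
            final[region] = v2x_lookup[region]
        else:
            final[region] = obj               # nothing better available
    return final

initial = {"r1": "car", "r2": "unknown"}
conf = {"r1": 0.9, "r2": 0.2}
v2x = {"r2": "truck"}                         # report from a nearby V2X object
print(refine_detections(initial, conf, v2x))  # {'r1': 'car', 'r2': 'truck'}
```

The computing-power saving comes from querying only the low-confidence regions rather than every detection.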
The method of claim 1, wherein when the initial confidence information, the basic meta information, and the sensor information are input to the reinforcement learning agent, the reinforcement learning agent (i) selects the one or more specific surrounding objects using its parameters, (ii) generates at least one reward with reference to the supplementary information, and (iii) learns at least a portion of the parameters with reference to the reward.\n | 4. The method of claim 1, wherein the computing device causes the reinforcement learning agent to determine whether to select the one or more specific surrounding objects by using at least some of (i) the relative position information and the scheduled path information of the surrounding objects, included in the basic meta information, (ii) FOV (Field-Of-View) information, internal parameter information, external parameter information and distortion information, included in the sensor information, and (iii) the initial confidence information.\n | 5. The method of claim 1, wherein the computing device generates the final object information by adjusting the initial object information with reference to supplementary information including at least a portion of reference object information and reference confidence information generated by the specific surrounding object, and specific metadata of the specific surrounding object, wherein the specific surrounding object generates the reference object information and the reference confidence information by performing object detection on its own surrounding objects.\n | 6. The method of claim 1, further comprising: (d) the computing device transmitting the final object information to the autonomous driving module, thereby causing the autonomous driving module to perform autonomous driving of the target vehicle using the final object information.\n | 7. 
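Claim 3's learning loop — select objects with the agent's parameters, derive a reward from the supplementary information, and update the parameters from the reward — can be sketched as a toy bandit-style agent. The epsilon-greedy policy, the incremental update rule, and the reward definition are all assumptions; the patent does not specify them.

```python
import random

class SelectionAgent:
    # Toy stand-in for the reinforcement learning agent of claim 3: one
    # preference parameter per candidate object, epsilon-greedy selection,
    # and an incremental update from the observed reward.
    def __init__(self, candidates, epsilon=0.1, lr=0.5):
        self.pref = {c: 0.0 for c in candidates}
        self.epsilon, self.lr = epsilon, lr

    def select(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.pref))
        return max(self.pref, key=self.pref.get)

    def learn(self, candidate, reward):
        # Move the preference toward the reward derived from the
        # supplementary information that the V2X query returned.
        self.pref[candidate] += self.lr * (reward - self.pref[candidate])

random.seed(0)
agent = SelectionAgent(["obj_a", "obj_b"])
for _ in range(20):
    c = agent.select()
    # Assumed reward: querying obj_a yields useful supplementary info.
    agent.learn(c, reward=1.0 if c == "obj_a" else 0.0)
print(agent.pref["obj_a"] > agent.pref["obj_b"])  # True
```

After a handful of episodes the agent's parameters favor the object whose V2X reports actually improved the detections, which is the attention-driven allocation the title refers to.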
A method for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: (a) acquiring, by the computing device, at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle;\n(b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the context image at least once to generate initial object information and initial confidence information for the context image; And (c) generating, by the computing device, final object information for the context image with reference to the initial object information and the initial confidence information, through the support of the reinforcement learning agent and V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold; wherein, before the step (a), (a1) when a training image is obtained, the learning apparatus causes at least one convolutional layer included in the CNN to apply a convolution operation to the training image at least once to generate at least one convolutional feature map; \n(a2) while the learning apparatus performs a process of generating a predicted ROI (Region Of Interest) on the training image by applying an anchor operation to the convolutional feature map at least once, the learning apparatus causes at least one anchor layer included in the RPN (Region Proposal Network) to generate, for each pixel of the convolutional feature map, each at least one RPN confidence score representing each at least one probability that the prediction ROI is the same as a ground truth (GT) ROI, to thereby generate an RPN confidence map including the RPN confidence scores; \n(a3) When at least one ROI-pooled feature map generated using the convolutional feature map and the predicted ROI is obtained through the ROI-pooling layer included in
the CNN, the learning apparatus, while performing a process of generating a prediction object detection result using the ROI-pooled feature map, generates, for each predicted ROI, each CNN confidence score indicating each at least one probability that the prediction result is the same as each at least one GT CNN classification result and each at least one GT CNN regression result included in the GT object detection result, to thereby generate a CNN confidence map including the CNN confidence scores; and (a4) the learning device causes the loss layer to generate at least one RPN loss and at least one CNN loss with reference to the RPN confidence map, the CNN confidence map, the prediction object detection result and the GT object detection result, and to learn at least some of the parameters included in the CNN and the RPN by performing backpropagation using the RPN loss and the CNN loss. \n | 8. The method of claim 7, wherein in step (a4), the learning device causes the loss layer to generate the RPN loss according to the following equation [equation omitted], in which: one constant corresponds to the size of the convolutional feature map generated by performing an anchor operation; another constant corresponds to the training image; an i-th RPN confidence score, among the RPN confidence scores, corresponds to the i-th pixel of the convolutional feature map; an i-th prediction RPN classification result corresponds to the i-th pixel; an i-th GT RPN classification result corresponds to the i-th pixel; an i-th prediction RPN regression result corresponds to the i-th pixel; and an i-th GT RPN regression result corresponds to the i-th pixel, wherein the i-th GT RPN classification result and the i-th GT RPN regression result correspond to the GT object detection result.\n | 9. 
The method of claim 7, wherein in step (a4), the learning device causes the loss layer to generate the CNN loss according to the following equation [equation omitted], in which: one term is the number of the predicted ROIs; an i-th CNN confidence score, among the CNN confidence scores, corresponds to the i-th prediction ROI selected from the prediction ROIs; an i-th prediction CNN classification result corresponds to the i-th prediction ROI; an i-th GT CNN classification result corresponds to the i-th pixel; an i-th prediction CNN regression result corresponds to the i-th prediction ROI; and an i-th GT CNN regression result corresponds to the i-th pixel, wherein the i-th prediction CNN classification result and the i-th prediction CNN regression result correspond to the prediction object detection result.\n | 10. The method according to claim 7, wherein after step (a3), the learning device causes the confidence layer to generate a unified confidence map, comprising information on each integrated confidence score for each pixel in the training image, with reference to the RPN confidence map and the CNN confidence map.\n | 11. The method of claim 10, wherein the learning device causes the confidence layer to perform (i-1) a process of obtaining, from the CNN, a Non-Maximum Suppression (NMS) result for the prediction ROIs, generated while the process of generating the prediction object detection result is performed, (i-2) a process of generating a resized RPN confidence map by applying a Resize operation to the RPN confidence map at least once, and (ii) a process of generating the unified confidence map with reference to the NMS result and the resized RPN confidence map.\n | 12. 
The method according to claim 11, wherein the learning device causes the confidence layer to generate, among the integrated confidence scores, an X_Y-th integrated confidence score corresponding to a coordinate on the training image according to the following formula [formula omitted], in which: one term means the X_Y-th integrated confidence score; another means the X_Y-th resized RPN confidence score corresponding to the coordinate on the resized RPN confidence map; and another, determined from the NMS result, means the i-th CNN confidence score for an i-th predicted ROI that includes the coordinate.\n | 13. A method for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, comprising: (a) acquiring, by the computing device, at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle;\n(b) causing, by the computing device, a Convolutional Neural Network (CNN) to apply a CNN operation to the context image at least once to generate initial object information and initial confidence information for the context image; And (c) generating, by the computing device, final object information for the context image with reference to the initial object information and the initial confidence information, through the support of the reinforcement learning agent and V2X communication with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold; wherein, in step (b), (b1) when the context image is obtained, the computing device causes at least one convolutional layer included in the CNN to generate at least one convolutional feature map by applying a convolution operation to the context image at least once; \n(b2) causing the computing device to cause at least one anchor layer included in the RPN
to perform a process for generating a predictive ROI on the context image by applying an anchor operation to the convolutional feature map at least once, generating an RPN confidence map comprising the RPN confidence scores by generating, for each pixel of the convolutional feature map, each of at least one RPN confidence score, each of which represents at least one probability that a predicted ROI will be the same as the GT ROI; \n(b3) when at least one ROI pooled feature map generated using the convolutional feature map and the predicted ROI is obtained through the ROI pooling layer included in the CNN, the computing device, while performing a process of generating a prediction object detection result using the ROI pooled feature map, generating, for each of the prediction ROIs, each CNN confidence score indicating at least one probability that the prediction result is the same as each of the at least one GT CNN classification result and the at least one GT CNN regression result included in the GT object detection result, to thereby generate a CNN confidence map including the CNN confidence scores; and (b4) causing the computing device to generate an integrated confidence map with reference to the RPN confidence map and the CNN confidence map by the confidence layer operating in conjunction with the CNN; \nincluding, wherein the computing device causes the CNN to output the initial object information including the prediction object detection result and the initial confidence information including the integrated confidence map. \n | 14. 
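The integrated confidence map of claims 10-13 combines, per pixel, the resized RPN confidence with the CNN confidences of the NMS-surviving ROIs covering that pixel. Since the patent's exact combination formula is not reproduced in this text, the sketch below assumes a max-combination; the ROI encoding `(x0, y0, x1, y1, cnn_score)` is likewise an illustrative assumption.

```python
def integrated_confidence(rpn_resized, nms_rois):
    # Per-pixel integrated score: start from the resized RPN confidence map
    # and raise each pixel to the highest CNN confidence among NMS-surviving
    # ROIs that cover it (max-combination is an assumption, not the patent's
    # formula).
    h, w = len(rpn_resized), len(rpn_resized[0])
    out = [[rpn_resized[y][x] for x in range(w)] for y in range(h)]
    for (x0, y0, x1, y1, cnn_score) in nms_rois:
        for y in range(y0, y1):
            for x in range(x0, x1):
                out[y][x] = max(out[y][x], cnn_score)
    return out

rpn = [[0.2, 0.2], [0.2, 0.2]]
rois = [(0, 0, 1, 1, 0.9)]               # one ROI covering the top-left pixel
print(integrated_confidence(rpn, rois))  # [[0.9, 0.2], [0.2, 0.2]]
```

Pixels inside a confidently detected ROI inherit that detection's confidence, while background pixels keep the weaker RPN score — giving the per-pixel map the downstream attention mechanism consumes.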
In a computing device for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, an instruction at least one memory for storing; and (I) a process of obtaining at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle, (II) causing a Convolutional Neural Network (CNN) to A process of generating initial object information and initial confidence information for the situation image by applying a CNN operation to the image at least once, and (III) V2X with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold at least one processor configured to execute the instructions for performing a process of generating final object information for the context image with reference to the initial object information and the initial confidence information, through communication and support of a reinforcement learning agent including; in the process (III), the processor causes the reinforcement learning agent to: (i) the initial confidence information; With reference to the basic meta information of the surrounding object and the sensor information of the image sensor, one or more specific surrounding objects corresponding to one or more specific target areas on the context image to which a complementary operation is to be applied are selected from among the surrounding objects, (ii) to obtain supplementary information through the V2X communication with the specific surrounding object, and (iii) to generate the final object information by adjusting the initial object information using the supplementary information .\n | 15. delete\n | 16. 
The device of claim 14, wherein, when the initial confidence information, the basic meta information, and the sensor information are input to the reinforcement learning agent, the reinforcement learning agent (i) selects the one or more specific surrounding objects using its parameters, (ii) generates at least one reward with reference to the supplementary information, and (iii) learns at least a portion of the parameters with reference to the reward.\n | 17. The device of claim 14, wherein the processor causes the reinforcement learning agent to determine whether to select the one or more specific surrounding objects by using at least some of (i) relative position information and scheduled path information of the surrounding objects, included in the basic meta information, (ii) Field-Of-View (FOV) information, internal parameter information, external parameter information and distortion information, included in the sensor information, and (iii) the initial confidence information.\n | 18. The device of claim 14, wherein the processor generates the final object information by adjusting the initial object information with reference to supplementary information, including at least a portion of reference object information and reference confidence information generated by the specific surrounding object, and to specific meta information of the specific surrounding object, and wherein the specific surrounding object generates the reference object information and the reference confidence information by performing object detection on its own surroundings.\n | 19. The device of claim 14, wherein the processor further performs (IV) a process of transmitting the final object information to an autonomous driving module, thereby causing the autonomous driving module to perform autonomous driving of the target vehicle using the final object information.\n | 20.
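The claims above describe the reinforcement-learning agent choosing which surrounding objects to query over V2X and learning from a reward. Below is a minimal hand-written stand-in for that policy; the confidence threshold, the query budget, and the reward design are all assumptions for illustration, not taken from the patent.

```python
def select_targets(initial_confidences, max_queries=2, threshold=0.5):
    # Pick the surrounding objects whose initial confidence is lowest and below
    # the threshold as V2X query targets -- a hand-written stand-in for the
    # learned selection policy of the reinforcement-learning agent.
    low = [(c, oid) for oid, c in initial_confidences.items() if c < threshold]
    return [oid for _, oid in sorted(low)[:max_queries]]

def reward(conf_before, conf_after):
    # One plausible reward design: the confidence gained after adjusting the
    # initial object information with the supplementary V2X information.
    return conf_after - conf_before

picked = select_targets({"car_a": 0.9, "truck_b": 0.3, "bike_c": 0.45})
```

In the patent the selection is produced by the agent's learned parameters, which are then updated with the reward; here the threshold rule merely stands in for that policy.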
In a computing device for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, an instruction at least one memory for storing; and (I) a process of obtaining at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle, (II) causing a Convolutional Neural Network (CNN) to A process of generating initial object information and initial confidence information for the situation image by applying a CNN operation to the image at least once, and (III) V2X with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold at least one processor configured to execute the instructions for performing a process of generating final object information for the context image with reference to the initial object information and the initial confidence information, through communication and support of a reinforcement learning agent Including, but before the (I) process, (I1) when a training image is obtained, the learning device, a process of causing at least one convolutional layer included in the CNN to generate at least one convolutional feature map by applying a convolution operation to the training image at least once; (I2) The learning apparatus, while performing the process of generating a predicted ROI (Region Of Interest) on the training image by applying an anchor operation to the convolutional feature map at least once, RPN (Region Proposal Network) By generating at least one anchor layer included in each at least one RPN confidence score representing each at least one probability that the prediction ROI is the same as a ground truth (GT) ROI for each pixel of the convolutional feature map a process for generating an RPN confidence map comprising the RPN confidence score; (I3) When at 
least one ROI pooled feature map generated using the convolutional feature map and the predicted ROI is obtained through the ROI pooling layer included in the CNN, the learning device is configured to select the ROI pooled feature map While performing a process of generating a prediction object detection result using Each CNN confidence score indicating each at least one probability that the ratio result is predicted to be the same as each at least one GT CNN classification result and each at least one GT CNN regression result included in the GT object detection result is said a process of generating a CNN confidence map including the CNN confidence score by generating for each predicted ROI; and (I4) the learning device causes the loss layer to at least one RPN loss and at least one CNN loss with reference to the RPN confidence map, the CNN confidence map, the prediction object detection result and the GT object detection result. a process of learning at least a part of the parameters included in the CNN and the RPN by performing backpropagation using the RPN loss and the CNN loss; by performing, the CNN is learned device to do.\n | 21. 
The device of claim 20, wherein, in the process (I4), the learning device causes the loss layer to generate the RPN loss according to an equation [equation omitted in the source] whose terms are defined as follows: a first constant corresponding to the size of the convolutional feature map generated by performing the anchor operation; a second constant corresponding to the training image; the i-th RPN confidence score, among the RPN confidence scores, corresponding to the i-th pixel of the convolutional feature map; the i-th predicted RPN classification result corresponding to the i-th pixel; the i-th GT RPN classification result corresponding to the i-th pixel; the i-th predicted RPN regression result corresponding to the i-th pixel; and the i-th GT RPN regression result corresponding to the i-th pixel, wherein the i-th GT RPN classification result and the i-th GT RPN regression result correspond to the GT object detection result.\n | 22. The device of claim 20, wherein, in the process (I4), the learning device causes the loss layer to generate the CNN loss according to an equation [equation omitted in the source] whose terms are defined as follows: the number of the predicted ROIs; the i-th CNN confidence score, among the CNN confidence scores, corresponding to the i-th predicted ROI selected from the predicted ROIs; the i-th predicted CNN classification result corresponding to the i-th predicted ROI; the i-th GT CNN classification result corresponding to the i-th pixel; the i-th predicted CNN regression result corresponding to the i-th predicted ROI; and the i-th GT CNN regression result corresponding to the i-th pixel, wherein the i-th predicted CNN classification result and the i-th predicted CNN regression result correspond to the prediction object detection result.\n | 23. The device of claim 20, wherein, after the process (I3), the learning device causes the confidence layer to refer to the RPN confidence map and the CNN confidence map for each integrated confidence score for each pixel in the training image,
thereby generating a unified confidence map comprising the integrated confidence scores.\n | 24. The device of claim 23, wherein the learning device causes the confidence layer to perform (i-1) a process of obtaining a Non-Maximum Suppression (NMS) result on the prediction object detection result from the CNN while the process of generating the prediction object detection result is performed, (i-2) a process of generating a resized RPN confidence map by applying a resize operation to the RPN confidence map at least once, and (ii) a process of generating the unified confidence map with reference to the NMS result and the resized RPN confidence map.\n | 25. The device of claim 24, wherein the learning device causes the confidence layer to generate an X_Y-th integrated confidence score, among the integrated confidence scores, corresponding to a coordinate (x, y) on the training image according to a formula [formula omitted in the source], wherein the left-hand term means the X_Y-th integrated confidence score, one term on the right-hand side means the X_Y-th resized RPN confidence score corresponding to the coordinate (x, y) on the resized RPN confidence map, and the other term means the i-th CNN confidence score for the i-th predicted ROI, determined from the NMS result, whose area includes the coordinate (x, y).\n | 26.
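The claims above combine the resized RPN confidence map with the CNN confidence scores of NMS-surviving ROIs into a per-pixel integrated confidence map, but the source omits the combining formula. The sketch below ASSUMES a per-pixel maximum as the combination rule; the function name and box layout `(x0, y0, x1, y1)` are also assumptions.

```python
def integrated_confidence(resized_rpn_map, nms_rois, cnn_scores):
    # Integrated confidence score per pixel. The combination rule (a maximum
    # over the resized RPN score and the CNN scores of every NMS-surviving ROI
    # containing the pixel) is an ASSUMPTION; the source omits the formula.
    h, w = len(resized_rpn_map), len(resized_rpn_map[0])
    out = [[resized_rpn_map[y][x] for x in range(w)] for y in range(h)]
    for (x0, y0, x1, y1), score in zip(nms_rois, cnn_scores):
        for y in range(y0, y1):
            for x in range(x0, x1):
                out[y][x] = max(out[y][x], score)
    return out

rpn = [[0.2, 0.2], [0.2, 0.2]]                          # resized RPN confidence map
m = integrated_confidence(rpn, [(0, 0, 1, 2)], [0.7])   # one ROI covering column 0
```

A pixel outside every surviving ROI keeps its RPN confidence, which matches the claim's description that the CNN term only applies where the coordinate falls inside a predicted ROI.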
In a computing device for achieving better autonomous driving performance while saving computing power by using a confidence score representing the credibility of object detection, generated in parallel with the object detection process, an instruction at least one memory for storing; and (I) a process of acquiring at least one circumstance image of the surroundings of the target vehicle through at least one image sensor installed in the target vehicle, (II) causing a Convolutional Neural Network (CNN) to perform the situation A process for generating initial object information and initial confidence information for the situation image by applying a CNN operation to the image at least once, and (III) V2X with at least some of the surrounding objects whose distance from the target vehicle is less than or equal to a threshold at least one processor configured to execute the instructions for performing a process of generating final object information for the context image with reference to the initial object information and the initial confidence information, through communication and support of a reinforcement learning agent ; but, the (II) process, (II1) when the situation image is obtained, a process of causing at least one convolutional layer included in the CNN to generate at least one convolutional feature map by applying a convolution operation to the context image at least once; (II2) At least one anchor layer included in the RPN, while performing a process for generating a prediction ROI on the context image by applying an anchor operation to the convolutional feature map at least once, causes the prediction ROI to be a GT ROI generating an RPN confidence map including the RPN confidence score by generating, for each pixel of the convolutional feature map, at least one RPN confidence score, each of which represents at least one probability to be equal to ; (II3) When at least one ROI pooled feature map generated using the convolutional feature map and the 
predicted ROI is obtained through the ROI pooling layer included in the CNN, a prediction object is detected using the ROI pooled feature map While performing the process of generating a result, the FC layer included in the CNN causes each at least one prediction CNN classification result and each at least one prediction CNN regression result included in the prediction object detection result to be a GT object By generating, for each prediction ROI, each CNN confidence score indicating at least one probability to be the same as each at least one GT CNN classification result and each at least one GT CNN regression result included in the detection result for each prediction ROI, the CNN confidence a process for generating a CNN confidence map including the score; and (II4) a process of causing the confidence layer operating in conjunction with the CNN to generate an integrated confidence map with reference to the RPN confidence map and the CNN confidence map; The apparatus characterized in that it outputs the initial object information including the object detection result and the initial confidence information including the integrated confidence map.", "output": "The method involves acquiring one circumstance image on surroundings of a subject vehicle by a computing device, through one image sensor installed on the subject vehicle. The computing device instructs a Convolutional Neural Network(CNN) to apply one Convolutional Neural Network operation to the circumstance image, to generate initial object information and initial confidence information on the circumstance image. The computing device generates final object information on the circumstance image by referring to the initial object information and the initial confidence information with a support of a Reinforcement Learning (RL) agent, and through V2X communications with portion of surrounding objects whose distances from the subject vehicle are smaller than a threshold. 
An INDEPENDENT CLAIM is included for a computing device for achieving better performance in an autonomous driving while saving computing powers, by using confidence scores representing a credibility of an object detection. Method for achieving better performance in an autonomous driving while saving computing powers, by using confidence scores representing a credibility of an object detection by a computing device (claimed). Reduces a consumption of the computing powers required for the autonomous driving. The drawing shows a block representation of a configuration of a computing device performing a method for an attention-driven resource allocation by using Reinforcement Learning and V2X communication, to achieve a safety of an autonomous driving. 110Communication portion115Memory120Processor140Region proposal network150Confidence layer", "instruction": "Please summarize the input" }, { "input": "LEARNING METHOD AND LEARNING DEVICE FOR INTEGRATING OBJECT DETECTION INFORMATION ACQUIRED THROUGH V2V COMMUNICATION FROM OTHER AUTONOMOUS VEHICLE WITH OBJECT DETECTION INFORMATION GENERATED BY PRESENT AUTONOMOUS VEHICLE, AND TESTING METHOD AND TESTING DEVICE USING THE SAMEA learning method for generating integrated object detection information by integrating first object detection information and second object detection information is provided. 
And the method includes steps of: (a) a learning device instructing a concatenating network to generate one or more pair feature vectors; (b) the learning device instructing a determining network to apply FC operations to the pair feature vectors, to thereby generate (i) determination vectors and (ii) box regression vectors; (c) the learning device instructing a loss unit to generate an integrated loss by referring to the determination vectors, the box regression vectors and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN.|1. A learning method for generating integrated object detection information on an integrated target space including a first target space and a second target space, by integrating first object detection information on the first target space generated by a first vehicle and second object detection information on the second target space generated by a second vehicle, comprising steps of: (a) a learning device (100), if the first object detection information on the first target space and the second object detection information on the second target space are acquired by processing a first original image on the first target space and a second original image on the second target space, instructing a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors including information on one or more pairs of first original ROIs included in the first target space and second original ROIs in the second target space; (b) the learning device (100) instructing a determining network (220) included in the DNN (200) to apply one or more FC operations to the pair feature vectors, to thereby generate (i) one or more determination vectors including information on probabilities of the first original ROIs and the second original ROIs included in each of the pairs being appropriate to be integrated and (ii) one or more box 
regression vectors including information on each of relative 3-dimensional locations of integrated ROIs, corresponding to at least part of the pairs, comparing to each of original 3- dimensional locations of each component of said at least part of the pairs, on the integrated target space; (c) the learning device (100) instructing a loss unit (230) to generate an integrated loss by referring to the determination vectors, the box regression vectors and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN (200).\n | 2. The learning method as claimed in claim 1, wherein, at the step of (a), a specific pair feature vector, which is one of the pair feature vectors, includes (i) first class information of a first specific object included in the first target space, (ii) feature values of a first specific original ROI including the first specific object, (iii) 3-dimensional coordinate values of a first specific original bounding box corresponding to the first specific original ROI, (iv) 3- dimensional coordinate values of the first specific original ROI, (v) second class information of a second specific object included in the second target space, (vi) feature values of a second specific original ROI including the second specific object, and (vii) 3-dimensional coordinate values of a second specific original bounding box corresponding to the second specific original ROI, and (viii) 3-dimensional coordinate values of the second specific original ROI.\n | 3. 
The learning method as claimed in claim 2, wherein, at the step of (b), a specific determination vector, which is one of the determination vectors and corresponds to the specific pair feature vector, includes information on a probability of the first specific original ROI and the second specific original ROI being integrated on the integrated target space, and a specific box regression vector, which is one of the box regression vectors and corresponds to the specific pair feature vector, includes information on 3-dimensional coordinates of a specific integrated bounding box generated by merging the first specific original ROI and the second specific original ROI on the integrated target space.\n | 4. The learning method as claimed in claim 1, wherein, at the step of (c), the learning device instructs the loss unit (i) to generate a determination loss by using at least part of the determination vectors through a cross entropy method, (ii) to generate a box regression loss by using at least part of the box regression vectors through a smooth-L1 method, and (iii) to generate the integrated loss by referring to the determination loss and the box regression loss.\n | 5. The learning method as claimed in claim 4, wherein, at the step of (c), the determination loss is generated by a formula below: L_det = -(1/n) Σ_{i=1}^{n} v_i^{C-GT} · log(v_i^C), wherein n denotes the number of the determination vectors, v_i^C denotes an i-th determination vector, and v_i^{C-GT} denotes an i-th determination GT vector on the i-th determination vector, and the box regression loss is generated by a formula below: L_reg = (1/n) Σ_{i=1}^{n} smooth_L1(v_i^R − v_i^{R-GT}), wherein n denotes the number of the box regression vectors, v_i^R denotes an i-th box regression vector, and v_i^{R-GT} denotes an i-th box regression GT vector on the i-th box regression vector.\n | 6.
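The two loss components named above (cross entropy for the determination vectors, smooth-L1 for the box regression vectors) can be sketched under their standard definitions. The 1/n averaging, the element-wise application, and the function names are assumptions where the source's own equations are omitted.

```python
import math

def determination_loss(det_probs, det_gts):
    # Binary cross-entropy over determination vectors (probability that a pair
    # of ROIs should be integrated), averaged over the n pairs.
    n = len(det_probs)
    return -sum(g * math.log(p) + (1 - g) * math.log(1 - p)
                for p, g in zip(det_probs, det_gts)) / n

def smooth_l1(d):
    # Standard smooth-L1: quadratic near zero, linear beyond |d| = 1.
    return 0.5 * d * d if abs(d) < 1.0 else abs(d) - 0.5

def box_regression_loss(reg_vecs, reg_gts):
    # Smooth-L1 applied element-wise to box regression vectors, averaged over n.
    n = len(reg_vecs)
    return sum(smooth_l1(a - b) for v, g in zip(reg_vecs, reg_gts)
               for a, b in zip(v, g)) / n

integrated = determination_loss([0.9, 0.2], [1.0, 0.0]) \
             + box_regression_loss([[1.0, 2.0]], [[1.5, 4.0]])
```

The integrated loss here is a plain sum of the two components; the claims only say it is generated "by referring to" both, so any weighting between them is left open.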
The learning method as claimed in claim 1, wherein the learning device instructs each of deep learning neurons included in one or more layers of the DNN to repeatedly apply one or more convolutional operations to its input by using its own at least one parameter and deliver its output to its next deep learning neuron, to thereby generate the pair feature vectors, the determination vectors and the box regression vectors.\n | 7. The learning method as claimed in claim 1, wherein, at the step of (b), the learning device instructs the determining network included in the DNN to generate the determination vectors by applying at least part of the FC operations to the pair feature vectors, and to generate the one or more box regression vectors corresponding to one or more specific pair feature vectors, among the pair feature vectors, whose values in corresponding specific determination vectors denoting specific probabilities of specific pairs to be integrated are larger than a prescribed threshold, by applying the other part of the FC operations to the specific pair feature vectors.\n | 8. 
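Claims 2 and 7 together describe how a pair feature vector is assembled from one detection per vehicle and how box regression is gated by the determination score. A toy sketch, assuming flat Python lists for each field and a hypothetical gating threshold:

```python
def pair_feature_vector(det1, det2):
    # Concatenate one detection from each vehicle into a pair feature vector.
    # The field layout (class info, ROI feature values, 3-D bounding-box and
    # ROI coordinates from both sides) follows claim 2; values are toy numbers.
    return (det1["cls"] + det1["feat"] + det1["box"] + det1["roi"]
            + det2["cls"] + det2["feat"] + det2["box"] + det2["roi"])

def gate_pairs(pair_vecs, det_scores, threshold=0.5):
    # Claim 7: box regression is applied only to pairs whose determination
    # score (probability the two ROIs show the same object) exceeds a threshold.
    return [v for v, s in zip(pair_vecs, det_scores) if s > threshold]

d1 = {"cls": [1.0], "feat": [0.2, 0.4], "box": [0.0, 0.0, 0.0], "roi": [0.0, 0.0, 1.0]}
d2 = {"cls": [1.0], "feat": [0.3, 0.5], "box": [1.0, 0.0, 0.0], "roi": [1.0, 0.0, 1.0]}
pv = pair_feature_vector(d1, d2)
kept = gate_pairs([pv, pv], [0.8, 0.2])    # only the first pair passes the gate
```

Gating this way spares the FC box-regression operations for pairs the determining network already judges unlikely to depict the same object, which is the computational point of claim 7.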
A testing method for generating integrated object detection information for testing on an integrated target space for testing including a first target space for testing and a second target space for testing, by integrating first object detection information for testing on the first target space for testing generated by a first vehicle for testing and second object detection information for testing on the second target space for testing generated by a second vehicle for testing, comprising steps of: (a) on condition that (1) a learning device (100), if first object detection information for training on a first target space for training and second object detection information for training on a second target space for training have been acquired by processing a first original image for training on the first target space for training and a second original image for training on the second target space for training, has instructed a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors for training including information on one or more pairs for training of first original ROIs for training included in the first target space for training and second original ROIs for training in the second target space for training; (2) the learning device (100) has instructed a determining network (220) included in the DNN to apply one or more FC operations to the pair feature vectors for training, to thereby generate (i) one or more determination vectors for training including information on probabilities for training of the first original ROIs for training and the second original ROIs for training included in each of the pairs for training being appropriate to be integrated and (ii) one or more box regression vectors for training including information on each of relative 3-Dimensional locations for training of integrated ROIs for training, corresponding to at least part of the pairs for training, comparing to each of original 3-Dimensional 
locations for training of each component of said at least part of the pairs for training, on an integrated target space for training; (3) the learning device (100) has instructed a loss unit (230) to generate an integrated loss by referring to the determination vectors for training, the box regression vectors for training and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN, a testing device installed on the first vehicle, if the first object detection information for testing on the first target space for testing and the second object detection information for testing on the second target space for testing are acquired by processing a first original image for testing on the first target space for testing and a second original image for testing on the second target space for testing, instructing the concatenating network (210) included in the DNN (200) to generate one or more pair feature vectors for testing including information on one or more pairs for testing of first original ROIs for testing included in the first target space for testing and second original ROIs for testing in the second target space for testing; (b) the testing device (100) instructing the determining network (220) included in the DNN to apply the FC operations to the pair feature vectors for testing, to thereby generate (i) one or more determination vectors for testing including information on probabilities for testing of the first original ROIs for testing and the second original ROIs for testing included in each of the pairs for testing being appropriate to be integrated and (ii) one or more box regression vectors for testing including information on each of relative 3-dimensional locations for testing of integrated ROIs for testing, corresponding to at least part of the pairs for testing, comparing to each of original 3-Dimensional locations for testing of each component of said 
at least part of the pairs for testing, on the integrated target space for testing; (c) the testing device (100) instructing a merging unit to generate the integrated object detection information for testing by merging at least part of the pairs for testing of first original bounding boxes for testing and second original bounding boxes for testing by referring to the determination vectors for testing and the box regression vectors for testing.\n | 9. The testing method as claimed in claim 8, wherein the first original image for testing on the first target space for testing, acquired through at least part of one or more first cameras, one or more first lidars and one or more first radars installed on the first vehicle, is processed by a first neural network included in the first vehicle, to thereby generate the first object detection information for testing including (i) first class information for testing on objects for testing included in the first target space for testing, (ii) feature values for testing of the first original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the first original bounding boxes for testing and (iv) 3-dimensional coordinate values for testing of the first original ROIs for testing, and the second original image for testing on the second target space for testing, acquired through at least part of one or more second cameras, one or more second lidars and one or more second radars installed on the second vehicle, is processed by a second neural network included in the second vehicle, to thereby generate the second object detection information including (i) second class information for testing on objects for testing included in the second target space for testing, (ii) feature values for testing of the second original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the second original bounding boxes for testing and (iv) 3- dimensional coordinate values for testing of the second original ROIs 
for testing, and the second object detection information is delivered to the first vehicle through a V2V communication.\n | 10. The testing method as claimed in claim 8, wherein, at the step of (a), a specific pair feature vector for testing, which is one of the pair feature vectors for testing, includes (i) first class information for testing of a first specific object for testing included in the first target space for testing, (ii) feature values for testing of a first specific original ROI for testing including the first specific object for testing, (iii) 3-dimensional coordinate values of a first specific original bounding box for testing corresponding to the first specific original ROI for testing, (iv) 3-dimensional coordinate values of the first specific original ROI for testing, (v) second class information for testing of a second specific object for testing included in the second target space for testing, (vi) feature values for testing of a second specific original ROI for testing including the second specific object for testing, (vii) 3-dimensional coordinate values of a second specific original bounding box for testing corresponding to the second specific original ROI for testing, and (viii) 3- dimensional coordinate values of the second specific original ROI for testing.\n | 11. 
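Claim 9 above has the second vehicle deliver its detection output (class information, ROI feature values, 3-D box and ROI coordinates) to the first vehicle through V2V communication. A toy payload round-trip; JSON and the field names are arbitrary stand-ins, since the claims do not specify a wire format.

```python
import json

def pack_detection_for_v2v(cls_info, roi_feats, boxes_3d, rois_3d):
    # Minimal V2V payload: the second vehicle sends class info, ROI feature
    # values, and 3-D box/ROI coordinates to the first vehicle. JSON is an
    # arbitrary stand-in for the unspecified wire format.
    return json.dumps({"cls": cls_info, "feat": roi_feats,
                       "box3d": boxes_3d, "roi3d": rois_3d})

def unpack_detection(payload):
    # The first vehicle recovers the second object detection information.
    return json.loads(payload)

msg = pack_detection_for_v2v(["car"], [[0.1, 0.2]], [[1.0, 2.0, 0.0]], [[1.0, 2.0, 1.0]])
received = unpack_detection(msg)
```

Whatever the real transport, the recovered fields are exactly the inputs the concatenating network needs to build pair feature vectors on the first vehicle.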
The testing method as claimed in claim 10, wherein, at the step of (b), a specific determination vector for testing, which is one of the determination vectors for testing and corresponds to the specific pair feature vector for testing, includes information on a probability of the first specific original ROI for testing and the second specific original ROI for testing being integrated on the integrated target space for testing, and a specific box regression vector for testing, which is one of the box regression vectors for testing and corresponds to the specific pair feature vector for testing, includes information on 3-dimensional coordinates of a specific integrated bounding box for testing generated by merging the first specific original ROI for testing and the second specific original ROI for testing on the integrated target space for testing.\n | 12. A learning device (100) for generating integrated object detection information on an integrated target space including a first target space and a second target space, by integrating first object detection information on the first target space generated by a first vehicle and second object detection information on the second target space generated by a second vehicle, comprising: at least one memory (115) that stores instructions; and at least one processor (120) configured to execute the instructions to perform processes of: (I) if the first object detection information on the first target space and the second object detection information on the second target space are acquired by processing a first original image on the first target space and a second original image on the second target space, instructing a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors including information on one or more pairs of first original ROIs included in the first target space and second original ROIs in the second target space; (II) instructing a determining network (220) included in the 
DNN to apply one or more FC operations to the pair feature vectors, to thereby generate (i) one or more determination vectors including information on probabilities of the first original ROIs and the second original ROIs included in each of the pairs being appropriate to be integrated and (ii) one or more box regression vectors including information on each of relative 3-dimensional locations of integrated ROIs, corresponding to at least part of the pairs, comparing to each of original 3- dimensional locations of each component of said at least part of the pairs, on the integrated target space; (III) instructing a loss unit (230) to generate an integrated loss by referring to the determination vectors, the box regression vectors and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN.\n | 13. The learning device as claimed in claim 12, wherein, at the process of (I), a specific pair feature vector, which is one of the pair feature vectors, includes (i) first class information of a first specific object included in the first target space, (ii) feature values of a first specific original ROI including the first specific object, (iii) 3-dimensional coordinate values of a first specific original bounding box corresponding to the first specific original ROI, (iv) 3- dimensional coordinate values of the first specific original ROI, (v) second class information of a second specific object included in the second target space, (vi) feature values of a second specific original ROI including the second specific object, and (vii) 3-dimensional coordinate values of a second specific original bounding box corresponding to the second specific original ROI, and (viii) 3-dimensional coordinate values of the second specific original ROI.\n | 14. 
The learning device as claimed in claim 13, wherein, at the process of (II), a specific determination vector, which is one of the determination vectors and corresponds to the specific pair feature vector, includes information on a probability of the first specific original ROI and the second specific original ROI being integrated on the integrated target space, and a specific box regression vector, which is one of the box regression vectors and corresponds to the specific pair feature vector, includes information on 3-dimensional coordinates of a specific integrated bounding box generated by merging the first specific original ROI and the second specific original ROI on the integrated target space.\n | 15. The learning device as claimed in claim 12, wherein, at the process of (III), the processor instructs the loss unit (i) to generate a determination loss by using at least part of the determination vectors through a cross entropy device, (ii) to generate a box regression loss by using at least part of the box regression vectors through a smooth-L1 device, and (iii) to generate the integrated loss by referring to the determination loss and the box regression loss.\n | 16. The learning device as claimed in claim 15, wherein, at the process of (III), the determination loss is generated by a formula below: [Equation] wherein n denotes the number of the determination vectors, v^c_i denotes an i-th determination vector, v^{C-GT}_i denotes an i-th determination GT vector on the i-th determination vector, and the box regression loss is generated by a formula below: [Equation] wherein n denotes the number of the box regression vectors, v^c_i denotes an i-th box regression vector, v^{C-GT}_i denotes an i-th box regression GT vector on the i-th box regression vector.\n | 17. 
The learning device as claimed in claim 12, wherein the processor instructs each of deep learning neurons included in one or more layers of the DNN to repeatedly apply one or more convolutional operations to its input by using its own at least one parameter and deliver its output to its next deep learning neuron, to thereby generate the pair feature vectors, the determination vectors and the box regression vectors.\n | 18. The learning device as claimed in claim 12, wherein, at the process of (II), the processor instructs the determining network included in the DNN to generate the determination vectors by applying at least part of the FC operations to the pair feature vectors, and to generate the one or more box regression vectors corresponding to one or more specific pair feature vectors, among the pair feature vectors, whose values in corresponding specific determination vectors denoting specific probabilities of specific pairs to be integrated are larger than a prescribed threshold, by applying the other part of the FC operations to the specific pair feature vectors.\n | 19. 
A testing device (100) for generating integrated object detection information for testing on an integrated target space for testing including a first target space for testing and a second target space for testing, by integrating first object detection information for testing on the first target space for testing generated by a first vehicle for testing and second object detection information for testing on the second target space for testing generated by a second vehicle for testing, comprising: at least one memory (115) that stores instructions; and at least one processor (120) configured to execute the instructions to perform processes of: (I) on condition that (1) a learning device (100), if first object detection information for training on a first target space for training and second object detection information for training on a second target space for training have been acquired by processing a first original image for training on the first target space for training and a second original image for training on the second target space for training, has instructed a concatenating network (210) included in a DNN (200) to generate one or more pair feature vectors for training including information on one or more pairs for training of first original ROIs for training included in the first target space for training and second original ROIs for training in the second target space for training; (2) the learning device (100) has instructed a determining network (220) included in the DNN to apply one or more FC operations to the pair feature vectors for training, to thereby generate (i) one or more determination vectors for training including information on probabilities for training of the first original ROIs for training and the second original ROIs for training included in each of the pairs for training being appropriate to be integrated and (ii) one or more box regression vectors for training including information on each of relative 3-Dimensional locations for 
training of integrated ROIs for training, corresponding to at least part of the pairs for training, comparing to each of original 3-Dimensional locations for training of each component of said at least part of the pairs for training, on an integrated target space for training; (3) the learning device (100) has instructed a loss unit (230) to generate an integrated loss by referring to the determination vectors for training, the box regression vectors for training and their corresponding GTs, and performing backpropagation processes by using the integrated loss, to thereby learn at least part of parameters included in the DNN, if the first object detection information for testing on the first target space for testing and the second object detection information for testing on the second target space for testing are acquired by processing a first original image for testing on the first target space for testing and a second original image for testing on the second target space for testing, instructing the concatenating network (210) included in the DNN (200) to generate one or more pair feature vectors for testing including information on one or more pairs for testing of first original ROIs for testing included in the first target space for testing and second original ROIs for testing in the second target space for testing; (II) instructing the determining network (220) included in the DNN to apply the FC operations to the pair feature vectors for testing, to thereby generate (i) one or more determination vectors for testing including information on probabilities for testing of the first original ROIs for testing and the second original ROIs for testing included in each of the pairs for testing being appropriate to be integrated and (ii) one or more box regression vectors for testing including information on each of relative 3-dimensional locations for testing of integrated ROIs for testing, corresponding to at least part of the pairs for testing, comparing to each of 
original 3-Dimensional locations for testing of each component of said at least part of the pairs for testing, on the integrated target space for testing; (III) instructing a merging unit to generate the integrated object detection information for testing by merging at least part of the pairs for testing of first original bounding boxes for testing and second original bounding boxes for testing by referring to the determination vectors for testing and the box regression vectors for testing.\n | 20. The testing device as claimed in claim 19, wherein the first original image for testing on the first target space for testing, acquired through at least part of one or more first cameras, one or more first lidars and one or more first radars installed on the first vehicle, is processed by a first neural network included in the first vehicle, to thereby generate the first object detection information for testing including (i) first class information on objects for testing included in the first target space for testing, (ii) feature values for testing of the first original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the first original bounding boxes for testing and (iv) 3-dimensional coordinate values for testing of the first original ROIs for testing, and the second original image for testing on the second target space for testing, acquired through at least part of one or more second cameras, one or more second lidars and one or more second radars installed on the second vehicle, is processed by a second neural network included in the second vehicle, to thereby generate the second object detection information including (i) second class information on objects for testing included in the second target space for testing, (ii) feature values for testing of the second original ROIs for testing, (iii) 3-dimensional coordinate values for testing of the second original bounding boxes for testing and (iv) 3-dimensional coordinate values for testing of 
the second original ROIs for testing, and the second object detection information is delivered to the first vehicle through a V2V communication.\n | 21. The testing device as claimed in claim 19, wherein, at the process of (I), a specific pair feature vector for testing, which is one of the pair feature vectors for testing, includes (i) first class information for testing of a first specific object for testing included in the first target space for testing, (ii) feature values for testing of a first specific original ROI for testing including the first specific object for testing, (iii) 3-dimensional coordinate values of a first specific original bounding box for testing corresponding to the first specific original ROI for testing, (iv) 3-dimensional coordinate values of the first specific original ROI for testing, (v) second class information for testing of a second specific object for testing included in the second target space for testing, (vi) feature values for testing of a second specific original ROI for testing including the second specific object for testing, (vii) 3-dimensional coordinate values of a second specific original bounding box for testing corresponding to the second specific original ROI for testing, and (viii) 3-dimensional coordinate values of the second specific original ROI for testing.\n | 22. 
The testing device as claimed in claim 21, wherein, at the process of (II), a specific determination vector for testing, which is one of the determination vectors for testing and corresponds to the specific pair feature vector for testing, includes information on a probability of the first specific original ROI for testing and the second specific original ROI for testing being integrated on the integrated target space for testing, and a specific box regression vector for testing, which is one of the box regression vectors for testing and corresponds to the specific pair feature vector for testing, includes information on 3-dimensional coordinates of a specific integrated bounding box for testing generated by merging the first specific original ROI for testing and the second specific original ROI for testing on the integrated target space for testing.", "output": "The learning method involves instructing a concatenating network (210) included in a deep neural network (DNN) to generate the pair feature vectors including information on the pairs of first original regions of interest (ROIs) included in the first target space and second original ROIs in the second target space. The learning device instructs a determining network (220) included in the DNN to apply FC operations to the pair feature vectors. The learning device instructs a loss unit (230) to generate an integrated loss by referring to the determination vectors and the box regression vectors, and performs backpropagation processes by using the integrated loss to learn at least part of the parameters included in the DNN. INDEPENDENT CLAIMS are included for the following: a testing method for generating integrated object detection information for testing on an integrated target space; and a learning device for generating integrated object detection information on an integrated target space.
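Claim 16's determination and box-regression loss formulas survive only as equation placeholders in the text above. As a hedged reconstruction of the standard forms named in claim 15 — cross entropy for the determination loss and smooth-L1 for the box regression loss — the integrated loss might be sketched as follows; the function names and the weighted-sum combination are assumptions, since the claims only say the integrated loss is generated "by referring to" the two component losses:

```python
import math

def determination_loss(v_c, v_c_gt):
    # Cross-entropy averaged over the n determination vectors (claim 15's
    # "cross entropy device"); the claim's equation image is not reproduced,
    # so this standard form is an assumption.
    eps = 1e-12  # guards log(0)
    n = len(v_c)
    return -sum(sum(g * math.log(p + eps) for p, g in zip(vec, gt))
                for vec, gt in zip(v_c, v_c_gt)) / n

def box_regression_loss(v_r, v_r_gt, beta=1.0):
    # Smooth-L1 (claim 15's "smooth-L1 device"): quadratic below beta,
    # linear above, averaged over the n box regression vectors.
    def smooth(d):
        return 0.5 * d * d / beta if d < beta else d - 0.5 * beta
    n = len(v_r)
    return sum(sum(smooth(abs(p - g)) for p, g in zip(vec, gt))
               for vec, gt in zip(v_r, v_r_gt)) / n

def integrated_loss(v_c, v_c_gt, v_r, v_r_gt, weight=1.0):
    # Claim 15(iii) combines both losses; a weighted sum is one plausible
    # combination (an assumption), with the GT vectors from claim 12(III).
    return determination_loss(v_c, v_c_gt) + weight * box_regression_loss(v_r, v_r_gt)
```

Backpropagating this scalar through the determining network's FC layers is then the learning step of process (III).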
Learning method for generating integrated object detection information on integrated target space using learning device (claimed). The original images are integrated and the results of object detection on the integrated target space are generated without additional operations on the integrated image. The safety of the autonomous vehicles using the integrated space detection result is improved. The drawing shows a schematic view of a learning device performing a learning process for integrating object detection information. 210 Concatenating network, 220 Determining network, 230 Loss unit", "instruction": "Please summarize the input" }, { "input": "METHOD FOR PROVIDING ROBUST OBJECT DISTANCE ESTIMATION BASED ON CAMERA BY PERFORMING PITCH CALIBRATION OF CAMERA MORE PRECISELY WITH FUSION OF INFORMATION ACQUIRED THROUGH CAMERA AND INFORMATION ACQUIRED THROUGH V2V COMMUNICATION AND DEVICE USING THE SAME. A method for enhancing an accuracy of object distance estimation based on a subject camera by performing pitch calibration of the subject camera more precisely with additional information acquired through V2V communication is provided. The method includes steps of: (a) a computing device, performing (i) a process of instructing an initial pitch calibration module to apply a pitch calculation operation to the reference image, to thereby generate an initial estimated pitch, and (ii) a process of instructing an object detection network to apply a neural network operation to the reference image, to thereby generate reference object detection information; (b) the computing device instructing an adjusting pitch calibration module to (i) select a target object, (ii) calculate an estimated target height of the target object, (iii) calculate an error corresponding to the initial estimated pitch, and (iv) determine an adjusted estimated pitch on the subject camera by using the error.|1. 
A method for enhancing an accuracy of object distance estimation based on at least one subject camera by performing pitch calibration of the subject camera more precisely with additional information acquired through Vehicle-to-Vehicle (V2V) communication, comprising steps of: (a) a computing device [100], if at least one reference image is acquired through the subject camera, performing (i) a process of instructing an initial pitch calibration module [140] to apply at least one pitch calculation operation to the reference image, to thereby generate an initial estimated pitch, which is a value generated by estimating an angle between an optical axis of the subject camera and a ground, and (ii) a process of instructing an object detection network [170] to apply at least one neural network operation to the reference image, to thereby generate reference object detection information on one or more reference objects in the reference image; (b) the computing device [100] instructing an adjusting pitch calibration module [150] to (i) select at least one target object among the reference objects, (ii) calculate at least one estimated target height of the target object by referring to the initial estimated pitch and at least one relative location of the target object from a subject autonomous vehicle including the subject camera, (iii) calculate at least one error corresponding to the initial estimated pitch by referring to the estimated target height and at least one Ground-Truth (GT) target height acquired through V2V communication, and (iv) determine at least one adjusted estimated pitch on the subject camera by using the error; (c) the computing device [100] instructing the object detection network [170] and a distance calculation module [160] to generate autonomous driving information including information on distances, calculated by referring to the adjusted estimated pitch, between the subject autonomous vehicle and surrounding objects included in an autonomous 
driving image, wherein the computing device [100] instructs the adjusting pitch calibration module [150] to select one or more specific reference objects, among the reference objects, which satisfy a first condition on whether each of the reference objects has a specific class corresponding to a communicability or not, and to select said at least one target object, among the specific reference objects, which satisfies at least one of (i) a second condition on whether each of specific reference bounding boxes including each of the specific reference objects is located in at least one illustration window area of the reference image or not and (ii) a third condition on whether an aspect ratio of each of the specific reference bounding box is smaller than an estimation threshold value or not.\n | 2. The method of Claim 1, wherein, before the step of (b), the computing device [100] instructs the distance calculation module [160] to map each of one or more reference location base points, which are points in each of lower sides of each of reference bounding boxes including each of the reference objects, onto a space coordinate system corresponding to a virtual space including the subject autonomous vehicle by referring to the initial estimated pitch, the reference object detection information and the reference image, to thereby calculate one or more longitudinal floor distances and one or more lateral floor distances between the reference objects and the subject autonomous vehicle, and then to thereby generate each of reference relative coordinates including each of the longitudinal floor distances and the lateral floor distances as its components.\n | 3. The method of Claim 2, wherein, before the step of (b), the computing device [100] instructs the distance calculation module [160] to map the reference location base points onto the space coordinate system.\n | 4. 
The method of Claim 2, wherein, before the step of (b), the computing device [100] instructs a V2V communication module [130] to communicate with one or more communicable objects located closer than a threshold distance from the subject autonomous vehicle, to thereby acquire information on one or more communicable object classes, one or more communicable object GT heights, and one or more communicable object coordinates, and instructs the adjusting pitch calibration module [150] to select specific reference objects, among the reference objects, which have a specific class corresponding to a communicability, and pair at least part of the communicable object GT heights with at least part of the specific reference objects by referring to (i) communicable object relative coordinates in relation to the subject autonomous vehicle, calculated by using the communicable object coordinates and (ii) specific reference relative coordinates on the specific reference objects, to thereby acquire specific reference GT heights on the specific reference objects.\n | 5. The method of Claim 1, wherein, at the step of (b), the computing device [100], if the target object is selected, instructs the adjusting pitch calibration module [150] to select a target relative coordinate corresponding to the target object, among the reference relative coordinates, and to calculate the estimated target height by performing a height estimating operation by referring to the initial estimated pitch.\n | 6. 
The method of Claim 1, wherein, at the step of (b), the computing device [100], in case the number of the target object is 1, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, and (ii-1) to adjust the initial estimated pitch to be decreased by a prescribed adjustment ratio if the estimated target height is included in the overestimated range, or (ii-2) to adjust the initial estimated pitch to be increased by the prescribed adjustment ratio if the estimated target height is included in the underestimated range.\n | 7. The method of Claim 1, wherein, at the step of (b), the computing device [100], in case the number of the target object is larger than or same as 2, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, (ii) to acquire information on at least one of an overestimated error ratio corresponding to the overestimated range and an underestimated error ratio corresponding to the underestimated range, and (iii) adjust the initial estimated pitch by referring to said information.\n | 8. The method of Claim 1, wherein, at the step of (a), the computing device [100] instructs the initial pitch calibration module [140] to generate the initial estimated pitch by applying the pitch calculation operation to the reference image.\n | 9. 
The method of Claim 1, wherein, at the step of (a), the computing device [100] (i) instructs a convolutional layer of the object detection network [170] to generate at least one reference convolutional feature map by applying at least one convolutional operation, which is a part of the neural network operation, to the reference image, (ii) instructs an ROI pooling layer of the object detection network [170] to apply at least one pooling operation, which is a part of the neural network operation, in order to pool values, corresponding to ROIs of the reference image, from the reference convolutional feature map, to thereby generate at least one reference ROI-Pooled feature map, and (iii) instructs an FC layer of the object detection network [170] to apply at least one FC operation, which is a part of the neural network operation, to the reference ROI-Pooled feature map, to thereby generate the reference object detection information including information on reference classes of the reference objects and reference bounding boxes including the reference objects.\n | 10. 
A computing device [100] for enhancing an accuracy of object distance estimation based on at least one subject camera by performing pitch calibration of the subject camera more precisely with additional information acquired through Vehicle-to-Vehicle (V2V) communication, comprising: at least one memory [115] that stores instructions; and at least one processor [110] configured to execute the instructions to perform processes of: (I) if at least one reference image is acquired through the subject camera, performing (i) a process of instructing an initial pitch calibration module [140] to apply at least one pitch calculation operation to the reference image, to thereby generate an initial estimated pitch, which is a value generated by estimating an angle between an optical axis of the subject camera and a ground, and (ii) a process of instructing an object detection network [170] to apply at least one neural network operation to the reference image, to thereby generate reference object detection information on one or more reference objects in the reference image; (II) instructing an adjusting pitch calibration module [150] to (i) select at least one target object among the reference objects, (ii) calculate at least one estimated target height of the target object by referring to the initial estimated pitch and at least one relative location of the target object from a subject autonomous vehicle including the subject camera, (iii) calculate at least one error corresponding to the initial estimated pitch by referring to the estimated target height and at least one Ground-Truth (GT) target height acquired beforehand, and (iv) determine at least one adjusted estimated pitch on the subject camera by using the error; (III) the computing device [100] instructing the object detection network [170] and a distance calculation module [160] to generate autonomous driving information including information on distances, calculated by referring to the adjusted estimated pitch, 
between the subject autonomous vehicle and surrounding objects included in an autonomous driving image, wherein the computing device [100] instructs the adjusting pitch calibration module [150] to select one or more specific reference objects, among the reference objects, which satisfy a first condition on whether each of the reference objects has a specific class corresponding to a communicability or not, and to select said at least one target object, among the specific reference objects, which satisfies at least one of (i) a second condition on whether each of specific reference bounding boxes including each of the specific reference objects is located in at least one illustration window area of the reference image or not and (ii) a third condition on whether an aspect ratio of each of the specific reference bounding box is smaller than an estimation threshold value or not.\n | 11. The device of Claim 10, wherein, before the process of (II), the processor [110] instructs a distance calculation module [160] to map each of one or more reference location base points, which are points in each of lower sides of each of reference bounding boxes including each of the reference objects, onto a space coordinate system corresponding to a virtual space including the subject autonomous vehicle by referring to the initial estimated pitch, the reference object detection information, and the reference image, to thereby calculate one or more longitudinal floor distances and one or more lateral floor distances between the reference objects and the subject autonomous vehicle, and then to thereby generate each of reference relative coordinates including each of the longitudinal floor distances and the lateral floor distances as its components.\n | 12. The device of Claim 11, wherein, before the process of (II), the processor [110] instructs the distance calculation module [160] to map the reference location base points onto the space coordinate system.\n | 13. 
The device of Claim 11, wherein, before the process of (II), the processor [110] instructs a V2V communication module [130] to communicate with one or more communicable objects located closer than a threshold distance from the subject autonomous vehicle, to thereby acquire information on one or more communicable object classes, one or more communicable object GT heights, and one or more communicable object coordinates, and instructs the adjusting pitch calibration module [150] to select specific reference objects, among the reference objects, which have a specific class corresponding to a communicability, and pair at least part of the communicable object GT heights with at least part of the specific reference objects by referring to (i) communicable object relative coordinates in relation to the subject autonomous vehicle, calculated by using the communicable object coordinates and (ii) specific reference relative coordinates on the specific reference objects, to thereby acquire specific reference GT heights on the specific reference objects.\n | 14. The device of Claim 10, wherein, at the process of (II), the processor [110], if the target object is selected, instructs the adjusting pitch calibration module [150] to select a target relative coordinate corresponding to the target object, among the reference relative coordinates, and to calculate the estimated target height by performing a height estimating operation by referring to the initial estimated pitch.\n | 15. 
The device of Claim 10, wherein, at the process of (II), the processor [110], in case the number of the target object is 1, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, and (ii-1) to adjust the initial estimated pitch to be decreased by a prescribed adjustment ratio if the estimated target height is included in the overestimated range, or (ii-2) to adjust the initial estimated pitch to be increased by the prescribed adjustment ratio if the estimated target height is included in the underestimated range.\n | 16. The device of Claim 10, wherein, at the process of (II), the processor [110], in case the number of the target object is larger than or same as 2, instructs the adjusting pitch calibration module [150] (i) to set an overestimated range and an underestimated range by referring to the GT target height, (ii) to acquire information on at least one of an overestimated error ratio corresponding to the overestimated range and an underestimated error ratio corresponding to the underestimated range, and (iii) adjust the initial estimated pitch by referring to said information.\n | 17. The device of Claim 10, wherein, at the process of (I), the processor [110] instructs the initial pitch calibration module [140] to generate the initial estimated pitch by applying the pitch calculation operation to the reference image.\n | 18. 
The device of Claim 10, wherein, at the process of (I), the processor [110] (i) instructs a convolutional layer of the object detection network [170] to generate at least one reference convolutional feature map by applying at least one convolutional operation, which is a part of the neural network operation, to the reference image, (ii) instructs an ROI pooling layer of the object detection network [170] to apply at least one pooling operation, which is a part of the neural network operation, in order to pool values, corresponding to ROIs of the reference image, from the reference convolutional feature map, to thereby generate at least one reference ROI-Pooled feature map, and (iii) instructs an FC layer of the object detection network [170] to apply at least one FC operation, which is a part of the neural network operation, to the reference ROI-Pooled feature map, to thereby generate the reference object detection information including information on reference classes of the reference objects and reference bounding boxes including the reference objects.", "output": "The method involves instructing an initial pitch calibration module (140) to apply a pitch calculation operation to the reference image. An initial estimated pitch is generated, which is a value generated by estimating an angle between an optical axis of the subject camera and a ground. An object detection network (170) is instructed to apply a neural network operation to the reference image. The reference object detection information is generated on multiple reference objects in the reference image. A target object is selected among the reference objects. An estimated target height of the target object is calculated by referring to the initial estimated pitch. A relative location of the target object is provided from a subject autonomous vehicle including the subject camera. An error corresponding to the initial estimated pitch is calculated by referring to the estimated target height. 
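The pitch-adjustment logic of claims 6 and 7 — compare the estimated target height against the V2V-acquired GT height, and nudge the pitch down if the height is overestimated or up if underestimated — can be sketched as below. The tolerance band and adjustment ratio are invented stand-ins for the "prescribed" values the claims leave unspecified:

```python
def adjust_pitch(initial_pitch, estimated_height, gt_height,
                 tolerance=0.05, adjustment_ratio=0.01):
    # Claim 6 (single target object): build over-/underestimated ranges
    # around the GT height; the 5% tolerance and 1% ratio are illustrative.
    if estimated_height > gt_height * (1.0 + tolerance):
        # Height overestimated -> the initial estimated pitch is decreased.
        return initial_pitch * (1.0 - adjustment_ratio)
    if estimated_height < gt_height * (1.0 - tolerance):
        # Height underestimated -> the initial estimated pitch is increased.
        return initial_pitch * (1.0 + adjustment_ratio)
    return initial_pitch  # estimate within range: keep the pitch as-is

def adjust_pitch_multi(initial_pitch, estimates, gt_heights,
                       tolerance=0.05, adjustment_ratio=0.01):
    # Claim 7 (two or more target objects): compare the overestimated and
    # underestimated error ratios and adjust in the dominant direction.
    over = sum(e > g * (1.0 + tolerance) for e, g in zip(estimates, gt_heights))
    under = sum(e < g * (1.0 - tolerance) for e, g in zip(estimates, gt_heights))
    n = len(estimates)
    if over / n > under / n:
        return initial_pitch * (1.0 - adjustment_ratio)
    if under / n > over / n:
        return initial_pitch * (1.0 + adjustment_ratio)
    return initial_pitch
```

The adjusted pitch then feeds the distance calculation module of step (c); in practice the adjust step could be iterated until the estimated heights fall inside the tolerance band.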
A ground-truth (GT) target height is acquired. An INDEPENDENT CLAIM is included for a computing device for providing an object distance estimation based on a camera for communicating with a vehicle. Method for providing an object distance estimation based on a camera for communicating with an autonomous vehicle, such as a car. The accuracy of object distance estimation is improved, even when the quality of the input image acquired through the camera is not good. The pitch calibration is performed more precisely with the additional information acquired through the vehicle-to-vehicle communication. The drawing shows a block diagram of a computing device for providing an object distance estimation based on a camera for communicating with a vehicle. 115 Memory, 140 Initial pitch calibration module, 150 Adjusting pitch calibration module, 160 Distance calculation module, 170 Object detection network", "instruction": "Please summarize the input" }, { "input": "METHOD AND DEVICE FOR PERFORMING MULTIPLE AGENT SENSOR FUSION IN COOPERATIVE DRIVING BASED ON REINFORCEMENT LEARNING. A method for learning a sensor fusion network for sensor fusion of an autonomous vehicle performing a cooperative driving is provided. 
The method includes steps of: a learning device, (a) inputting (i) a driving image including the autonomous vehicle, m cooperatively-driving vehicles, and second virtual vehicles and (ii) sensor status information on n sensors in the m cooperatively-driving vehicles into the sensor fusion network, to generate sensor fusion probabilities of sensor values of the n sensors being transmitted and generate fusion sensor information on s sensors having large probabilities, (b) inputting a road-driving video into a detection network, to detect the second virtual vehicles, pedestrians, and lanes and output nearby object information, and inputting sensor values and the nearby object information into a drive network, to generate moving direction probabilities and drive the autonomous vehicle and (c) acquiring traffic condition information, generating a reward, and learning the sensor fusion network.|1. A method for learning a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising steps of: (a) if (i) a driving image for training including (i-1) a subject autonomous vehicle, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle, and (i-3) second virtual vehicles performing a non-cooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively-driving vehicles for training are acquired, a learning device (100) performing a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (a-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for 
training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (a-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training, wherein s is an integer ranging from 1 to m×n; (b) the learning device (100) performing a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle and thus to output nearby object information for training, and a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training of said each of the m cooperatively-driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle by referring to the moving direction probabilities for training; and (c) the learning device (100) performing a process of acquiring traffic condition information for training on the subject autonomous vehicle driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward.\n | 2. 
The method as claimed in 1, wherein, at the step of (a), the learning device (100) performs a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (140) to (i) generate a feature map for training by applying convolution operation using a Convolutional Neural Network (CNN) to the driving image for training and generate an image feature vector for training by applying fully-connected operation to the feature map for training, (ii) generate a sensor status feature vector for training by applying recurrent neural network operation using at least one long short-term memory (LSTM) to the multiple pieces of the sensor status information for training, and (iii) generate a concatenated feature vector for training by concatenating the image feature vector for training and the sensor status feature vector for training and generate the sensor fusion probabilities for training by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for training.\n | 3. The method as claimed in 2, wherein the learning device (100) updates at least one parameter of the CNN, the at least one LSTM, and the at least one fully connected layer which are included in the sensor fusion network (140), by using the reward.\n | 4. The method as claimed in 2, wherein the learning device (100) instructs the sensor fusion network (140) to (i) allow a pooling layer to apply max-pooling operation to the feature map for training and then (ii) apply fully-connected operation to a result of said (i), to thereby generate the image feature vector for training.\n | 5. The method as claimed in 2, wherein the learning device (100) normalizes and outputs each of the sensor fusion probabilities for training using a softmax algorithm.\n | 6. 
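Claims 2 through 5 describe the fusion network's data flow: a CNN branch turns the driving image into an image feature vector (via max-pooling and a fully-connected operation), an LSTM branch turns the sensor status information into a status feature vector, the two are concatenated, and a fully connected layer followed by a softmax yields the sensor fusion probabilities. The NumPy sketch below illustrates only that data flow; the placeholder encoders, shapes, random seed, and threshold are assumptions for illustration, not the patented network.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Normalize scores into probabilities, as in claim 5."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Placeholder encoders standing in for the learned CNN and LSTM branches;
# real implementations would be trained networks, not these stubs.
def image_encoder(driving_image):            # CNN branch -> image feature vector
    return driving_image.mean(axis=(1, 2))   # crude global pooling over H, W

def status_encoder(sensor_status_seq):       # LSTM branch -> status feature vector
    return sensor_status_seq[-1]             # just the last time step here

m, n = 3, 4                                  # cooperating vehicles, sensors each
driving_image = rng.random((m + 1, 32, 32))  # (m+1)-channel occupancy image
sensor_status = rng.random((10, m * n))      # 10 time steps of status info

# Concatenate the two branch outputs and apply a fully connected layer.
concat = np.concatenate([image_encoder(driving_image),
                         status_encoder(sensor_status)])
W = rng.standard_normal((m * n, concat.size))  # untrained FC weights
fusion_probs = softmax(W @ concat)             # one probability per sensor

# Keep the s sensors whose probability clears a preset threshold.
threshold = 1.0 / (m * n)
fused_sensors = np.flatnonzero(fusion_probs > threshold)
```

In the claimed system these probabilities decide which sensor values each vehicle transmits over V2V, so the softmax output doubles as a bandwidth-allocation signal.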
The method as claimed in 1, wherein the reward is generated by subtracting the number of the s sensors for training from a sum of the number of the n sensors for training in each of the m cooperatively-driving vehicles for training, and wherein the learning device (100) increases or decreases the reward by referring to the traffic condition information for training.\n | 7. The method as claimed in 1, wherein the driving image for training is an entire road image of an entire road on which the m cooperatively-driving vehicles for training are in the cooperative driving, and is an image with m+1 channels which represents whether each of blocks of a certain size, into which the entire road image is divided as a grid, is occupied by said each of the m cooperatively-driving vehicles for training or by all of the second virtual vehicles, and wherein each of m channels among said m+1 channels corresponds to said each of the m cooperatively-driving vehicles for training, and a remaining channel among said m+1 channels corresponds to the second virtual vehicles.\n | 8. 
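Claim 6's reward and claim 7's (m+1)-channel grid image can be made concrete with a small sketch. The grid size, vehicle positions, and traffic adjustment below are invented values; only the reward arithmetic (sum of n over the m vehicles, minus s) and the channel layout follow the claims.

```python
import numpy as np

m, n, s = 3, 4, 5        # cooperating vehicles, sensors per vehicle, sensors fused

# Claim 6: the reward is (sum of n over the m vehicles) - s, then increased or
# decreased by referring to the traffic condition information.
reward = m * n - s
traffic_adjustment = -2  # hypothetical penalty derived from traffic conditions
reward += traffic_adjustment

# Claim 7: an (m+1)-channel grid image; channel j marks cells occupied by the
# j-th cooperating vehicle, and the last channel marks all non-cooperating ones.
grid_h = grid_w = 8
image = np.zeros((m + 1, grid_h, grid_w), dtype=np.uint8)
cooperating_cells = [(1, 2), (3, 4), (5, 1)]      # invented positions
for channel, (r, c) in enumerate(cooperating_cells):
    image[channel, r, c] = 1
for r, c in [(0, 7), (6, 6)]:                     # non-cooperating vehicles
    image[m, r, c] = 1
```

The reward thus pushes the policy toward transmitting fewer sensor values, with the traffic condition term keeping it from starving the driving network of data.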
A method for testing a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising steps of: (a) on condition that a learning device (100) has performed, if (i) a driving image for training including (i-1) a subject autonomous vehicle for training, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle for training, and (i-3) second virtual vehicles performing a non-cooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively-driving vehicles for training are acquired, (1) a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (1-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (1-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training, wherein s is an integer ranging from 1 to m×n, (2) a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle for 
training and thus to output nearby object information for training, (3) a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training of said each of the m cooperatively driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle for training by referring to the moving direction probabilities for training, and (4) a process of acquiring traffic condition information for training on the subject autonomous vehicle for training driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward, if (i) a driving image for testing including (i-1) a subject autonomous vehicle for testing, (i-2) k cooperatively-driving vehicles for testing having first vehicles performing the cooperative driving with the subject autonomous vehicle for testing, and (i-3) second vehicles performing the non-cooperative driving, in an actual driving environment, and (ii) multiple pieces of sensor status information for testing on i sensors for testing in each of the k cooperatively-driving vehicles for testing are acquired, a testing device (200) of at least one of the k cooperatively-driving vehicles for testing performing a process of inputting the driving image for testing and the multiple pieces of the sensor status information for testing into the sensor fusion network (140), to thereby allow the sensor fusion network (a-1) to generate sensor fusion probabilities for testing which are probabilities of said each of the k cooperatively-driving vehicles for testing transmitting each 
of sensor values of each of the i sensors for testing over the V2V communication for the cooperative driving, by applying its neural network operation to the driving image for testing and the multiple pieces of the sensor status information for testing (a-2) to generate fusion sensor information for testing on s sensors for testing having probabilities larger than a predetermined threshold among the sensor fusion probabilities for testing, and (a-3) to transmit the fusion sensor information for testing on the s sensors for testing to at least part of the k cooperatively-driving vehicles for testing over the V2V communication; and (b) the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing performing a process of inputting a road-driving video for testing acquired over the V2V communication in response to the fusion sensor information for testing into the detection network (150), to thereby allow the detection network (150) to detect at least part of the second vehicles, the pedestrians, and the lanes on a driving road of the subject autonomous vehicle for testing and thus to output nearby object information for testing, and a process of inputting both sensor values for testing, acquired over the V2V communication in response to the fusion sensor information for testing, and the nearby object information for testing into the drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for testing of said each of the k cooperatively driving vehicles for testing by referring to the sensor values for testing and the nearby object information for testing, and thus to drive the subject autonomous vehicle for testing by referring to the moving direction probabilities for testing.\n | 9. 
The method as claimed in 8, wherein, at the step of (a), the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing performs a process of inputting the driving image for testing and the multiple pieces of the sensor status information for testing into the sensor fusion network (140), to thereby allow the sensor fusion network (140) to (i) generate a feature map for testing by applying convolution operation of a Convolutional Neural Network (CNN) to the driving image for testing and generate an image feature vector for testing by applying fully-connected operation to the feature map for testing, (ii) generate a sensor status feature vector for testing by applying recurrent neural network operation of at least one long short-term memory (LSTM) to the multiple pieces of the sensor status information for testing, and (iii) generate a concatenated feature vector for testing by concatenating the image feature vector for testing and the sensor status feature vector for testing and generate the sensor fusion probabilities for testing by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for testing.\n | 10. The method as claimed in 9, wherein the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing instructs the sensor fusion network (140) to (i) allow a pooling layer to apply max-pooling operation to the feature map for testing and then (ii) apply fully-connected operation to a result of said (i), to thereby generate the image feature vector for testing.\n | 11. The method as claimed in 9, wherein the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing normalizes and outputs each of the sensor fusion probabilities for testing using a softmax algorithm.\n | 12. 
The method as claimed in 8, wherein, at the step of (a), the testing device (200) of said at least one of the k cooperatively-driving vehicles for testing performs (i) a process of generating a feature map for testing by applying multiple convolution operation using a specific CNN to the driving image for testing, acquired from a specific cooperatively-driving vehicle among the k cooperatively-driving vehicles for testing, and if an image feature vector for testing is generated by applying fully-connected operation to the feature map for testing, a process of acquiring the image feature vector for testing from the specific cooperatively-driving vehicle over the V2V communication, (ii) a process of generating a sensor status feature vector for testing by applying recurrent neural network operation using at least one LSTM to the multiple pieces of the sensor status information for testing, and (iii) a process of generating a concatenated feature vector for testing by concatenating the image feature vector for testing and the sensor status feature vector for testing acquired over the V2V communication and a process of generating the sensor fusion probabilities for testing by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for testing.\n | 13. The method as claimed in 12, wherein the specific vehicle allows a specific CNN to apply convolution operation to the driving image for testing to thereby generate a feature map for testing, and to apply fully-connected operation to the feature map for testing to thereby generate the image feature vector for testing, and wherein the specific vehicle is one of the k cooperatively-driving vehicles for testing which is designated sequentially at stated intervals according to a round-robin schedule.\n | 14. 
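Claim 13's designation of the specific vehicle "sequentially at stated intervals according to a round-robin schedule" amounts to rotating the CNN-running role among the k cooperatively-driving vehicles. A minimal sketch of that rotation, with the interval length and step indexing as assumptions:

```python
def designated_vehicle(step, k, interval=10):
    """Index of the vehicle designated to run the CNN at a given time step."""
    return (step // interval) % k

k = 4                                        # cooperatively-driving vehicles
schedule = [designated_vehicle(t, k) for t in range(0, 40, 10)]
# over one full cycle every vehicle is designated exactly once
```

Rotating the role spreads the compute cost of the convolutional branch across the platoon, while the resulting image feature vector is shared over V2V as the claim describes.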
A learning device (100) for learning a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising: at least one memory (120) that stores instructions; and at least one processor (130) configured to execute the instructions to perform or support another device to perform: (I) if (i) a driving image for training including (i-1) a subject autonomous vehicle, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle, and (i-3) second virtual vehicles performing a non-cooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively-driving vehicles for training are acquired, a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (I-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (I-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training, wherein s is an integer ranging from 1 to m×n, (II) a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or 
more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle and thus to output nearby object information for training, and a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training of said each of the m cooperatively-driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle by referring to the moving direction probabilities for training, and (III) a process of acquiring traffic condition information for training on the subject autonomous vehicle driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward.\n | 15. 
The learning device (100) as claimed in 14, wherein, at the process of (I), the processor (130) performs a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (140) to (i) generate a feature map for training by applying convolution operation using a Convolutional Neural Network (CNN) to the driving image for training and generate an image feature vector for training by applying fully-connected operation to the feature map for training, (ii) generate a sensor status feature vector for training by applying recurrent neural network operation using at least one long short-term memory (LSTM) to the multiple pieces of the sensor status information for training, and (iii) generate a concatenated feature vector for training by concatenating the image feature vector for training and the sensor status feature vector for training and generate the sensor fusion probabilities for training by applying fully-connected operation of at least one fully connected layer to the concatenated feature vector for training.\n | 16. The learning device (100) as claimed in 15, wherein the processor (130) updates at least one parameter of the CNN, the at least one LSTM, and the at least one fully connected layer which are included in the sensor fusion network (140), by using the reward.\n | 17. The learning device (100) as claimed in 15, wherein the processor (130) instructs the sensor fusion network (140) to (i) allow a pooling layer to apply max-pooling operation to the feature map for training and then (ii) apply fully-connected operation to a result of said (i), to thereby generate the image feature vector for training.\n | 18. The learning device (100) as claimed in 15, wherein the processor (130) normalizes and outputs each of the sensor fusion probabilities for training using a softmax algorithm.\n | 19. 
The learning device (100) as claimed in 14, wherein the reward is generated by subtracting the number of the s sensors for training from a sum of the number of the n sensors for training in each of the m cooperatively-driving vehicles for training, and wherein the processor (130) increases or decreases the reward by referring to the traffic condition information for training.\n | 20. The learning device (100) as claimed in 14, wherein the driving image for training is an entire road image of an entire road on which the m cooperatively-driving vehicles for training are in the cooperative driving, and is an image with m+1 channels which represents whether each of blocks of a certain size, into which the entire road image is divided as a grid, is occupied by said each of the m cooperatively-driving vehicles for training or by all of the second virtual vehicles, and wherein each of m channels among said m+1 channels corresponds to said each of the m cooperatively-driving vehicles for training, and a remaining channel among said m+1 channels corresponds to the second virtual vehicles.\n | 21. 
A testing device (200) of at least one of k cooperatively-driving vehicles for testing, to be used for testing a sensor fusion network (140) to be used for sensor fusion of an autonomous vehicle performing a cooperative driving, comprising: at least one memory (220) that stores instructions; and at least one processor (230) configured to execute the instructions to perform or support another device to perform: (I) on condition that a learning device (100) has performed, if (i) a driving image for training including (i-1) a subject autonomous vehicle for training, (i-2) m cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle for training, and (i-3) second virtual vehicles performing a non-cooperative driving and (ii) multiple pieces of sensor status information for training on n sensors for training in each of the m cooperatively-driving vehicles for training are acquired, (1) a process of inputting the driving image for training and the multiple pieces of the sensor status information for training into the sensor fusion network (140), to thereby allow the sensor fusion network (1-1) to generate sensor fusion probabilities for training which are probabilities of said each of the m cooperatively-driving vehicles for training transmitting each of sensor values of each of the n sensors for training over vehicle to vehicle (V2V) communication for the cooperative driving, by applying its neural network operation to the driving image for training and the multiple pieces of the sensor status information for training and (1-2) to generate fusion sensor information for training on s sensors for training having probabilities larger than a preset threshold among the sensor fusion probabilities for training, wherein s is an integer ranging from 1 to m×n, (2) a process of inputting a road-driving video for training acquired over the V2V communication in response to the fusion sensor 
information for training into a detection network (150), to thereby allow the detection network (150) to detect at least part of the second virtual vehicles, one or more pedestrians, and one or more lanes on a traveling road of the subject autonomous vehicle for training and thus to output nearby object information for training, (3) a process of inputting both sensor values for training, acquired over the V2V communication in response to the fusion sensor information for training, and the nearby object information for training into a drive network (160), to thereby allow the drive network (160) to generate moving direction probabilities for training of said each of the m cooperatively-driving vehicles for training by referring to the sensor values for training and the nearby object information for training, and thus to drive the subject autonomous vehicle for training by referring to the moving direction probabilities for training, and (4) a process of acquiring traffic condition information for training on the subject autonomous vehicle for training driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward, if (i) a driving image for testing including (i-1) a subject autonomous vehicle for testing, (i-2) the k cooperatively-driving vehicles for testing having first vehicles performing the cooperative driving with the subject autonomous vehicle for testing, and (i-3) second vehicles performing the non-cooperative driving, in an actual driving environment, and (ii) multiple pieces of sensor status information for testing on i sensors for testing in each of the k cooperatively-driving vehicles for testing are acquired, a process of inputting the driving image for testing and the multiple pieces of the sensor status information for testing into the sensor fusion network (140), to thereby allow the sensor fusion network 
(I-1) to generate sensor fusion probabilities for testing which are probabilities of said each of the k cooperatively-driving vehicles for testing transmitting each of sensor values of each of the i sensors for testing over the V2V communication for the cooperative driving, by applying its neural network operation to the driving image for testing and the multiple pieces of the sensor status information for testing (I-2) to generate fusion sensor information for testing on s sensors for testing having probabilities larger than a predetermined threshold among the sensor fusion probabilities for testing, and (I-3) to transmit ...", "output": "The method involves acquiring a driving image for training which includes a subject autonomous vehicle and cooperatively-driving vehicles for training having first virtual vehicles performing the cooperative driving with the subject autonomous vehicle. The second virtual vehicles perform non-cooperative driving. The learning device (100) performs a process of inputting a road-driving video for training acquired over the Vehicle-to-vehicle (V2V) communication in response to the fusion sensor information for training into a detection network (150). The learning device performs a process of acquiring traffic condition information for training on the subject autonomous vehicle driven by the drive network (160), a process of generating a reward by referring to the traffic condition information for training, and a process of learning the sensor fusion network (140) by using the reward. INDEPENDENT CLAIMS are included for the following: a learning device for learning a sensor fusion network to be used for sensor fusion of an autonomous vehicle performing a cooperative driving; and a testing device of one of cooperatively-driving vehicles for testing. Method for learning a sensor fusion network to be used for sensor fusion of an autonomous vehicle performing a cooperative driving. 
Utilizes the sensor information from other autonomous vehicles in the cooperative driving mode to allow functional safety, and the reliability of the autonomous vehicles may be updated by training from virtual driving. The vehicle reduces its speed if the nearby vehicle is detected as having a possibility of collision with the subject autonomous vehicle within a second threshold time. The drawing shows a block diagram of a learning device for learning a sensor fusion network for sensor fusion of a subject autonomous vehicle. 100Learning device130Processor140Sensor fusion network150Detection network160Drive network", "instruction": "Please summarize the input" }, { "input": "METHOD AND DEVICE FOR EGO-VEHICLE LOCALIZATION TO UPDATE HD MAP BY USING V2X INFORMATION FUSIONA method for calculating the exact location of a subject vehicle by using information on relative distances is provided. The method includes steps of: (a) a computing device, if a reference image is acquired through a camera on the subject vehicle, detecting reference objects in the reference image; (b) the computing device calculating image-based reference distances between the reference objects and the subject vehicle, by referring to information on reference bounding boxes, corresponding to the reference objects, on the reference image; (c) the computing device (i) generating a distance error value by referring to the image-based reference distances and coordinate-based reference distances, and (ii) calibrating subject location information of the subject vehicle by referring to the distance error value.|1. A computer implemented method for calculating a location of a subject vehicle, the method comprising the steps of: \n * (a) detecting one or more reference objects in a reference image acquired through a camera on the subject vehicle, by applying at least one object detection operation to the reference image;\n * (b) calculating each of one or more image-based reference distances (1-..) 
between each of the reference objects and the subject vehicle, by referring to information on each of reference bounding boxes, corresponding to each of the reference objects, on the reference image; and\n * (c) (i) acquiring one or more coordinate-based reference distances (2-..) between each of the reference objects and the subject-vehicle by using information on a position of the subject vehicle acquired through a global positioning system, GPS, and information on locations of the reference objects acquired via vehicle-to-everything, V2X communications, or from a database, and generating at least one distance error value by referring to the image-based reference distances (1-..) and the one or more coordinate-based reference distances (2-..) between each of the reference objects and the subject vehicle, and (ii) calibrating subject location information of the subject vehicle by calculating an optimized location of the subject vehicle by minimizing the distance error value,\n * wherein the step of (b) further comprises calculating the image-based reference distances by applying at least one image distance estimation operation, which uses information on parameters of the camera, to the reference bounding boxes, and\n * wherein (i) one or more (1-1)-st specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-1)-st specific image distance estimation operation, which further uses (1-1)-st specific reference object height information acquired through V2X communications between the subject vehicle and one or more (1-1)-st specific reference objects among the reference objects, to (1-1)-st specific reference bounding boxes, and (ii) one or more (1-2)-nd specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-2)-nd specific image distance estimation operation, which further uses information on a reference point in the reference 
image, to (1-2)-nd specific reference bounding boxes,\n * wherein the (1-1)-st specific image distance estimation operation is performed by using the following formula: D = f × VH / h\n * wherein D denotes one of the (1-1)-st specific image-based reference distances, f denotes a focal length of the camera, VH denotes a piece of the (1-1)-st specific reference object height information corresponding to one of the (1-1)-st specific reference objects, and h denotes an apparent height of said one of the (1-1)-st specific reference objects on the reference image,\n * wherein the (1-2)-nd specific image distance estimation operation is performed by using the following formula: D = f × H / |b - cy|\n * wherein D denotes one of the (1-2)-nd specific image-based reference distances, f denotes a focal length of the camera, H denotes a height of the camera, and |b - cy| denotes an apparent distance, on the reference image, between the reference point and a lower boundary of one of the (1-2)-nd specific reference bounding boxes corresponding to one of (1-2)-nd specific reference objects, wherein the reference point is a center point of the reference image.\n | 2. The method of Claim 1, wherein the step of (c) further comprises generating the distance error value by using the following formula: ErrorD = Σ_{k=1}^{N} wk (Dik - Dck)^2 wherein Dik denotes one of the image-based reference distances corresponding to a k-th reference object among the reference objects, Dck denotes one of the coordinate-based reference distances corresponding to the k-th reference object, wk denotes a weight for the k-th object, and N denotes the number of the reference objects.\n | 3.
The method of Claim 1, wherein (i) one or more (2-1)-st specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-1)-st specific reference object location information acquired through V2X communications between the subject vehicle and one or more (2-1)-st specific reference objects among the reference objects, and (ii) one or more (2-2)-nd specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-2)-nd specific reference object location information which has been acquired from a database.\n | 4. The method of Claim 1, wherein the step of (a) further comprises applying the object detection operation to the reference image by instructing a Convolutional Neural Network, CNN, to apply at least one convolutional operation, at least one Region-Of-Interest, ROI, pooling operation and at least one Fully-Connected, FC, network operation to the reference image, in that order.\n | 5. The method of Claim 1, wherein the step of (c) further comprises calibrating the subject location information to generate calibrated subject location information which makes its corresponding distance error value smallest, by repeating processes of (i) adjusting the subject location information, (ii) re-calculating the distance error value by using the adjusted subject location information, and (iii) re-adjusting the adjusted subject location information by referring to information on whether the re-calculated distance error value has become smaller or not.\n | 6. The method of Claim 1, wherein the step of (c) further comprises, while storing update information to be used for updating a High-Definition Map(HD Map), recording information on locations where the update information has been acquired, by referring to the calibrated subject location information.\n | 7. 
The method of Claim 1, wherein the reference objects include at least part of (i) one or more mobile objects capable of a V2X communication and (ii) one or more fixed objects, (ii-1) which are capable of the V2X communication or (ii-2) whose information is stored in a database, and\n wherein the reference objects include at least part of one or more traffic signs, one or more traffic lights, one or more road markings and one or more surrounding vehicles located closer than a threshold from the subject vehicle. \n | 8. The method of Claim 1, wherein the step of (c) further comprises calculating a location, on a High-Definition, HD, map, of the subject vehicle by referring to the calibrated subject location information, and acquiring information, from the HD map, on one or more objects located closer than a threshold from the subject vehicle, to thereby support an autonomous driving of the subject vehicle.\n | 9. A device (100) for calculating a location of a subject vehicle, the device comprising: \n * at least one memory (115) that stores instructions; and\n * at least one processor (120) configured to execute the instructions to perform processes of: (I) detecting one or more reference objects in a reference image acquired through a camera on the subject vehicle, by applying at least one object detection operation to the reference image; (II) calculating each of one or more image-based reference distances (1-..) between each of the reference objects and the subject vehicle, by referring to information on each of reference bounding boxes, corresponding to each of the reference objects, on the reference image; and (III) (i) acquiring one or more coordinate-based reference distances (2-..) 
between each of the reference objects and the subject-vehicle by using information on a position of the subject vehicle acquired through a GPS , and information on locations of the reference objects acquired via vehicle-to-everything, V2X, communications, or from a database, and generating at least one distance error value by referring to the image-based reference distances (1-..) and\n * the one or more coordinate-based reference distances (2-..) between each of the reference objects and the subject vehicle, and (ii) calibrating subject location information of the subject vehicle by calculating an optimized location of the subject vehicle by minimizing the distance error value,\n * wherein the process of (II) further comprises the processor calculating the image-based reference distances by applying at least one image distance estimation operation, which uses information on parameters of the camera, to the reference bounding boxes, and\n * wherein (i) one or more (1-1)-st specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-1)-st specific image distance estimation operation, which further uses (1-1)-st specific reference object height information acquired through V2X communications between the subject vehicle and one or more (1-1)-st specific reference objects among the reference objects, to (1-1)-st specific reference bounding boxes, and (ii) one or more (1-2)-nd specific image-based reference distances, among the image-based reference distances, are generated by applying at least one (1-2)-nd specific image distance estimation operation, which further uses information on a reference point in the reference image, to (1-2)-nd specific reference bounding boxes,\n * wherein the (1-1)-st specific image distance estimation operation is performed by using the following formula: D = f × VH / h\n * wherein D denotes one of the (1-1)-st specific image-based reference distances, f denotes a focal length of
the camera, VH denotes a piece of the (1-1)-st specific reference object height information corresponding to one of the (1-1)-st specific reference objects, and h denotes an apparent height of said one of the (1-1)-st specific reference objects on the reference image,\n * wherein the (1-2)-nd specific image distance estimation operation is performed by using the following formula: D = f × H / |b - cy|\n * wherein D denotes one of the (1-2)-nd specific image-based reference distances, f denotes a focal length of the camera, H denotes a height of the camera, and |b - cy| denotes an apparent distance, on the reference image, between the reference point and a lower boundary of one of the (1-2)-nd specific reference bounding boxes corresponding to one of (1-2)-nd specific reference objects, wherein the reference point is a center point of the reference image.\n | 10. The device of Claim 9, wherein, at the process of (III), the processor generates the distance error value by using the following formula: ErrorD = Σ_{k=1}^{N} wk (Dik - Dck)^2 wherein Dik denotes one of the image-based reference distances corresponding to a k-th reference object among the reference objects, Dck denotes one of the coordinate-based reference distances corresponding to the k-th reference object, wk denotes a weight for the k-th object, and N denotes the number of the reference objects.\n | 11.
The device of Claim 9, wherein (i) one or more (2-1)-st specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-1)-st specific reference object location information acquired through V2X communications between the subject vehicle and one or more (2-1)-st specific reference objects among the reference objects, and (ii) one or more (2-2)-nd specific coordinate-based reference distances, among the coordinate-based reference distances, are generated by referring to the subject location information and (2-2)-nd specific reference object location information which has been acquired from a database.", "output": "The method involves a computing device detecting reference objects of the reference image (1) if at least one reference image is acquired through at least one camera on the subject vehicle which interworks with the subject vehicle, by applying at least one object detection operation to the reference image. The computing device calculates image-based reference distances (2) between each of the reference objects and the subject vehicle by referring to information on each of the reference bounding boxes, corresponding to each of the reference objects, on the reference image. The computing device generates a distance error value by referring to the image-based reference distances and the coordinate-based reference distances, once each of the one or more coordinate-based reference distances between each of the reference objects and the subject vehicle is acquired. An INDEPENDENT CLAIM is included for a device for calculating exact location of a subject vehicle by using information on relative distances. Method for calculating exact location of a subject vehicle by using information on relative distances. Can also be used for ego-vehicle localization by using the vehicle-to-everything (V2X) information fusion.
A method calculates a more precise location of the ego-vehicle which allows information acquired by the ego-vehicle to be mapped onto the high definition (HD) map more correctly. The drawing shows a flowchart of the method for the ego-vehicle localization by using the V2X information fusion. 1Computing device detecting reference objects in the reference image2Computing device calculating image-based reference distances3Computing device calibrating subject location information", "instruction": "Please summarize the input" }, { "input": "LEARNING METHOD AND LEARNING DEVICE FOR DETERMINING WHETHER TO SWITCH MODE OF VEHICLE FROM MANUAL DRIVING MODE TO AUTONOMOUS DRIVING MODE BY PERFORMING TRAJECTORY-BASED BEHAVIOR ANALYSIS ON RECENT DRIVING ROUTEA learning method for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver is provided. And the method includes steps of: (a) a learning device, on condition that a status vector and a trajectory vector are acquired, performing processes of (i) instructing a status network to generate a status feature map and (ii) instructing a trajectory network to generate a trajectory feature map; (b) the learning device instructing a safety network to calculate a predicted collision probability representing a predicted probability of an accident occurrence; and (c) the learning device instructing a loss layer to generate a loss by referring to the predicted collision probability and a GT collision probability, which have been acquired beforehand, to learn at least part of parameters.|1. 
A learning method for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver with regard to a circumstance of its corresponding time range, comprising steps of: \n * (a) a learning device, on condition that (i) at least one status vector, corresponding to at least one piece of circumstance information for verification including at least part of (i-1) at least one piece of subject motion information on at least one subject vehicle and (i-2) one or more pieces of surrounding motion information on at least part of one or more surrounding objects located closer than a threshold from the subject vehicle, in a subject time range for verification from a first timing to a T-th timing, and (ii) at least one trajectory vector, corresponding to at least one piece of route information for verification on at least one driving route driven by the subject vehicle in the subject time range for verification, are acquired, performing processes of (i) instructing (S01) a status network to apply at least one first neural network operation to the status vector, to thereby generate at least one status feature map and (ii) instructing (S02) a trajectory network to apply at least one second neural network operation to the trajectory vector, to thereby generate at least one trajectory feature map;\n * (b) the learning device, if at least one concatenated feature map corresponding to the status feature map and the trajectory feature map is acquired, instructing (S03) a safety network to apply at least one third neural network operation to the concatenated feature map, to thereby calculate at least one predicted collision probability representing a predicted probability of an occurrence of at least one accident caused by the driving route indicated by the route information for verification with regard to a circumstance indicated by the 
circumstance information for verification, wherein the predicted collision probability is used for determining whether it is appropriate or not to switch driving modes of the subject vehicle from a manual driving mode to an autonomous driving mode; and\n * (c) the learning device instructing (S04) a loss layer to generate at least one loss by referring to the predicted collision probability and at least one Ground-Truth, GT, collision probability, which have been acquired beforehand, and to perform backpropagation by using the loss, to thereby learn at least part of parameters of the safety network, the trajectory network and the status network, wherein, at the step of (b), the learning device instructs (i) at least one concatenating layer to generate the concatenated feature map by concatenating the status feature map and the trajectory feature map, (ii) at least one third convolutional layer of the safety network to generate at least one (3-1)-st feature map by applying at least one third convolutional operation to the concatenated feature map, (iii) at least one third pooling layer of the safety network to generate at least one (3-2)-nd feature map by applying at least one third pooling operation to the (3-1)-st feature map, and (iv) at least one third Fully-Connected, FC, layer to generate the predicted collision probability by applying at least one third FC operation to the (3-2)-nd feature map.\n | 2. 
The method of Claim 1, wherein, at the step of (a), the learning device instructs (i) at least one first convolutional layer of the status network to generate at least one (1-1)-st feature map by applying at least one first convolutional operation to the status vector, (ii) at least one first pooling layer of the status network to generate at least one (1-2)-nd feature map by applying at least one first pooling operation to the (1-1)-st feature map, and (iii) at least one first Fully-Connected, FC, layer to generate the status feature map by applying at least one first FC operation to the (1-2)-nd feature map.\n | 3. The method of Claim 1, wherein, at the step of (a), the learning device instructs (i) at least one second convolutional layer of the trajectory network to generate at least one (2-1)-st feature map by applying at least one second convolutional operation to the trajectory vector, (ii) at least one second pooling layer of the trajectory network to generate at least one (2-2)-nd feature map by applying at least one second pooling operation to the (2-1)-st feature map, and (iii) at least one second Fully-Connected, FC, layer to generate the trajectory feature map by applying at least one second FC operation to the (2-2)-nd feature map.\n | 4. 
The method of any one of Claims 1 to 3, before the step of (a), further comprising a step of:\n (a0) the learning device communicating with at least one basement server interworking with the subject vehicle, to perform processes of (i) generating the status vector by using the circumstance information for verification, including (i-1) at least one piece of subject location information of the subject vehicle, (i-2) at least one piece of subject velocity information thereof, (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, corresponding to the subject time range for verification, which have been acquired from the basement server, (ii) generating the trajectory vector by using the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired from the basement server, and (iii) acquiring the GT collision probability by using at least one piece of accident information on whether the subject vehicle has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired from the basement server, wherein K is at least one arbitrary integer. \n | 5. 
The method of any one of Claims 1 to 4, before the step of (a), further comprising a step of:\n (a1) the learning device performs processes of (i) generating the status vector by using the circumstance information for verification corresponding to the subject time range for verification, including (i-1) at least one piece of subject location information of the subject vehicle and (i-2) at least one piece of subject velocity information thereof, which have been acquired by referring to at least one piece of driving record information of the subject vehicle, and (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, which have been acquired by referring to at least one driving video recorded through at least one subject camera on the subject vehicle during the subject time range for verification, (ii) generating the trajectory vector by referring to the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired by referring to the driving record information, and (iii) acquiring the GT collision probability by using accident information on whether the subject vehicle has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired by referring to the driving record information, wherein K is at least one arbitrary integer. \n | 6.
The method of any one of Claims 1 to 5, before the step of (a), further comprising a step of:\n (a2) the learning device performs processes of (i) generating the status vector by using the circumstance information for verification corresponding to the subject time range for verification, including (i-1) at least one piece of subject location information of the subject vehicle and (i-2) at least one piece of subject velocity information thereof, which have been acquired by referring to at least one piece of driving record information of the subject vehicle, and (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, which have been acquired by using at least one V2X communication module installed to the subject vehicle, to be used for communicating with said at least part of the surrounding vehicles, (ii) generating the trajectory vector by referring to the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired by referring to the driving record information, and (iii) acquiring the GT collision probability by using accident information on whether the subject vehicle has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired by referring to the driving record information, wherein K is at least one arbitrary integer. \n | 7. 
The method of any one of Claims 1 to 6, before the step of (a), further comprising a step of:\n (a3) the learning device communicating with at least one simulating device simulating at least one virtual world including the subject vehicle and the surrounding objects, to perform processes of (i) generating the status vector by using the circumstance information for verification, including (i-1) at least one piece of subject location information of the subject vehicle, (i-2) at least one piece of subject velocity information thereof, (i-3) at least one piece of surrounding location information of at least part of surrounding vehicles among the surrounding objects and (i-4) at least one piece of surrounding velocity information thereof, corresponding to the subject time range for verification, which have been acquired from the simulating device, (ii) generating the trajectory vector by referring to the route information for verification, corresponding to the driving route of the subject vehicle on a region map for verification during the subject time range for verification, which has been acquired from the simulating device, and (iii) acquiring the GT collision probability by using at least one piece of accident information on whether the subject vehicle has had at least one accident in an attention time range from a (T+1)-th timing to a (T+K)-th timing or not, which has been acquired from the simulating device, wherein K is at least one arbitrary integer. \n | 8. 
The method of any one of Claims 1 to 7, wherein the subject motion information includes at least part of (i-1) at least one piece of subject location information of the subject vehicle, (i-2) at least one piece of subject velocity information thereof, and (i-3) at least one piece of subject acceleration information thereof, and\n wherein the surrounding motion information includes at least part of (ii-1) at least one piece of surrounding location information of at least part of the surrounding objects, (ii-2) at least one piece of surrounding velocity information thereof, and (ii-3) at least one piece of surrounding acceleration information thereof. \n | 9. A testing method for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver with regard to a circumstance of its corresponding time range, comprising steps of: \n * (a) on condition that (1) a learning device, if (i) at least one status vector for training, corresponding to at least one piece of circumstance information for verification for training including at least part of (i-1) at least one piece of subject motion information for training on at least one subject vehicle for training and (i-2) one or more pieces of surrounding motion information for training on at least part of one or more surrounding objects for training located closer than a threshold from the subject vehicle for training, in a subject time range for verification for training from a first timing to a T-th timing, and (ii) at least one trajectory vector for training, corresponding to at least one piece of route information for verification for training on at least one driving route for training driven by the subject vehicle for training in the subject time range for verification for training, have been acquired, has performed processes of (i) instructing a status network to apply at 
least one first neural network operation to the status vector for training, to thereby generate at least one status feature map for training and (ii) instructing a trajectory network to apply at least one second neural network operation to the trajectory vector for training, to thereby generate at least one trajectory feature map for training; (2) the learning device, if at least one concatenated feature map for training corresponding to the status feature map for training and the trajectory feature map for training has been acquired, has instructed a safety network to apply at least one third neural network operation to the concatenated feature map for training, to thereby calculate at least one predicted collision probability for training representing a predicted probability for training of an occurrence of at least one accident for training caused by the driving route for training indicated by the route information for verification for training with regard to a circumstance for training indicated by the circumstance information for verification for training, wherein the predicted collision probability is used for determining whether it is appropriate or not to switch driving modes of the subject vehicle from a manual driving mode to an autonomous driving mode; and (3) the learning device has instructed a loss layer to generate at least one loss by referring to the predicted collision probability for training and at least one Ground-Truth, GT, collision probability, which have been acquired beforehand, and to perform backpropagation by using the loss, to thereby learn at least part of parameters of the safety network, the trajectory network and the status network, wherein, the learning device instructs (i) at least one concatenating layer to generate the concatenated feature map for training by concatenating the status feature map and the trajectory feature map, (ii) at least one third convolutional layer of the safety network to generate at least one (3-1)-st 
feature map for training by applying at least one third convolutional operation to the concatenated feature map, (iii) at least one third pooling layer of the safety network to generate at least one (3-2)-nd feature map for training by applying at least one third pooling operation to the (3-1)-st feature map, and (iv) at least one third Fully-Connected, FC, layer to generate the predicted collision probability for training by applying at least one third FC operation to the (3-2)-nd feature map, a testing device, if (i) at least one status vector for testing, corresponding to at least one piece of test circumstance information for verification including at least part of (i-1) at least one piece of subject motion information for testing on at least one subject vehicle for testing and (i-2) one or more pieces of surrounding motion information for testing on at least part of one or more surrounding objects for testing located closer than the threshold from the subject vehicle for testing, in a test subject time range for verification from a 1'-st timing to a T'-th timing, and (ii) at least one trajectory vector for testing, corresponding to at least one piece of test route information for verification on at least one driving route for testing driven by the subject vehicle for testing in the test subject time range for verification, have been acquired, performing processes of (i) instructing the status network to apply said at least one first neural network operation to the status vector for testing, to thereby generate at least one status feature map for testing and (ii) instructing the trajectory network to apply said at least one second neural network operation to the trajectory vector for testing, to thereby generate at least one trajectory feature map for testing;\n * (b) the testing device, if at least one concatenated feature map for testing corresponding to the status feature map for testing and the trajectory feature map for testing has been acquired, 
instructing the safety network to apply said at least one third neural network operation to the concatenated feature map for testing, to thereby calculate at least one predicted collision probability for testing representing a predicted probability for testing of an occurrence of at least one accident for testing caused by the driving route for testing indicated by the test route information for verification with regard to a circumstance for testing indicated by the test circumstance information for verification.\n | 10. The method of Claim 9, wherein, at the step of (a), the testing device communicates with at least one basement server for testing interworking with the subject vehicle for testing, to perform processes of (i) generating the status vector for testing by using the test circumstance information for verification, including (i-1) at least one piece of subject location information for testing of the subject vehicle for testing, (i-2) at least one piece of subject velocity information for testing thereof, (i-3) at least one piece of surrounding location information for testing of at least part of surrounding vehicles for testing among the surrounding objects for testing and (i-4) at least one piece of surrounding velocity information for testing thereof, corresponding to the test subject time range for verification, which have been acquired from the basement server for testing, and (ii) generating the trajectory vector for testing by referring to the test route information for verification, corresponding to the driving route for testing of the subject vehicle for testing on a test region map for verification during the test subject time range for verification, which has been acquired from the basement server for testing.\n | 11. 
The method of Claim 9 or 10, wherein, at the step of (a), the testing device performs processes of (i) generating the status vector for testing by using the test circumstance information for verification corresponding to the test subject time range for verification, including (i-1) at least one piece of subject location information for testing of the subject vehicle for testing and (i-2) at least one piece of subject velocity information for testing thereof, which have been acquired from at least one of a GPS for testing and a velocity control unit for testing included in the subject vehicle for testing, and (i-3) at least one piece of surrounding location information for testing of at least part of surrounding vehicles for testing among the surrounding objects for testing and (i-4) at least one piece of surrounding velocity information for testing thereof, which have been acquired by referring to at least one driving video for testing recorded through at least one subject camera for testing on the subject vehicle for testing during the test subject time range for verification, and (ii) generating the trajectory vector for testing by referring to the test route information for verification, corresponding to the driving route for testing of the subject vehicle for testing on a test region map for verification during the test subject time range for verification, which has been acquired from a planning unit for testing included in the subject vehicle for testing.\n | 12. 
The method of Claim 9, 10 or 11, wherein, at the step of (a), the testing device performs processes of (i) generating the status vector for testing by using the test circumstance information for verification corresponding to the test subject time range for verification, including (i-1) at least one piece of subject location information for testing of the subject vehicle for testing and (i-2) at least one piece of subject velocity information for testing thereof, which have been acquired from at least one of a GPS for testing and a velocity control unit for testing included in the subject vehicle for testing, and (i-3) at least one piece of surrounding location information for testing of at least part of surrounding vehicles for testing among the surrounding objects for testing and (i-4) at least one piece of surrounding velocity information for testing thereof, which have been acquired by using a V2X communication module for testing included in the subject vehicle for testing during the test subject time range for verification, and (ii) generating the trajectory vector for testing by referring to the test route information for verification, corresponding to the driving route for testing of the subject vehicle for testing on a test region map for verification during the test subject time range for verification, which has been acquired from a planning unit for testing included in the subject vehicle for testing.\n | 13. The method of any one of Claims 9 to 12, further comprising a step of:\n (c) the testing device, if the predicted collision probability for testing is larger than a threshold and a driving mode of the subject vehicle for testing corresponds to a manual driving mode, instructing the subject vehicle for testing to switch its driving mode to an autonomous driving mode. \n | 14.
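The mode-switch condition of Claim 13 reduces to a small decision rule. A minimal Python sketch; the function name, the mode strings, and the 0.5 default threshold are illustrative assumptions, since the claim fixes none of them:

```python
def next_driving_mode(predicted_collision_prob: float,
                      current_mode: str,
                      threshold: float = 0.5) -> str:
    """Claim 13 as a rule: if the predicted collision probability for testing
    exceeds a threshold while the subject vehicle is in manual driving mode,
    instruct it to switch to autonomous driving mode; otherwise keep the mode."""
    if predicted_collision_prob > threshold and current_mode == "manual":
        return "autonomous"
    return current_mode
```

For example, a vehicle in manual mode with a 0.9 predicted collision probability would be switched to autonomous mode, while one already in autonomous mode is left unchanged.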
A learning device for calculating collision probability, to be used for determining whether it is appropriate or not to switch driving modes of a vehicle capable of an autonomous driving, by analyzing a recent driving route of a driver with regard to a circumstance of its corresponding time range, comprising: \n * at least one memory (115) that stores instructions; and\n * at least one processor (120) configured to execute the instructions to perform processes of: (I) on condition that (i) at least one status vector, corresponding to at least one piece of circumstance information for verification including at least part of (i-1) at least one piece of subject motion information on at least one subject vehicle and (i-2) one or more pieces of surrounding motion information on at least part of one or more surrounding objects located closer than a threshold from the subject vehicle, in a subject time range for verification from a first timing to a T-th timing, and (ii) at least one trajectory vector, corresponding to at least one piece of route information for verification on at least one driving route driven by the subject vehicle in the subject time range for verification, are acquired, performing processes of (i) instructing a status network (130) to apply at least one first neural network operation to the status vector, to thereby generate at least one status feature map and (ii) instructing a trajectory network (140) to apply at least one second neural network operation to the trajectory vector, to thereby generate at least one trajectory feature map; (II) if at least one concatenated feature map corresponding to the status feature map and the trajectory feature map is acquired, instructing a safety network (150) to apply at least one third neural network operation to the concatenated feature map, to thereby calculate at least one predicted collision probability representing a predicted probability of an occurrence of at least one accident caused by the driving 
route indicated by the route information for verification with regard to a circumstance indicated by the circumstance information for verification, wherein the predicted collision probability is used for determining whether it is appropriate or not to switch driving modes of the subject vehicle from a manual driving mode to an autonomous driving mode, wherein, the processor (120) is further configured to instruct (i) at least one concatenating layer to generate the concatenated feature map by concatenating the status feature map and the trajectory feature map, (ii) at least one third convolutional layer (152) of the safety network (150) to generate at least one (3-1)-st feature map by applying at least one third convolutional operation to the concatenated feature map, (iii) at least one third pooling layer (153) of the safety network (150) to generate at least one (3-2)-nd feature map by applying at least one third pooling operation to the (3-1)-st feature map, and (iv) at least one third Fully-Connected, FC, layer (154) to generate the predicted collision probability by applying at least one third FC operation to the (3-2)-nd feature map; and (III) instructing a loss layer (160) to generate at least one loss by referring to the predicted collision probability and at least one Ground-Truth, GT, collision probability, which have been acquired beforehand, and to perform backpropagation by using the loss, to thereby learn at least part of parameters of the safety network, the trajectory network and the status network.", "output": "The learning method for calculating collision probability and switch manual mode to autonomous mode in vehicle involves Instructing (S01) status network to apply first neural network operation to status vector to generate status feature map. The trajectory network is instructed (S02) to apply second neural network operation to trajectory vector and generate trajectory feature map. 
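The learning pipeline in the claims above (status and trajectory networks producing feature maps, a concatenating layer, a safety network emitting a predicted collision probability, and a loss layer driving backpropagation) can be shown in miniature. This is a pure-Python toy, not the patented architecture: each network is collapsed to a single linear layer, the convolution and pooling layers are omitted, backpropagation is shown only for the safety weights, and every number is invented for illustration:

```python
import math

def linear(vec, weights):
    # one "neural network operation", collapsed to a single linear layer:
    # each weight row yields one entry of the output feature map
    return [sum(w * x for w, x in zip(row, vec)) for row in weights]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(status_vec, traj_vec, w_status, w_traj, w_safety):
    status_fm = linear(status_vec, w_status)  # status network
    traj_fm = linear(traj_vec, w_traj)        # trajectory network
    concat_fm = status_fm + traj_fm           # concatenating layer
    # safety network collapsed to one FC layer plus a sigmoid, so the
    # output is a predicted collision probability in (0, 1)
    prob = sigmoid(sum(w * f for w, f in zip(w_safety, concat_fm)))
    return prob, concat_fm

def train_step(status_vec, traj_vec, w_status, w_traj, w_safety, gt_prob, lr=0.5):
    # loss layer: squared error against the ground-truth collision
    # probability; backpropagation updates only the safety weights here
    prob, concat_fm = forward(status_vec, traj_vec, w_status, w_traj, w_safety)
    grad = 2.0 * (prob - gt_prob) * prob * (1.0 - prob)  # dLoss/d(pre-sigmoid)
    for i, f in enumerate(concat_fm):
        w_safety[i] -= lr * grad * f
    return (prob - gt_prob) ** 2

# toy training run against a ground-truth collision probability of 1.0
w_status = [[0.3, -0.1], [0.2, 0.4]]
w_traj = [[0.1, 0.2], [-0.3, 0.5]]
w_safety = [0.1, 0.1, 0.1, 0.1]
losses = [train_step([1.0, 0.5], [0.2, 0.8], w_status, w_traj, w_safety, gt_prob=1.0)
          for _ in range(100)]
```

Repeated `train_step` calls drive the loss down, which is all the loss layer of the claim is doing at a larger scale.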
The safety network is instructed (S03) to calculate a predicted collision probability by applying the third neural network operation to the concatenated feature map. The loss layer is instructed (S04) to generate a loss and perform backpropagation by using the loss, to thereby learn parameters of the safety network, the trajectory network and the status network. An INDEPENDENT CLAIM is included for: a learning device for calculating collision probability. Learning method in an autonomous vehicle for automatic switching of driving modes based on behavior analysis and driving style of the user on a recent driving route using a vehicle-to-everything (V2X) communication network. The method calculates a collision probability by analyzing a recent driving route of a driver, to determine whether the driver is driving dangerously, and switches manual mode to autonomous driving mode. Increases the safety of the driver and reduces collision dangers to surrounding vehicles. The drawing shows a flow-chart of a method for calculating collision probability, to be used for determining whether it is appropriate to switch driving modes to autonomous by analyzing the recent driving route of the driver. S01Instructing status network to generate status feature mapS02Instructing trajectory network to generate trajectory feature mapS03Instructing safety network to calculate predicted collision probability from concatenated feature mapS04Instructing loss layer to generate loss and perform backpropagation", "instruction": "Please summarize the input" }, { "input": "METHOD AND DEVICE FOR INTER-VEHICLE COMMUNICATION VIA RADAR SYSTEMA method for a V2V communication by using a radar module used for detecting objects nearby is provided.
And the method includes steps of: (a) a computing device (100) performing (i) a process of instructing the radar module (130) to transmit 1-st transmitting signals by referring to at least one 1-st schedule and (ii) a process of generating RVA information by using (1-1)-st receiving signals, corresponding to the 1-st transmitting signals; and (b) the computing device performing a process of instructing the radar module to transmit 2-nd transmitting signals by referring to at least one 2-nd schedule.|1. A method for a V2V communication by using a radar module used for detecting objects nearby, comprising steps of: (a) a computing device, if a 1-st trigger that a 1-st timing corresponding to a current time is included in a 1-st time slot is detected, performing (i) a process of instructing the radar module to transmit one or more 1-st transmitting signals, to be used for acquiring RVA information on at least part of one or more head directions, one or more relative locations and one or more relative velocities of at least part of one or more 1-st surrounding objects including one or more 1-st surrounding vehicles located closer than a first threshold from a subject vehicle interworking with the computing device at the 1-st timing, by referring to at least one 1-st schedule and (ii) a process of generating the RVA information by using one or more (1-1)-st receiving signals, corresponding to the 1-st transmitting signals, acquired through the radar module; and (b) the computing device, if a 2-nd trigger that a 2-nd timing corresponding to a later time from the 1-st timing is included in a 2-nd time slot is detected, performing a process of instructing the radar module to transmit one or more 2-nd transmitting signals by referring to at least one 2-nd schedule, corresponding to V2V transmitting information on the subject vehicle.\n | 2. 
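Claim 1 alternates between a 1-st time slot (transmit sensing chirps whose reflections yield RVA information) and a 2-nd time slot (transmit V2V data signals). The trigger logic can be sketched as a scheduler; the even/odd slot assignment and the 50 ms slot length are assumptions, since the claim leaves the schedule itself open:

```python
def radar_action(t: float, slot_len: float = 0.05) -> str:
    """Return what the radar module should do at time t: transmit 1-st
    transmitting signals and build RVA information in a 1-st time slot,
    or transmit 2-nd transmitting signals carrying V2V transmitting
    information in a 2-nd time slot."""
    slot_index = int(t // slot_len)
    return "sense_rva" if slot_index % 2 == 0 else "transmit_v2v"
```

With a 50 ms slot length, t = 0.0 s falls in a 1-st time slot and t = 0.07 s in a 2-nd time slot, so the same radar hardware interleaves sensing and communication.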
The method of Claim 1, wherein, at the step of (a), the computing device, if one or more (1-2)-nd receiving signals are acquired through the radar module from at least part of the 1-st surrounding objects, further performs a process of generating 1-st V2V receiving information by referring to the (1-2)-nd receiving signals in parallel with the other processes performed during the step of (a), and wherein, at the step of (b), the computing device, if one or more 2-nd receiving signals are acquired through the radar module from at least part of 2-nd surrounding vehicles which are located closer than the first threshold from the subject vehicle at the 2-nd timing, performs a process of generating 2-nd V2V receiving information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd receiving signals in parallel with the other process performed during the step of (b).\n | 3. The method of Claim 2, wherein, at the step of (a), the computing device (i) detects at least one starting signal and at least one ending signal among the (1-2)-nd receiving signals by referring to a reference length included in a communication rule of the V2V communication, (ii) generates at least one meta data permutation including at least part of the (1-2)-nd receiving signals between the starting signal and the ending signal, and then (iii) generates the 1-st V2V receiving information by referring to the meta data permutation.\n | 4. The method of Claim 3, wherein, at the step of (a), the computing device, if a time gap between receiving timings of its two inputted receiving signals is smaller than a second threshold and thus if the inputted receiving signals are not determined as being included in the (1-1)-st receiving signals or as being included in the (1-2)-nd receiving signals, instructs the radar module to continuously receive receiving signals until the meta data permutation is generated.\n | 5. 
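Claims 3 and 4 describe collecting the (1-2)-nd receiving signals between a starting signal and an ending signal into a meta data permutation, and continuing to receive until a complete permutation exists. A sketch with the marker detection abstracted into sentinel values; the communication rule's reference length, which actually identifies those signals, is not modeled here:

```python
def extract_meta_permutation(signals, start_marker="START", end_marker="END"):
    """Collect the receiving signals strictly between the starting signal
    and the ending signal into a meta data permutation. Return None when
    no complete permutation has arrived yet, so the caller keeps receiving
    (the behavior required by claim 4)."""
    try:
        s = signals.index(start_marker)
        e = signals.index(end_marker, s + 1)
    except ValueError:
        return None  # start or end signal not seen yet
    return signals[s + 1:e]
```

A buffer that has seen the start signal but not yet the end signal yields None, matching the "continuously receive" branch of claim 4.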
The method of Claim 3, wherein, at the step of (a), the computing device compares the meta data permutation and each of one or more reference data permutations corresponding to each of driving circumstances, to find a specific reference data permutation whose similarity score with the meta data permutation is larger than a third threshold, to thereby generate the 1-st V2V receiving information by referring to information on a specific driving circumstance corresponding to the specific reference data permutation.\n | 6. The method of Claim 2, wherein, at the step of (a), the computing device generates 1-st circumstance information on at least part of the 1-st surrounding vehicles by referring to the 1-st V2V receiving information, to thereby support an autonomous driving of the subject vehicle by referring to the 1-st circumstance information, and wherein, at the step of (b), the computing device generates 2-nd circumstance information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd V2V receiving information, to thereby support the autonomous driving of the subject vehicle by referring to the 2-nd circumstance information, wherein (2-1)-st circumstance information among the 2-nd circumstance information is acquired by updating at least part of the 1-st circumstance information on one or more specific vehicles included in both of the 1-st surrounding vehicles and the 2-nd surrounding vehicles, using at least part of the 2-nd V2V receiving information thereon, and wherein (2-2)-nd circumstance information on other vehicles among the 2-nd surrounding vehicles other than the specific vehicles is acquired from another part of the 2-nd V2V receiving information.\n | 7.
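Claim 5's matching of the meta data permutation against per-circumstance reference data permutations can be sketched as follows. Position-wise equality stands in for the unspecified similarity score, and the labels, data, and 0.7 threshold are invented for illustration:

```python
def match_reference(meta, references, threshold=0.7):
    """Compare the meta data permutation to each reference data permutation
    and return the driving-circumstance label of the best match, provided
    its similarity score beats the threshold; otherwise return None."""
    def score(a, b):
        # fraction of positions that agree; a placeholder similarity score
        if not a or len(a) != len(b):
            return 0.0
        return sum(x == y for x, y in zip(a, b)) / len(a)
    label, best = max(((lab, score(meta, ref)) for lab, ref in references.items()),
                      key=lambda pair: pair[1])
    return label if best > threshold else None
```

A permutation close to a stored "hard braking" pattern resolves to that circumstance; one equidistant from all references resolves to nothing, so no 1-st V2V receiving information is generated from it.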
The method of Claim 2, wherein, at the step of (b), the computing device transmits the 2-nd transmitting signals with its transmitting timings determined by referring to the 2-nd schedule to thereby deliver the V2V transmitting information to at least part of the 2-nd surrounding vehicles, wherein the 2-nd schedule has been acquired by referring to a specific reference data permutation corresponding to the V2V transmitting information among each of one or more reference data permutations for each of driving circumstances and a reference length included in a communication rule of the V2V communication.\n | 8. The method of Claim 2, wherein, at the step of (a), the computing device, if a time gap between receiving timings of its two inputted receiving signals is larger than or same as a second threshold, (i) generates Intermediate Frequency(IF) signals between one of the 1-st transmitting signals and said inputted receiving signals, (ii) determines whether each of center frequencies of each of the IF signals is included in a 1-st frequency range or is included in a 2-nd frequency range, to thereby determine whether each of said inputted receiving signals is included in the (1-1)-st receiving signals or the (1-2)-nd receiving signals.\n | 9. The method of Claim 2, wherein the computing device uses a frequency interference prevention filter to acquire the (1-1)-st receiving signals and the (1-2)-nd receiving signals at a timing included in the 1-st time slot, and wherein the computing device does not use the frequency interference prevention filter to acquire the 2-nd receiving signals at a timing included in the 2-nd time slot.\n | 10. 
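Claim 8 distinguishes the radar's own reflections from other radars' V2V signals by checking which frequency range the center frequency of the Intermediate Frequency signal falls into. A sketch with illustrative band edges, since the claim gives no concrete ranges:

```python
def classify_receiving_signal(if_center_freq_hz: float,
                              reflected_band=(0.0, 5e5),
                              v2v_band=(5e5, 1e6)) -> str:
    """Decide from the IF center frequency whether an inputted receiving
    signal is a (1-1)-st signal (the radar's own reflection) or a (1-2)-nd
    signal (another radar's V2V transmission). Band edges are placeholders."""
    lo, hi = reflected_band
    if lo <= if_center_freq_hz < hi:
        return "(1-1)-st"
    lo, hi = v2v_band
    if lo <= if_center_freq_hz < hi:
        return "(1-2)-nd"
    return "unclassified"
```

Under these assumed bands, a 100 kHz IF tone is treated as a reflection and a 700 kHz tone as incoming V2V data.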
The method of Claim 1, wherein, at the step of (a), as the computing device instructs the radar module to transmit the 1-st transmitting signals built as chirp signals, the computing device (i) acquires the (1-1)-st receiving signals, which are reflected signals of the 1-st transmitting signals, through the radar module, (ii) generates each of one or more Intermediate Frequency(IF) signals between each of the 1-st transmitting signals and each of the (1-1)-st receiving signals, and (iii) generates the RVA information by applying a Fourier transform to the IF signals.\n | 11. A computing device for a V2V communication by using a radar module used for detecting objects nearby, comprising: at least one memory that stores instructions; and at least one processor configured to execute the instructions to perform processes of: (I) if a 1-st trigger that a 1-st timing corresponding to a current time is included in a 1-st time slot is detected, (i) instructing the radar module to transmit one or more 1-st transmitting signals, to be used for acquiring RVA information on at least part of one or more head directions, one or more relative locations and one or more relative velocities of at least part of one or more 1-st surrounding objects including one or more 1-st surrounding vehicles located closer than a first threshold from a subject vehicle interworking with the computing device at the 1-st timing, by referring to at least one 1-st schedule and (ii) generating the RVA information by using one or more (1-1)-st receiving signals, corresponding to the 1-st transmitting signals, acquired through the radar module; and (II) if a 2-nd trigger that a 2-nd timing corresponding to a later time from the 1-st timing is included in a 2-nd time slot is detected, instructing the radar module to transmit one or more 2-nd transmitting signals by referring to at least one 2-nd schedule, corresponding to V2V transmitting information on the subject vehicle.\n | 12. 
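Claim 10's chirp-based sensing (form the IF signal between a transmitted chirp and its reflection, apply a Fourier transform, and read range information off the beat frequency) can be demonstrated on a synthetic tone. The radar parameters are invented for the example, a plain DFT stands in for a real FFT, and only the range part of the RVA information is shown:

```python
import cmath
import math

C = 3e8  # speed of light, m/s

def beat_to_range(samples, fs, bandwidth, chirp_time):
    """Fourier-transform the IF (beat) signal and map the peak beat
    frequency to a target range via R = f_beat * c * T / (2 * B)."""
    n = len(samples)
    # plain O(n^2) DFT for clarity; a real implementation would use an FFT
    mags = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]
    k_peak = max(range(1, n // 2), key=lambda k: mags[k])  # skip the DC bin
    f_beat = k_peak * fs / n
    return f_beat * C * chirp_time / (2 * bandwidth)

# synthetic IF tone for a target at 30 m; all radar parameters are invented
FS, BW, T_CHIRP = 128_000, 150e6, 1e-3
F_BEAT = 2 * 30.0 * BW / (C * T_CHIRP)  # 30 kHz beat frequency
SAMPLES = [math.sin(2 * math.pi * F_BEAT * t / FS) for t in range(128)]
```

Running `beat_to_range(SAMPLES, FS, BW, T_CHIRP)` recovers the 30 m range from the DFT peak; velocity and angle would come from further transforms across chirps and antennas, which this sketch omits.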
The computing device of Claim 11, wherein, at the process of (I), the processor, if one or more (1-2)-nd receiving signals are acquired through the radar module from at least part of the 1-st surrounding objects, further performs a process of generating 1-st V2V receiving information by referring to the (1-2)-nd receiving signals in parallel with the process of (I), and wherein, at the process of (II), the processor, if one or more 2-nd receiving signals are acquired through the radar module from at least part of 2-nd surrounding vehicles which are located closer than the first threshold from the subject vehicle at the 2-nd timing, performs a process of generating 2-nd V2V receiving information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd receiving signals in parallel with the process of (II).\n | 13. The computing device of Claim 12, wherein, at the process of (I), the processor (i) detects at least one starting signal and at least one ending signal among the (1-2)-nd receiving signals by referring to a reference length included in a communication rule of the V2V communication, (ii) generates at least one meta data permutation including at least part of the (1-2)-nd receiving signals between the starting signal and the ending signal, and then (iii) generates the 1-st V2V receiving information by referring to the meta data permutation.\n | 14. The computing device of Claim 13, wherein, at the process of (I), the processor, if a time gap between receiving timings of its two inputted receiving signals is smaller than a second threshold so that the inputted receiving signals cannot be distinguished as being included in the (1-1)-st receiving signals or the (1-2)-nd receiving signals, instructs the radar module to continuously receive receiving signals until the meta data permutation is generated.\n | 15. 
The computing device of Claim 13, wherein, at the process of (I), the processor compares the meta data permutation and each of one or more reference data permutations corresponding to each of driving circumstances, to find a specific reference data permutation whose similarity score with the meta data permutation is larger than a third threshold, to thereby generate the 1-st V2V receiving information by referring to information on a specific driving circumstance corresponding to the specific reference data permutation.\n | 16. The computing device of Claim 12, wherein, at the process of (I), the processor generates 1-st circumstance information on at least part of the 1-st surrounding vehicles by referring to the 1-st V2V receiving information, to thereby support an autonomous driving of the subject vehicle by referring to the 1-st circumstance information, and wherein, at the process of (II), the processor generates 2-nd circumstance information on at least part of the 2-nd surrounding vehicles by referring to the 2-nd V2V receiving information, to thereby support the autonomous driving of the subject vehicle by referring to the 2-nd circumstance information, wherein (2-1)-st circumstance information among the 2-nd circumstance information is acquired by updating at least part of the 1-st circumstance information on one or more specific vehicles included in both of the 1-st surrounding vehicles and the 2-nd surrounding vehicles, using at least part of the 2-nd V2V receiving information thereon, and wherein (2-2)-nd circumstance information on other vehicles among the 2-nd surrounding vehicles other than the specific vehicles is acquired from another part of the 2-nd V2V receiving information.\n | 17.
The computing device of Claim 12, wherein, at the process of (II), the processor transmits the 2-nd transmitting signals with its transmitting timings determined by referring to the 2-nd schedule to thereby deliver the V2V transmitting information to at least part of the 2-nd surrounding vehicles, wherein the 2-nd schedule has been acquired by referring to a specific reference data permutation corresponding to the V2V transmitting information among each of one or more reference data permutations for each of driving circumstances and a reference length included in a communication rule of the V2V communication.\n | 18. The computing device of Claim 12, wherein, at the process of (I), the processor, if a time gap between receiving timings of its two inputted receiving signals is larger than or same as a second threshold, (i) generates Intermediate Frequency(IF) signals between one of the 1-st transmitting signals and said inputted receiving signals, (ii) determines whether each of center frequencies of each of the IF signals is included in a 1-st frequency range or is included in a 2-nd frequency range, to thereby determine whether each of said inputted receiving signals is included in the (1-1)-st receiving signals or the (1-2)-nd receiving signals.\n | 19. The computing device of Claim 12, wherein the processor uses a frequency interference prevention filter to acquire the (1-1)-st receiving signals and the (1-2)-nd receiving signals at a timing included in the 1-st time slot, and wherein the processor does not use the frequency interference prevention filter to acquire the 2-nd receiving signals at a timing included in the 2-nd time slot.\n | 20. 
The computing device of Claim 11, wherein, at the process of (I), as the processor instructs the radar module to transmit the 1-st transmitting signals built as chirp signals, the processor (i) acquires the (1-1)-st receiving signals, which are reflected signals of the 1-st transmitting signals, through the radar module, (ii) generates each of one or more Intermediate Frequency(IF) signals between each of the 1-st transmitting signals and each of the (1-1)-st receiving signals, and (iii) generates the RVA information by applying a Fourier transform to the IF signals.", "output": "The method involves performing a process of instructing (S01-1) the radar module to transmit 1-st transmitting signals to be used for acquiring RVA information on head directions, relative locations and relative velocities of 1-st surrounding objects including 1-st surrounding vehicles which are located closer than a first threshold from a subject vehicle interworking with the computing device at 1-st timing by referring to 1-st schedule, if a 1-st trigger that a 1-st timing corresponding to a current time is included in a 1-st time slot is detected. A process of generating (S01-2) RVA information is performed by using (1-1)-st receiving signals, corresponding to the 1-st transmitting signals, which are acquired through the radar module. A process of instructing (S02-1) radar module is performed to transmit 2-nd transmitting signals by referring to 2-nd schedule corresponding to V2V transmitting information on subject vehicle if a 2-nd trigger that a 2-nd timing corresponding to a later time from 1-st timing is included in a 2-nd time slot is detected. An INDEPENDENT CLAIM is included for a computing device for a V2V communication. Method for vehicle-to-vehicle (V2V) communication. The radar module is allowed to perform the V2V communication while performing original functions by transmitting different signals according to time slots.
The drawing shows a flowchart illustrating the method for the V2V communication by using the radar module used for detecting objects nearby. S01-1Step for instructing the radar module to transmit 1-st transmitting signalsS01-2Step for generating the RVA informationS01-3Step for generating 1-st V2V receiving informationS02-1Step for instructing the radar module to transmit 2-nd transmitting signalsS02-2Step for generating 2-nd V2V receiving information", "instruction": "Please summarize the input" }, { "input": "METHOD AND DEVICE FOR SIGNALING PRESENT DRIVING INTENSION OF AUTONOMOUS VEHICLE TO HUMANS BY USING VARIOUS V2X-ENABLED APPLICATIONThe present invention provides a method for signaling a driving intention of an autonomous vehicle, wherein a driving intention signaling device (a) autonomously using a surrounding video image (Surrounding Video Image) detecting a pedestrian located in front of the driving vehicle and determining whether the pedestrian crosses the road using a virtual crosswalk; (b) If it is determined that the pedestrian is crossing the road, the crossing trajectory corresponding to the expected path of the pedestrian is predicted by referring to the moving trajectory of a specific pedestrian, and the driving information and the crossing trajectory are referenced. setting a driving plan for the autonomous vehicle, and allowing the autonomous vehicle to autonomously drive according to the driving plan; and (c) determining whether a specific pedestrian is paying attention to the autonomous vehicle by referring to the gaze pattern, and if not, transmitting the driving intention to the pedestrian and surrounding drivers through an external display and an external speaker. Provides a method including ;.|1. 
A method of signaling at least one driving intention of an autonomous vehicle, comprising: (a) a driving intention signaling device performing a process of detecting at least one pedestrian located in a nearby front area of the autonomous vehicle by using at least one Surrounding Video Image of the autonomous vehicle, and a process of determining whether a specific pedestrian among the pedestrians crosses a road on which the autonomous vehicle travels by using a virtual crosswalk corresponding to one of the positions of the pedestrians;\n(b) when it is determined that the specific pedestrian crosses the road, the driving intention signaling device performing a process of predicting, with reference to at least one moving trajectory of the specific pedestrian, at least one crosswalking trajectory corresponding to at least one predicted path on which the specific pedestrian attempts to cross the road, a process of setting at least one driving plan of the autonomous vehicle by referring to the driving information of the autonomous vehicle and the crosswalking trajectory, and a process of allowing the autonomous vehicle to drive according to the driving plan; and (c) the driving intention signaling device performing a process of determining whether the specific pedestrian is paying attention to the autonomous vehicle by referring to at least one gaze pattern of the specific pedestrian by using the surrounding video image, and, when it is determined that the specific pedestrian is not paying attention to the autonomous vehicle, a process of delivering the driving intention of the autonomous vehicle corresponding to the driving plan to at least one of the specific pedestrian and at least one driver of at least one surrounding vehicle through at least one of an external display and an external speaker installed in the autonomous vehicle; \nwherein, in step (a), the driving intention signaling device performs (i) a process of creating the virtual crosswalk, corresponding to the road width of the road, to include first areas extending a distance to both sides from each boundary line between the road and each sidewalk in each direction of the sidewalk, and second areas being additional areas selected with reference to each predetermined point on each first area, (ii) a process in which the longitudinal central axis of the virtual crosswalk on the sidewalk corresponds to the location of the specific pedestrian, and (iii) a process of determining that the specific pedestrian is crossing the road when the specific pedestrian is located in one of the first areas, and determining that the specific pedestrian intends to cross the road when the specific pedestrian is located in one of the second areas. \n | 2.
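The virtual crosswalk of claim 1 splits into first areas adjacent to the road/sidewalk boundary and second areas beyond them. A 1-D sketch along the pedestrian's distance from the boundary line; the area depths are illustrative, and the anchoring of second areas to predetermined points is simplified away:

```python
def crossing_state(dist_from_boundary: float,
                   first_depth: float,
                   second_depth: float) -> str:
    """Classify a pedestrian by distance (meters) from the road/sidewalk
    boundary line, measured into the sidewalk; negative values mean the
    pedestrian is already on the road. Inside a first area the pedestrian
    is treated as crossing; inside a second area, as intending to cross."""
    if dist_from_boundary < first_depth:
        return "crossing"
    if dist_from_boundary < first_depth + second_depth:
        return "intends_to_cross"
    return "not_crossing"
```

With a 1 m first area and a 2 m second area, a pedestrian 0.5 m from the boundary is classified as crossing, one at 2 m as intending to cross, and one at 5 m as neither.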
The method according to claim 1, wherein, in step (c), the driving intention signaling device performs a process of determining that the specific pedestrian is not paying attention to the autonomous vehicle when (i) during a preset reference time, the cumulative sum of at least one gaze time section, each gaze time section meaning a time for which the gaze of the specific pedestrian is located on the autonomous vehicle, is less than or equal to a preset first threshold, or (ii) in a state in which the gaze of the specific pedestrian being located on the autonomous vehicle and not being located on it are repeated, the time period during which the gaze of the specific pedestrian is not located on the autonomous vehicle is greater than or equal to a preset second threshold.\n | 3. The method according to claim 2, wherein the driving intention signaling device performs a process of determining that the specific pedestrian is paying attention to the autonomous vehicle when (iii) during the preset reference time, the cumulative sum of at least one of the gaze time sections exceeds the preset first threshold, or (iv) in the state in which the gaze of the specific pedestrian being located on the autonomous vehicle and not being located on it are repeated, the time during which the gaze of the specific pedestrian is located on the autonomous vehicle is equal to or greater than a third threshold.\n | 4. delete\n | 5. The method according to claim 1, wherein, in step (b), the driving intention signaling device, referring to the movement speed and acceleration of the specific pedestrian, performs a process of predicting the crossing trajectory of the specific pedestrian by (i) using, in a primary spatial section located at a second distance or more from the crossing end point in the direction of the road, a constant acceleration model with a limited maximum speed, and (ii) using, in a secondary spatial section located at less than the second distance from the crossing end point in the direction of the road, the constant acceleration model with a preset negative acceleration while maintaining the moving direction of the specific pedestrian.\n | 6. The method of claim 1, wherein the driving intention signaling device performs a process of setting the driving plan by predicting a driving route, acceleration information, deceleration information, and steering information such that the autonomous vehicle avoids the crossing trajectory of the specific pedestrian without making the specific pedestrian feel threatened and without violating traffic laws.\n | 7. The method of claim 6, wherein, in a state in which a first action plan to a j-th action plan corresponding to at least a part of an acceleration operation, a deceleration operation, and a steering operation of the autonomous vehicle are set - where j is an integer greater than or equal to 1 - and n steps at a preset time interval are set - where n is an integer greater than or equal to 1 - the driving intention signaling device repeats, for the k-th step - where k is an integer greater than or equal to 1 and less than or equal to n - a process of predicting, as a k-th optimal action plan, a specific action plan that minimizes a pedestrian injury cost, a pedestrian threat cost, a law violation cost, and a ride comfort cost when each of the first action plan to the j-th action plan is performed based on the (k-1)-th driving information of the autonomous vehicle corresponding to the (k-1)-th optimal action plan predicted in the (k-1)-th step, and thereby performs a process of setting the driving plan of the autonomous vehicle with reference to the first optimal action plan to the n-th optimal action plan selected from the first action plan to the j-th action plan in each of the n steps.\n | 8. The method of claim 1, wherein in step (c), the driving intention signaling device performs a process of displaying at least a portion of a color, a symbol, a text, and an emoji corresponding to the driving intention through the external display.\n | 9. The method according to claim 1, wherein in step (c), the driving intention signaling device refers to at least one gaze pattern of at least one occupant by using at least one interior image of the autonomous vehicle to perform
a process of determining whether the at least one occupant of the autonomous vehicle is paying attention to the nearby front area or the specific pedestrian of the autonomous vehicle, and, when it is determined that the occupant is not paying attention to the nearby front area or the specific pedestrian of the autonomous vehicle, a process of transmitting the driving intention of the autonomous vehicle corresponding to the driving plan to the occupant through at least one of an internal display and an internal speaker installed in the autonomous vehicle.\n | 10. A driving intention signaling device for signaling at least one driving intention of an autonomous vehicle, comprising: at least one memory for storing instructions; and (I) a process of detecting at least one pedestrian located in a nearby forward area of the autonomous vehicle using at least one surrounding video image of the autonomous vehicle, and a location of the pedestrian,
a process of determining whether a specific pedestrian among the pedestrians crosses a road on which the autonomous vehicle travels by using a virtual crosswalk corresponding to the location, (II) when it is determined that the specific pedestrian is crossing the road, a process of predicting at least one crosswalking trajectory corresponding to at least one expected path on which the specific pedestrian intends to cross the road with reference to at least one moving trajectory of the specific pedestrian, a process of setting at least one driving plan of the autonomous vehicle with reference to the driving information of the autonomous vehicle and the traversing trajectory, and a process of allowing the autonomous vehicle to drive according to the driving plan, and (III) a process of determining whether the specific pedestrian is paying attention to the autonomous vehicle by referring to at least one gaze pattern of the specific pedestrian by using the surrounding video image, and, if it is determined that the specific pedestrian is not paying attention, a process of communicating the driving intention of the autonomous vehicle corresponding to the driving plan, through at least one of an external display and an external speaker installed in the autonomous vehicle,
to at least one driver of a surrounding vehicle, the device comprising at least one processor configured to execute said instructions for performing the processes;\nwherein, in the process (I), the processor performs (i) a process of creating the virtual crosswalk, corresponding to the road width of the road, to include first areas that extend on both sides of the road by a first distance from each boundary line between the road and each sidewalk in each sidewalk direction, and second areas that are additional areas selected with reference to each predetermined point on each first area, (ii) a process of determining that the specific pedestrian is crossing the road when the specific pedestrian is located in one of the first areas, and (iii) a process of determining that the specific pedestrian intends to cross the road when the specific pedestrian is located in one of the second areas. \n | 11. The device of claim 10, wherein in the process (III), the processor is configured to perform a process of determining, with reference to the gaze pattern of the specific pedestrian, that the specific pedestrian is not paying attention to the autonomous vehicle when (i) the cumulative sum of each of at least one gaze time section, during which the gaze of the specific pedestrian is located on the autonomous vehicle, is less than or equal to a preset first threshold, or (ii) after the state in which the gaze of the specific pedestrian is located on the autonomous vehicle and the state in which it is not located are repeated, the time period during which the gaze of the specific pedestrian is not located on the autonomous vehicle is equal to or greater than a preset second threshold.\n | 12.
The device of claim 11, wherein the processor is configured to perform a process of determining that the specific pedestrian is paying attention to the autonomous vehicle when (iii) the cumulative sum of each of the at least one gaze time section, which means a time during which the gaze of the specific pedestrian is located on the autonomous vehicle during the preset reference time, exceeds the preset first threshold, or (iv) after the state in which the gaze of the specific pedestrian is located on the autonomous vehicle and the state in which it is not located are repeated, the time during which the gaze of the specific pedestrian is located on the autonomous vehicle is equal to or greater than a third threshold.\n | 13. (Deleted)\n | 14. The device of claim 10, wherein, in the process (II), the processor, with reference to the moving speed and acceleration of the specific pedestrian, (i) in a primary spatial section that is a second distance or more from the crossing end point in the road direction, uses a constant acceleration model with a limited maximum speed, and (ii) in a second spatial section that is less than the second distance from the crossing end point in the direction of the road, maintains the direction of movement of the specific pedestrian but performs a process of predicting the crossing trajectory of the specific pedestrian by using the constant acceleration model according to a preset negative acceleration.\n | 15. The device of claim 10, wherein the processor performs a process of setting the driving plan by predicting a driving route, acceleration information, deceleration information, and steering information so that the autonomous vehicle avoids the crossing trajectory of the specific pedestrian, the specific pedestrian does not feel threatened, and traffic laws are not violated.\n | 16.
The device of claim 15, wherein, in a state in which a first action plan to a j-th action plan corresponding to at least a part of an acceleration operation, a deceleration operation, and a steering operation of the autonomous vehicle (where j is an integer greater than or equal to 1) are set, and n steps at a preset time interval (where n is an integer greater than or equal to 1) are set, the processor repeats a process of predicting, as the k-th optimal action plan in the k-th step, a specific action plan that minimizes pedestrian injury cost, pedestrian threat cost, law violation cost, and ride comfort cost among results of performing each of the first action plan to the j-th action plan based on the (k-1)-th driving information of the autonomous vehicle corresponding to the (k-1)-th optimal action plan, thereby performing a process of setting the driving plan of the autonomous vehicle with reference to each of the first optimal action plan to the n-th optimal action plan selected from the first action plan to the j-th action plan in each of the n steps.\n | 17. The device of claim 10, wherein in the process (III), the processor performs a process of displaying at least a part of a color, a symbol, a text, and an emoji corresponding to the driving intention through the external display.\n | 18.
The device of claim 10, wherein in the process (III), the processor performs a process of determining whether at least one occupant is paying attention to the nearby front area or the specific pedestrian of the autonomous vehicle by referring to at least one gaze pattern of the occupant using at least one interior image of the autonomous vehicle, and, if it is determined that the occupant is not paying attention to the nearby front area or the specific pedestrian of the autonomous vehicle, performs a process of transmitting the driving intention of the autonomous vehicle corresponding to the driving plan to the occupant through at least one of an internal display and an internal speaker installed in the autonomous vehicle.
The method allows the pedestrian or the passenger to recognize a current driving intention of the autonomous vehicle by acquiring optimal information to be used for signaling that intention, and allows the pedestrian, the passenger, and the drivers of other vehicles to stay calm and to understand and anticipate the behaviors of the autonomous vehicle. The drawing shows a block representation of a driving intention signaling device. 100Driving intention signaling device110Memory120Processor
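The two-section crossing model described in the claims (a primary section at least a second distance from the crossing end point, using constant acceleration with a capped maximum speed, and a second section near the end point, using a preset negative acceleration while the direction of movement is maintained) can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the patented implementation; all function and parameter names are hypothetical.

```python
def predict_crossing_trajectory(pos, speed, accel, end_point,
                                second_distance, max_speed, neg_accel,
                                dt=0.1, steps=50):
    """Hypothetical sketch of the two-section crossing-trajectory model.

    pos:        pedestrian's current distance across the road (metres)
    end_point:  crossing end point (metres); direction of motion is kept
    neg_accel:  preset negative acceleration applied near the end point
    """
    trajectory = []
    for _ in range(steps):
        remaining = end_point - pos
        if remaining >= second_distance:
            a = accel       # primary section: constant acceleration, capped speed
        else:
            a = neg_accel   # second section: preset negative acceleration
        # speed stays non-negative and below the limited maximum speed
        speed = min(max(speed + a * dt, 0.0), max_speed)
        pos += speed * dt
        trajectory.append(pos)
    return trajectory
```

A planner could compare such predicted positions against the vehicle's own route to decide when the driving plan should yield to the pedestrian.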
A computer-implemented method for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the method comprising: \nimplementing, by a test computing system, the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment; \npresenting, by the test computing system, virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \nin response to the virtual test sensor data, generating, by the test computing system, test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and \nbased upon the test responses, determining, by the test computing system, an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 2. The computer-implemented method of claim 1, wherein: \nthe plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and \ndetermining the effectiveness metric includes determining an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 3. 
The computer-implemented method of claim 2, wherein determining the update to the effectiveness metric includes determining a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.\n | 4. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 5. The computer-implemented method of claim 1, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.\n | 6. The computer-implemented method of claim 1, wherein: \ndetermining the effectiveness metric includes generating a plurality of effectiveness metrics associated with a plurality of vehicle types. \n | 7. The computer-implemented method of claim 1, wherein: \nthe plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and \nthe effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions. \n | 8. 
A computer system for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the system comprising: \na test computing system including a processor and a memory storing executable instructions that, when executed by the processor, cause the test computing system to: \nimplement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment; \npresent virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \nin response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and \nbased upon the test responses, determine an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 9. The computer system of claim 8, wherein: \nthe plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and \nthe executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 10. 
The computer system of claim 9, wherein the executable instructions that cause the test computing system to determine the update to the effectiveness metric further cause the test computing system to determine a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.\n | 11. The computer system of claim 8, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 12. The computer system of claim 8, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.\n | 13. The computer system of claim 8, wherein: \nthe executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types. \n | 14. The computer system of claim 8, wherein: \nthe plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and \nthe effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions. \n | 15. 
A non-transitory computer-readable medium storing instructions for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies that, when executed by at least one processor of a computer system, cause the computer system to: \nimplement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment; \npresent virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \nin response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and \nbased upon the test responses, determine an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 16. The computer-readable medium of claim 15, wherein: \nthe plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and \nthe executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 17. The computer-readable medium of claim 15, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 18. 
The computer-readable medium of claim 15, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.\n | 19. The computer-readable medium of claim 15, wherein: \nthe executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types. \n | 20. The computer-readable medium of claim 15, wherein: \nthe plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and \nthe effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions.
INDEPENDENT CLAIMS are included for the following :a computer system for evaluating a vehicle having a multiple of autonomous or semi-autonomous vehicle technologies and a computer-readable medium for storing instructions for evaluating a vehicle having a multiple of autonomous or semi-autonomous vehicle technologies. Method for evaluating a vehicle having a multiple of autonomous or semi-autonomous vehicle technologies. The information to the vehicle operator may improve the effective use of the autonomous operation features and reduce the risks associated with vehicle operation. The drawing shows a block diagram of an exemplary computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation. 100Autonomous Vehicle Insurance System102Front End Components104Back-end Components110Mobile Device112Links
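The combined-reliability idea above — scoring each combination of autonomous or semi-autonomous technologies by how it fares across virtual test scenarios — can be sketched roughly as below. The scoring scheme (fraction of scenarios handled successfully per combination) and all names are assumptions for illustration, not the claimed effectiveness-metric computation.

```python
from itertools import combinations

def effectiveness_metrics(technologies, scenarios, run_scenario):
    """Score every non-empty combination of technologies by the share of
    virtual test scenarios it handles successfully (hypothetical scheme).

    run_scenario(combo, scenario) -> bool is the simulated test response:
    True when the combination avoids or mitigates the scenario's loss.
    """
    scores = {}
    for r in range(1, len(technologies) + 1):
        for combo in combinations(technologies, r):
            passed = sum(bool(run_scenario(combo, s)) for s in scenarios)
            scores[combo] = passed / len(scenarios)
    return scores
```

With a toy simulator — say each scenario is handled only when its matching technology is present — the metric ranks fuller combinations higher, mirroring the "different combinations" comparison in the claims.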
A computer-implemented method of generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile, the method comprising: \ndetecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle; \ngenerating, via the one or more processors associated with the first vehicle, an electronic message regarding the abnormal traffic condition; \ntransmitting, via a vehicle-mounted transceiver associated with the first vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; \nreceiving, via the one or more processors associated with the first vehicle, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and \ntransmitting, via the one or more processors associated with the first vehicle, the telematics data to a remote server, wherein the remote server updates a vehicle-usage profile associated with the nearby vehicle. \n | 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.\n | 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 4. The computer-implemented method of claim 1, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.\n | 5. 
The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.\n | 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the first vehicle.\n | 9. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component: \nanalyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and \nperforms an action based on the type of anomalous condition in order to modify the anomalous condition. \n | 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and wherein the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.\n | 11. 
A computer system configured to generate a vehicle-to-vehicle traffic alert and update a vehicle-usage profile, the computer system comprising one or more processors, the one or more processors configured to: \ndetect that an abnormal traffic condition exists in an operating environment of a first vehicle; \ngenerate an electronic message regarding the abnormal traffic condition; \ntransmit, via a vehicle-mounted transceiver associated with the first vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; \nreceive telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and \ntransmit the telematics data to a remote server, wherein the remote server updates a vehicle-usage profile associated with the nearby vehicle. \n | 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.\n | 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 14. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.\n | 15. The computer system of claim 11, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.\n | 16. 
The computer system of claim 11, wherein the one or more processors is one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.\n | 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.\n | 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 20. The computer system of claim 11, wherein the nearby vehicle travels to the operating environment of the first vehicle.", "output": "The method involves detecting that an abnormal traffic condition exists in an operating environment of the first vehicle, generating an electronic message regarding the abnormal traffic condition, transmitting the electronic message to a nearby vehicle, in which the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition, receiving telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmitting the telematics data to a remote server via the processors associated with the first vehicle, in which the remote server updates a vehicle-usage profile associated with the nearby vehicle. An INDEPENDENT CLAIM is also included for a computer system. Computer-implemented method of generating vehicle-to-vehicle traffic alert and updating vehicle-usage profile. Helps improve driving behavior by providing for feedback to the evaluated driver. 
Saves processing power and battery life since the second computing device ignores the telematics data. The drawing shows the block diagram of the telematics collection system. 100Telematics collection system106External computing device108Vehicle110Computing device114On-board computer", "instruction": "Please summarize the input" }, { "input": "Vehicular traffic alerts for avoidance of abnormal traffic conditionsMethods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle. An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle.What is claimed is: \n | 1. 
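Claim 10's filtering rule — a nearby vehicle ignores the electronic message when the reported abnormal traffic condition lies beyond a threshold distance — reduces, in a minimal sketch, to a distance check against the location information carried in the message. The message field names and the default threshold are hypothetical.

```python
import math

def should_heed_alert(vehicle_pos, alert, threshold_m=500.0):
    """Return True when the alerted condition is within the threshold
    distance of the receiving vehicle; positions are (x, y) in metres.

    alert: dict with a hypothetical 'location' field, e.g.
    {"condition": "slowed traffic", "location": (120.0, -40.0)}
    """
    ax, ay = alert["location"]
    vx, vy = vehicle_pos
    return math.hypot(ax - vx, ay - vy) <= threshold_m
```

A receiving vehicle would replan around the condition (and report its telematics data back) only when this check passes, which is what saves processing on far-away alerts.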
A computer-implemented method of generating a vehicle traffic alert, the method comprising: \ndetecting, via one or more processors, that an abnormal traffic condition exists in a vehicle operating environment; \ngenerating, via the one or more processors, an electronic message regarding the abnormal traffic condition; \ntransmitting the electronic message to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition, the abnormal traffic condition having already occurred in the vehicle operating environment, wherein the type of anomalous condition is selected from at least one of a set of transient conditions or non-transient conditions, and wherein determining the type of anomalous condition comprises comparing sensor data with previously recorded data for the operating environment, and wherein the smart infrastructure component performs an action based upon the type of anomalous condition in order to modify the anomalous condition into an altered roadway condition; and \ntransmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the altered roadway condition, to allow the nearby vehicle to avoid or approach the altered roadway condition. \n | 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the nearby vehicle.\n | 3. 
The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 4. The computer-implemented method of claim 1 further comprising updating, via the one or more processors, a vehicle-usage profile associated with the nearby vehicle based upon received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.\n | 5. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.\n | 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the vehicle operating environment.\n | 9. The computer-implemented method of claim 1, wherein the smart infrastructure component comprises a smart traffic light.\n | 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.\n | 11. 
A computer system configured to generate a vehicle traffic alert, the computer system comprising one or more processors, the one or more processors configured to: \ndetect that an abnormal traffic condition exists in a vehicle operating environment; \ngenerate an electronic message regarding the abnormal traffic condition; \ntransmit the electronic message to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component analyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition, the abnormal traffic condition having already occurred in the vehicle operating environment, wherein the type of anomalous condition is selected from at least one of a set of transient conditions or non-transient conditions, and wherein determining the type of anomalous condition comprises comparing sensor data with previously recorded data for the operating environment, and wherein the smart infrastructure component performs an action based upon the type of anomalous condition in order to modify the anomalous condition into an altered roadway condition; and \ntransmit the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the altered roadway condition, to allow the nearby vehicle to avoid or approach the altered roadway condition. \n | 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the nearby vehicle.\n | 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.\n | 14. 
The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.\n | 15. The computer system of claim 11, the system further configured to update a vehicle-usage profile associated with the nearby vehicle based upon received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.\n | 16. The computer system of claim 11, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.\n | 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.\n | 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 20. The computer system of claim 11, wherein the nearby vehicle travels to the vehicle operating environment.", "output": "The method involves detecting that an abnormal traffic condition exists in a vehicle operating environment, and generating an electronic message regarding the condition. The message is transmitted to a smart infrastructure component within a proximity of the environment. 
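The claims above describe the classify-and-relay flow only abstractly. As a purely illustrative sketch — every name, threshold, and data shape below is an assumption, not the patented implementation — the smart-infrastructure step of comparing sensor data against previously recorded data (claim 1) and the distance-threshold filtering at the nearby vehicle (claim 10) might look like:

```python
# Hypothetical sketch of the claimed alert flow; all identifiers and
# numeric thresholds are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class TrafficAlert:
    location: tuple          # (lat, lon) of the abnormal condition
    condition: str           # e.g. "slowed_traffic", "road_construction"
    anomaly_type: str = ""   # "transient" or "non_transient", set by infrastructure

# Assumed split of the conditions the claims list into short-lived ones.
TRANSIENT = {"slowed_traffic", "erratic_vehicle", "braking_ahead"}

def classify_anomaly(alert: TrafficAlert, baseline_speed: float,
                     current_speed: float) -> str:
    """Compare current sensor data with previously recorded data for the
    operating environment (per claim 1) to pick a type of anomalous condition."""
    if alert.condition in TRANSIENT and current_speed < 0.5 * baseline_speed:
        return "transient"
    return "non_transient"

def relay_to_nearby_vehicle(alert: TrafficAlert, distance_m: float,
                            threshold_m: float = 5000.0) -> bool:
    """Per claim 10, a nearby vehicle ignores the alert when the condition
    lies beyond a threshold distance."""
    return distance_m <= threshold_m

alert = TrafficAlert(location=(41.88, -87.63), condition="slowed_traffic")
alert.anomaly_type = classify_anomaly(alert, baseline_speed=100.0, current_speed=30.0)
```

The transient/non-transient split and the 5 km threshold are invented for the sketch; the claims leave both unspecified.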
The component analyzes the message to determine a type of anomalous condition for the condition, where the component performs an action based upon the type of condition to modify the condition into an altered roadway condition, and transmits the message via wireless communication to alert a nearby vehicle of the altered condition to allow the vehicle to avoid or approach the roadway condition. An INDEPENDENT CLAIM is included for a computer system configured to generate a vehicle traffic alert. Computer-implemented method for generating a vehicle traffic alert. The data collected may be used to generate vehicle-usage profiles that more accurately reflect vehicle risk, or lack thereof, and facilitate more appropriate auto insurance pricing. The electronic message may then be transmitted through the vehicle's transceiver using wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow it to avoid that condition. The drawing shows a schematic diagram of a telematics collection system.", "instruction": "Please summarize the input" }, { "input": "Accident risk model determination using autonomous vehicle operating dataMethods and systems for evaluating the effectiveness of autonomous operation features of autonomous vehicles using an accident risk model are provided. According to certain aspects, an accident risk model may be determined using effectiveness information regarding autonomous operation features associated with a vehicle. The effectiveness information may indicate a likelihood of an accident for the vehicle and may include test data or actual loss data. Determining the likelihood of an accident may include determining risk factors for the features related to the ability of the features to make control decisions that successfully avoid accidents.
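The abstract above says the effectiveness information may blend test data with actual loss data into per-feature risk factors, without fixing any formula. A minimal sketch, assuming a simple weighted blend (the weighting, function names, and rates are all hypothetical), could be:

```python
# Hypothetical blend of virtual-test results and actual loss data into
# per-feature risk factors; the 50/50 weighting is an assumption.
def feature_risk(test_pass_rate: float, actual_loss_rate: float,
                 w_test: float = 0.5) -> float:
    """Risk factor in [0, 1]; lower is better."""
    test_risk = 1.0 - test_pass_rate  # failures in virtual test scenarios
    return w_test * test_risk + (1.0 - w_test) * actual_loss_rate

def accident_risk_model(features: dict) -> dict:
    """Map each autonomous operation feature to a risk factor."""
    return {name: feature_risk(*rates) for name, rates in features.items()}

# (virtual-test pass rate, actual loss rate) per feature -- invented numbers.
model = accident_risk_model({
    "self_braking":  (0.98, 0.01),
    "self_steering": (0.90, 0.05),
})
```

A real model would also condition on location, operating conditions, and accident type and severity, as the abstract notes below.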
The accident risk model may further include information regarding effectiveness of the features relative to location or operating conditions, as well as types and severity of accidents. The accident risk model may further be used to determine or adjust aspects of an insurance policy associated with an autonomous vehicle.What is claimed is: \n | 1. A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising: \ngenerating, by one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies, test data regarding results of virtual testing within the virtual test environment in which responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data are simulated; \nreceiving, by one or more processors, effectiveness information regarding the autonomous or semi-autonomous vehicle technology, the effectiveness information including both (i) actual accident data associated with vehicles having the autonomous or semi-autonomous vehicle technology and (ii) the test data associated with the autonomous or semi-autonomous vehicle technology; \ndetermining, by one or more processors, an indication of reliability of the autonomous or semi-autonomous vehicle technology based at least in part upon compatibility of a version of or an update to computer-readable instructions involved in implementation of part or all of the autonomous or semi-autonomous vehicle technology with one or more versions of the at least one additional autonomous or semi-autonomous vehicle technologies tested; \ndetermining, by one or more processors, an accident risk model based upon the received effectiveness information and the determined indication of reliability; \ndetermining, by one or more processors, an 
insurance policy for a vehicle equipped with the autonomous or semi-autonomous vehicle technology based at least in part upon the accident risk model; and \ncausing, by one or more processors, information regarding all or a portion of the determined insurance policy for the vehicle to be presented to a customer for review by the customer via a display of a computing device associated with the customer. \n | 2. The computer-implemented method of claim 1, wherein the accident risk model is associated with a likelihood that vehicles having the autonomous or semi-autonomous vehicle technology will be involved in vehicle accidents.\n | 3. The computer-implemented method of claim 1, wherein the accident risk model comprises a data structure containing entries associated with at least one of (1) the autonomous or semi-autonomous vehicle technology or (2) a likelihood of a vehicle accident.\n | 4. The computer-implemented method of claim 1, further comprising: \nstoring, by a non-transient computer-readable medium, the accident risk model; \nreceiving, by one or more processors, a request to determine the insurance policy for the vehicle; and \naccessing, by one or more processors, the accident risk model based upon the received request. \n | 5. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology involves at least one of a vehicle self-braking functionality or a vehicle self-steering functionality.\n | 6. The computer-implemented method of claim 1, wherein determining the insurance policy includes calculating at least one of the following based upon the autonomous or semi-autonomous vehicle technology and the accident risk model: an automobile insurance premium, a discount, or a reward.\n | 7. 
The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology is related to at least one of the following: \ndriver alertness monitoring; \ndriver responsiveness monitoring; \npedestrian detection; \nartificial intelligence; \na back-up system; \na navigation system; \na positioning system; \na security system; \nan anti-hacking measure; \na theft prevention system; or \nremote vehicle location determination. \n | 8. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology is related to at least one of the following: \na point of impact; \na type of road; \na time of day; \na weather condition; \na type of a trip; \na length of a trip; \na vehicle style; \na vehicle-to-vehicle communication; or \na vehicle-to-infrastructure communication. \n | 9. The computer-implemented method of claim 1, wherein causing information regarding all or a portion of the determined insurance policy for the vehicle to be presented to the customer includes causing to be presented on the display a cost of automobile insurance coverage.\n | 10. The computer-implemented method of claim 1, wherein determining the accident risk model includes determining at least one risk level associated with the autonomous or semi-autonomous vehicle technology based upon observed responses of the autonomous or semi-autonomous vehicle technology in other vehicles.\n | 11. The computer-implemented method of claim 1, wherein determining the insurance policy for the vehicle includes at least one of generating a new insurance policy associated with the vehicle or updating an existing insurance policy associated with the vehicle.\n | 12. The computer-implemented method of claim 1, wherein the accident risk model further accounts for an effect of one or more of the following on the effectiveness information: weather, road type, or vehicle type.\n | 13. 
A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising: \none or more processors; \none or more communication modules adapted to communicate data; \none or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies to generate test data regarding results of virtual testing within the virtual test environment in which responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data are simulated, and wherein the test results are communicated to the one or more processors via the one or more communication modules; and \na program memory coupled to the one or more processors and storing executable instructions that when executed by the one or more processors cause the computer system to: \nreceive effectiveness information regarding the autonomous or semi-autonomous vehicle technology, the effectiveness information including both (i) actual accident data associated with vehicles having the autonomous or semi-autonomous vehicle technology and (ii) the test data associated with the autonomous or semi-autonomous vehicle technology; \ndetermine an indication of reliability of the autonomous or semi-autonomous vehicle technology based at least in part upon compatibility of a version of or an update to computer-readable instructions involved in implementation of part or all of the autonomous or semi-autonomous vehicle technology with one or more versions of the at least one additional autonomous or semi-autonomous vehicle technologies tested; \ndetermine an accident risk model based upon the received effectiveness information and the determined indication of reliability; \ndetermine an insurance policy for a vehicle equipped with the autonomous or semi-autonomous vehicle technology based at least 
in part upon the accident risk model; and \ncause, via the one or more communication modules, information regarding all or a portion of the determined insurance policy for the vehicle to be presented to a customer for review by the customer via a display of a computing device associated with the customer. \n | 14. The computer system of claim 13, wherein the accident risk model is associated with a likelihood that vehicles having the autonomous or semi-autonomous vehicle technology will be involved in vehicle accidents.\n | 15. The computer system of claim 13, wherein the accident risk model comprises a data structure containing entries associated with at least one of (1) the autonomous or semi-autonomous vehicle technology or (2) a likelihood of a vehicle accident.\n | 16. The computer system of claim 13, wherein the executable instructions further cause the computer system to: \nstore the accident risk model; \nreceive, via the one or more communication modules, a request to determine the insurance policy for the vehicle; and \naccess the accident risk model based upon the received request. \n | 17. 
A tangible, non-transitory computer-readable medium storing instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology that, when executed by at least one processor of a computer system, cause the computer system to: \ngenerate, using one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies, test data regarding results of virtual testing within the virtual test environment in which responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data are simulated; \nreceive effectiveness information regarding the autonomous or semi-autonomous vehicle technology, the effectiveness information including both (i) actual accident data associated with vehicles having the autonomous or semi-autonomous vehicle technology and (ii) the test data associated with the autonomous or semi-autonomous vehicle technology; \ndetermine an indication of reliability of the autonomous or semi-autonomous vehicle technology based at least in part upon compatibility of a version of or an update to computer-readable instructions involved in implementation of part or all of the autonomous or semi-autonomous vehicle technology with one or more versions of the at least one additional autonomous or semi-autonomous vehicle technologies tested; \ndetermine an accident risk model based upon the received effectiveness information and the determined indication of reliability; \ndetermine an insurance policy for a vehicle equipped with the autonomous or semi-autonomous vehicle technology based at least in part upon the accident risk model; and \ncause information regarding all or a portion of the determined insurance policy for the vehicle to be presented to a customer for review by the customer via a display of a computing device 
associated with the customer. \n | 18. The tangible, non-transitory computer-readable medium of claim 17, wherein the accident risk model is associated with a likelihood that vehicles having the autonomous or semi-autonomous vehicle technology will be involved in vehicle accidents.\n | 19. The tangible, non-transitory computer-readable medium of claim 17, wherein the accident risk model comprises a data structure containing entries associated with at least one of (1) the autonomous or semi-autonomous vehicle technology or (2) a likelihood of a vehicle accident.\n | 20. The tangible, non-transitory computer-readable medium of claim 17, further comprising executable instructions that further cause the computer system to: \nstore the accident risk model; \nreceive, via one or more communication modules, a request to determine the insurance policy for the vehicle; and \naccess the accident risk model based upon the received request.", "output": "The computer-based method involves generating test data regarding results of virtual testing. The effectiveness information regarding the autonomous or semi-autonomous vehicle technology is received. An indication of reliability of the autonomous or semi-autonomous vehicle technology is determined. An insurance policy is determined for a vehicle (108) equipped with the autonomous or semi-autonomous vehicle technology. Information regarding all or a portion of the determined insurance policy for the vehicle is presented to a customer for review. INDEPENDENT CLAIMS are included for the following: a computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology; and a tangible, non-transitory computer-readable medium storing instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology. Computer-based method of evaluating effectiveness of autonomous or semi-autonomous vehicle technology.
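Claim 1 above bases the reliability indication on version compatibility between the tested technologies, and feeds it with the risk model into the policy determination, without specifying any arithmetic. As an illustrative sketch under stated assumptions (the compatibility penalty and premium formula are invented, not from the patent):

```python
# Hypothetical reliability-weighted premium; all constants are assumptions.
def reliability(feature_version: str, tested_versions: set) -> float:
    """Claim 1 bases reliability on compatibility of the feature's software
    version with the versions it was tested against; untested pairings are
    penalized here with an assumed factor."""
    return 1.0 if feature_version in tested_versions else 0.5

def premium(base_premium: float, accident_risk: float, rel: float) -> float:
    """Scale a base premium by the modeled risk, with lower reliability
    amplifying the risk loading."""
    return base_premium * (1.0 + accident_risk * (2.0 - rel))

p = premium(base_premium=1000.0,
            accident_risk=0.075,
            rel=reliability("v2.1", {"v2.0", "v2.1"}))
```

With the invented numbers above, a fully tested version pays a smaller loading than an untested one, mirroring the claim's compatibility-based reliability adjustment.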
The autonomous vehicle operation features either assist the vehicle operator to more safely or efficiently operate a vehicle or take full control of vehicle operation under some or all circumstances. An automobile insurance premium is determined by evaluating how effectively the vehicle is able to avoid and mitigate crashes and the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a block diagram of a computer network, a computer server, a mobile device and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation and insurance processes. 100Autonomous vehicle insurance system102Front end component104Back end component108Vehicle110Mobile device", "instruction": "Please summarize the input" }, { "input": "Fully autonomous vehicle insurance pricingMethods and systems for determining risk associated with operation of fully autonomous vehicles are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including types and versions of sensors, control systems, and software. This information may be used to determine a risk profile reflecting risk levels for a plurality of features, which may be based upon test data regarding the features or actual loss data. Expected use levels may further be determined and used with the risk profile to determine a total risk level associated with operation of the vehicle by the autonomous operation features. The expected use levels may indicate expected vehicle use, as well as traffic, weather, or other conditions in which the vehicle is likely to operate. The total risk level may be used to determine or adjust aspects of an insurance policy associated with the vehicle.What is claimed is: \n | 1.
A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the computer system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to: \ndetermine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; \nreceive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip; \nreceive sensor data associated with the vehicle; \ndetermine a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate 
whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data; \ndetermine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions; \ndetermine types of one or more sensors installed in the vehicle based upon the sensor data associated with the vehicle; \nadjust the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and \ncause one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. \n | 2. 
The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nestimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged based upon the log of usage data; and \nadjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the estimated future usage or operation of the vehicle, either by time or mileage, the vehicle is predicted to be operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged. \n | 3. The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nestimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and \nadjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is estimated to be operated in each of the plurality of weather and road conditions indicated by the usage data received in the log of usage data. \n | 4. 
The system of claim 1, wherein: \nthe risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features; \nthe test results include responses of the test units to test inputs corresponding to test scenarios, the test scenarios include vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and \nthe test results are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles. \n | 5. The system of claim 1, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.\n | 6. The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nreceive a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device; \ndetermine a premium associated with the vehicle insurance policy based upon the total risk level; and \npresent an option to purchase the vehicle insurance policy to the customer associated with the vehicle. \n | 7. The system of claim 1, wherein the log of usage data regarding the one or more autonomous operation features includes a version of autonomous operation feature control software that is currently installed on the vehicle or in an autonomous operation feature system mounted on the vehicle.\n | 8. 
The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nreceive information regarding a type and version of the one or more autonomous operation features; and \nupdate the total risk level associated with autonomous operation of the vehicle based upon the type and version of the one or more autonomous operation features. \n | 9. The system of claim 1, wherein the one or more autonomous operation features include a vehicle-to-vehicle (V2V) wireless communication capability, and wherein the executable instructions further cause the one or more processors to: \nreceive telematics data generated or broadcast from other vehicles; and \ngenerate and display alternate routes based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions. \n | 10. A computer-implemented method for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, comprising: \ndetermining, by one or more processors, a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; \nreceiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission transmitted from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the 
on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip; \ndetermining, by the one or more processors, a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data; \ndetermining, by the one or more processors, a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions; \ndetermining, via the one or more processors, types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; and \nadjusting, via the one or more processors, the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and \ncausing, by the one or more processors, one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with 
the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. \n | 11. The method of claim 10, the method comprising: \nestimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged; and \nadjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged indicated by the usage data received in the log of usage data. \n | 12. The method of claim 10, the method comprising: \nestimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and \nadjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the amount of time or miles that the vehicle is expected to be operated in the future in each of the plurality of weather and road conditions for a given time period. \n | 13. 
The method of claim 10, wherein: \nthe risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features; \nthe test results include responses of the test units to test inputs corresponding to test scenarios, the test scenarios including vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and \nthe test results are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles. \n | 14. The method of claim 10, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.\n | 15. The method of claim 10, further comprising: \nreceiving, at the one or more processors or the associated transceiver, a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device; \ndetermining, by one or more processors, a premium associated with the vehicle insurance policy based upon the total risk level; and \npresenting, by one or more processors, an option to purchase the vehicle insurance policy to the customer associated with the vehicle. \n | 16. 
The method of claim 10, the method further comprising: \nreceiving, via the one or more processors or the associated transceiver, information regarding a type and version of the one or more autonomous operation features; and \nupdating the total risk level associated with autonomous operation of the vehicle, via the one or more processors, based upon the type and version of the one or more autonomous operation features. \n | 17. The method of claim 10, wherein the autonomous operation feature is a vehicle-to-vehicle (V2V) wireless communication capability, and the method comprises: \nreceiving, via one or more vehicle-mounted processors or associated transceiver, telematics data generated or broadcast from other vehicles; and \ngenerating and displaying alternate routes, via the one or more vehicle-mounted processors, based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions. \n | 18. A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to: \ndetermine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; \nreceive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via 
wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip; \ndetermine from analysis of the usage data received in the log of usage data a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions; \ndetermine from analysis of the usage data received in the log of usage data, or from analysis of other vehicle or telematics data received from the vehicle or mobile device, an average amount of time or miles that the vehicle operator operates the vehicle during each of the plurality of weather and road operating conditions for a period of time; \ndetermine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions, and (c) the average amount of time or miles that the vehicle operator operates the vehicle during 
each of the plurality of weather and road operating conditions for the period of time to facilitate more accurate risk assessment and auto insurance pricing; \ndetermine types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; and \nadjust the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and \ncause one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle.", "output": "The computer system comprises one or more processors (162), transceivers coupled to processors, and program memories (160) coupled to the processors and storing executable instructions that cause the one or more processors to determine a risk profile associated with operation of the vehicle that includes multiple risk levels associated with operation of the vehicle under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged. 
A log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during multiple weather and road conditions is received via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, where the mobile device (110) generates the log of usage data from the data received from the on-board computer during a vehicle trip. The log of usage data includes a timestamp indicating a beginning of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and further includes current weather and road conditions during the vehicle trip. The instructions further cause the processors to receive sensor data associated with the vehicle and to determine multiple expected use levels of the vehicle during multiple weather and road operating conditions, where the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data. A total risk level associated with overall operation of the vehicle is determined based at least in part upon the determined risk profile and the expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions. The types of one or more sensors installed in the vehicle are determined based upon the sensor data associated with the vehicle, and the total risk level associated with autonomous operation of the vehicle is adjusted based at least in part upon the types of sensors installed in the vehicle. 
An INDEPENDENT CLAIM is included for a computer-implemented method for monitoring usage of vehicle having autonomous operation features for controlling the vehicle, which involves: determining, by one or more processors, a risk profile associated with operation of the vehicle that includes multiple risk levels associated with operation of the vehicle under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and under multiple weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; receiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during multiple weather and road conditions, via wireless communication or data transmission transmitted from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, where the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and where the log of usage data further includes current weather and road conditions during the vehicle trip; determining, by the one or more processors, multiple expected use levels of the vehicle during multiple weather and road operating conditions, where the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions as determined from 
processor analysis of the usage data received in the log of usage data; determining, by the one or more processors, a total risk level associated with overall operation of the vehicle based at least in part upon the determined risk profile and the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of multiple weather and road operating conditions; determining, via the one or more processors, types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; adjusting, via the one or more processors, the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle; and causing, by the one or more processors, one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. Computer system for monitoring usage of vehicle having one or more autonomous operation features for controlling vehicle. 
The computer system allows the driving experience and/or usage of the autonomous or semi-autonomous vehicle technology to be monitored over small time frames and/or periodically to provide feedback to the driver or insurance provider and/or to adjust insurance policies or premiums, and determines the automobile insurance premium by evaluating how effectively the vehicle avoids and/or mitigates crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a block diagram of a computer system. 102Front End Components110Mobile device160Program memories162Processors165Address/data Bus", "instruction": "Please summarize the input" }, { "input": "Autonomous vehicle technology effectiveness determination for insurance pricingMethods and systems for determining the effectiveness of one or more autonomous (and/or semi-autonomous) operation features of a vehicle are provided. According to certain aspects, information regarding autonomous operation features of the vehicle may be used to determine an effectiveness metric indicative of the ability of each autonomous operation feature to avoid or mitigate accidents or other losses. The information may include operating data from the vehicle or other vehicles having similar autonomous operation features, test data, or loss data from other vehicles. The determined effectiveness metric may then be used to determine part or all of an insurance policy, which may be reviewed by an insured and updated based upon the effectiveness metric.What is claimed is: \n | 1. 
A computer-implemented method for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the method comprising: \nimplementing, by a test computing system, the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment; \npresenting, by the test computing system, virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \nin response to the virtual test sensor data generating, by the test computing system, test responses of the plurality of autonomous or semi-autonomous vehicle technologies; \nbased upon the test responses, determining, by the test computing system, an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies; \nreceiving, by a computing system, an indication of the vehicle having the plurality of autonomous or semi-autonomous vehicle technologies; and \nupdating, by the computing system, an insurance policy associated with the vehicle based upon the determined effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies. \n | 2. 
The computer-implemented method of claim 1, wherein: \nthe plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and \ndetermining the effectiveness metric includes determining an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 3. The computer-implemented method of claim 2, wherein determining the update to the effectiveness metric includes determining a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.\n | 4. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 5. The computer-implemented method of claim 1, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.\n | 6. The computer-implemented method of claim 1, wherein: \ndetermining the effectiveness metric includes generating a plurality of effectiveness metrics associated with a plurality of vehicle types; \nthe indication of the vehicle includes an indication of a vehicle type of the vehicle; and \nupdating the insurance policy associated with the vehicle is further based upon a corresponding effectiveness metric of the plurality of effectiveness metrics that is associated with the vehicle type of the vehicle. \n | 7. 
The computer-implemented method of claim 1, wherein: \nthe plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and \nthe effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions. \n | 8. A computer system for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies, the system comprising: \na test computing system including a processor and a memory storing executable instructions that, when executed by the processor, cause the test computing system to: \nimplement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment; \npresent virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \nin response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies; and \nbased upon the test responses, determine an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies; and \na computing system including a processor and a memory storing executable instructions that, when executed by the processor, cause the computing system to: \nreceive an indication of the vehicle having the plurality of autonomous or semi-autonomous vehicle technologies; and \nupdate an insurance policy associated with the vehicle based upon the determined effectiveness metric for the plurality of 
autonomous or semi-autonomous vehicle technologies. \n | 9. The computer system of claim 8, wherein: \nthe plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and \nthe executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 10. The computer system of claim 9, wherein the executable instructions that cause the test computing system to determine the update to the effectiveness metric further cause the test computing system to determine a change in accident avoidance effectiveness for the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies.\n | 11. The computer system of claim 8, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 12. The computer system of claim 8, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.\n | 13. 
The computer system of claim 8, wherein: \nthe executable instructions that cause the test computing system to determine the effectiveness metric further cause the test computing system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types; \nthe indication of the vehicle includes an indication of a vehicle type of the vehicle; and \nthe executable instructions that cause the computing system to update the insurance policy associated with the vehicle further cause the computing system to update the insurance policy based upon a corresponding effectiveness metric of the plurality of effectiveness metrics that is associated with the vehicle type of the vehicle. \n | 14. The computer system of claim 8, wherein: \nthe plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and \nthe effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions. \n | 15. 
A non-transitory computer-readable medium storing instructions for evaluating a vehicle having a plurality of autonomous or semi-autonomous vehicle technologies that, when executed by at least one processor of a computer system, cause the computer system to: \nimplement the plurality of autonomous or semi-autonomous vehicle technologies within a virtual test environment; \npresent virtual test sensor data to the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \nin response to the virtual test sensor data, generate test responses of the plurality of autonomous or semi-autonomous vehicle technologies; \nbased upon the test responses, determine an effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies, wherein the effectiveness metric indicates a combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies; \nreceive an indication of the vehicle having the plurality of autonomous or semi-autonomous vehicle technologies; and \nupdate an insurance policy associated with the vehicle based upon the determined effectiveness metric for the plurality of autonomous or semi-autonomous vehicle technologies. \n | 16. 
The computer-readable medium of claim 15, wherein: \nthe plurality of autonomous or semi-autonomous vehicle technologies includes an updated version of at least one of the plurality of autonomous or semi-autonomous vehicle technologies; and \nthe executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to determine an update to the effectiveness metric based at least in part upon a compatibility of the updated version of the at least one of the plurality of autonomous or semi-autonomous vehicle technologies with at least another one of the plurality of autonomous or semi-autonomous vehicle technologies. \n | 17. The computer-readable medium of claim 15, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 18. The computer-readable medium of claim 15, wherein the operating conditions are associated with one or more of: a road type, a time of day, or a weather condition.\n | 19. The computer-readable medium of claim 15, \nthe executable instructions that cause the computer system to determine the effectiveness metric further cause the computer system to generate a plurality of effectiveness metrics associated with a plurality of vehicle types; \nthe indication of the vehicle includes an indication of a vehicle type of the vehicle; and \nthe executable instructions that cause the computer system to update the insurance policy associated with the vehicle further cause the computer system to update the insurance policy based upon a corresponding effectiveness metric of the plurality of effectiveness metrics that is associated with the vehicle type of the vehicle. \n | 20. 
The computer-readable medium of claim 15, \nthe plurality of test scenarios include test scenarios associated with points of impact during virtual vehicle collisions; and \nthe effectiveness metric further indicates the combined reliability of operating the vehicle by using different combinations of the plurality of autonomous or semi-autonomous vehicle technologies to mitigate damages during the virtual vehicle collisions.", "output": "The method involves generating an effectiveness metric associated with autonomous or semi-autonomous vehicle technologies based upon test responses by processors of a test computing system. An indication of a vehicle (108) including the autonomous or semi-autonomous vehicle technologies is received by the processors of a computing system. An insurance policy associated with the vehicle is updated based upon the effectiveness metric associated with the autonomous or semi-autonomous vehicle technologies by the processors of the computing system. INDEPENDENT CLAIMS are also included for the following: a computer system for evaluating effectiveness of autonomous or semi-autonomous vehicle technologies for controlling a vehicle to avoid or mitigate vehicle accidents; and a tangible, non-transitory computer-readable medium comprising a set of instructions for evaluating effectiveness of autonomous or semi-autonomous vehicle technologies for controlling a vehicle to avoid or mitigate vehicle accidents. Method for evaluating effectiveness of autonomous or semi-autonomous vehicle technologies for controlling a vehicle, i.e. an automobile, to avoid or mitigate vehicle accidents. The method allows an insurance provider to adjust or update insurance policies, premiums, rates, discounts, and/or other insurance-related items based upon a smart equipment warning functionality that can alert drivers when vehicle equipment or vehicle safety equipment may need to be replaced or repaired, thus reducing collision risk. 
The method allows a vehicle operator to maximize effectiveness of an autonomous operation feature, maximize vehicle insurance coverage, and/or minimize vehicle insurance expense. The drawing shows a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 100Autonomous vehicle insurance system102Front-end components108Vehicle110Mobile device120Sensors130Network135Link155Controller164RAM", "instruction": "Please summarize the input" }, { "input": "Accident risk model determination using autonomous vehicle operating dataMethods and systems for evaluating the effectiveness of autonomous operation features of autonomous vehicles using an accident risk model are provided. According to certain aspects, an accident risk model may be determined using effectiveness information regarding autonomous operation features associated with a vehicle. The effectiveness information may indicate a likelihood of an accident for the vehicle and may include test data or actual loss data. Determining the likelihood of an accident may include determining risk factors for the features related to the ability of the features to make control decisions that successfully avoid accidents. The accident risk model may further include information regarding effectiveness of the features relative to location or operating conditions, as well as types and severity of accidents. The accident risk model may further be used to determine or adjust aspects of an insurance policy associated with an autonomous vehicle.The invention claimed is: \n | 1. 
A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising: \npresenting, by the one or more processors, virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within a virtual test environment; \ngenerating, by the one or more processors, test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data; \ngenerating, by the one or more processors, an accident risk model indicating one or more risk levels for vehicles associated with the autonomous or semi-autonomous vehicle technology based upon the test responses; \nreceiving, at the one or more processors, actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment, the actual accident data comprising data collected by a vehicle sensor; and \nadjusting, by the one or more processors, the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels. \n | 2. The computer-implemented method of claim 1, the method further comprising: \nidentifying, by the one or more processors, a customer vehicle having the autonomous or semi-autonomous vehicle control technology; and \ngenerating or updating, by the one or more processors, an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model. \n | 3. The computer-implemented method of claim 2, further comprising: \ncausing, by the one or more processors, information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle via a display of a customer computing device for review. \n | 4. 
The computer-implemented method of claim 1, wherein: \ngenerating the test responses includes generating test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology. \n | 5. The computer-implemented method of claim 4, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.\n | 6. The computer-implemented method of claim 1, wherein generating the accident risk model includes determining the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.\n | 7. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 8. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology involves at least one of a vehicle self-braking functionality or a vehicle self-steering functionality.\n | 9. 
A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising: \none or more processors; \none or more program memories coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the computer system to: \npresent virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment; \ngenerate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data; \ngenerate an accident risk model indicating one or more risk levels for vehicles associated with the autonomous or semi-autonomous vehicle technology based upon the test responses; \nreceive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment, the actual accident data comprising data collected by a vehicle sensor; and \nadjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk level. \n | 10. The computer system of claim 9, wherein the executable instructions further cause the computer system to: \nidentify a customer vehicle having the autonomous or semi-autonomous vehicle control technology; and \ngenerate or update an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model. \n | 11. The computer system of claim 9, wherein: \nthe executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology. \n | 12. 
The computer system of claim 11, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.\n | 13. The computer system of claim 9, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.\n | 14. The computer system of claim 9, wherein the executable instructions further cause the computer system to: \ncommunicate to a customer computing device, via a communication network, information regarding all or a portion of an insurance policy to be presented to a customer associated with the customer vehicle for review via a display of the customer computing device. \n | 15. The computer system of claim 9, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 16. 
A tangible, non-transitory computer-readable medium storing executable instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology that, when executed by at least one processor of a computer system, cause the computer system to: \npresent virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment; \ngenerate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data; \ngenerate an accident risk model indicating one or more risk levels for vehicles associated with the autonomous or semi-autonomous vehicle technology based upon the test responses; \nreceive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment, the actual accident data comprising data collected by a vehicle sensor; and \nadjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk level. \n | 17. The tangible, non-transitory computer-readable medium of claim 16, wherein: \nthe executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology. \n | 18. The tangible, non-transitory computer-readable medium of claim 17, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.\n | 19. 
The tangible, non-transitory computer-readable medium of claim 16, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.\n | 20. The tangible, non-transitory computer-readable medium of claim 16, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.", "output": "The method involves generating test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data by the one or more processors (162). An accident risk model indicating one or more risk levels for vehicles (108) associated with the autonomous or semi-autonomous vehicle technology is generated based upon the test responses by the one or more processors. Actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment is received at the one or more processors, the actual accident data comprising data collected by a vehicle sensor. The accident risk model is adjusted based upon the actual accident data by adjusting at least one of the one or more risk levels by the one or more processors. INDEPENDENT CLAIMS are included for: a computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology; and a tangible, non-transitory computer-readable medium storing executable instructions. Computer-implemented method for evaluating effectiveness of autonomous or semi-autonomous vehicle technology such as driverless operation, accident avoidance or collision warning systems. 
The autonomous vehicle operation features either assist the vehicle operator to more safely or efficiently operate a vehicle or may take full control of vehicle operation under some or all circumstances. The risk assessment and premium determination for vehicle insurance policies covering vehicles with autonomous operation features is facilitated. The insurance premium for automobile insurance coverage or another cost associated with the insurance policy is presented through a display screen to a customer for review, acceptance, and/or approval. The drawing shows a block diagram of the computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes.108Vehicle 110Mobile device 114Client device 130Network 162Processor ", "instruction": "Please summarize the input" }, { "input": "Fully autonomous vehicle insurance pricingMethods and systems for determining risk associated with operation of fully autonomous vehicles are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including types and version of sensors, control systems, and software. This information may be used to determine a risk profile reflecting risk levels for a plurality of features, which may be based upon test data regarding the features or actual loss data. Expected use levels may further be determined and used with the risk profile to determine a total risk level associated with operation of the vehicle by the autonomous operation features. The expected use levels may indicate expected vehicle use, as well as traffic, weather, or other conditions in which the vehicle is likely to operate. The total risk level may be used to determine or adjust aspects of an insurance policy associated with the vehicle.What is claimed is: \n | 1. 
A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the computer system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to: \ndetermine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; \nreceive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip; \nreceive sensor data associated with the vehicle; \ndetermine a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate 
whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data; \ndetermine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions; \ndetermine types of one or more sensors installed in the vehicle based upon the sensor data associated with the vehicle; and \nadjust the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors in the vehicle. \n | 2. The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nestimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged based upon the log of usage data; and \nadjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the estimated future usage or operation of the vehicle, either by time or mileage, the vehicle is predicted to be operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged. \n | 3. 
The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nestimate future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and \nadjust the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is estimated to be operated in each of the plurality of weather and road conditions indicated by the usage data received in the log of usage data. \n | 4. The method of claim 1, wherein: \nthe risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features; \nthe test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios include vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and \nthe test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles. \n | 5. The system of claim 1, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.\n | 6. 
The system of claim 1, wherein the executable instructions further cause the one or more processors to: \ncause one or more of the following actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, or present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level. \n | 7. The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nreceive a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device; \ndetermine a premium associated with the vehicle insurance policy based upon the total risk level; and \npresent an option to purchase the vehicle insurance policy to the customer associated with the vehicle. \n | 8. The system of claim 1, wherein the log of usage data regarding the one or more autonomous operation features includes a version of autonomous operation feature control software that is currently installed on the vehicle or in the autonomous operation feature system mounted on the vehicle.\n | 9. The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nreceive information regarding a type and version of the one or more autonomous operation features; and \nupdate the total risk level associated with autonomous operation of the vehicle based upon the type and version of the one or more autonomous operation features. \n | 10. 
The system of claim 1, wherein the one or more autonomous operation features include a vehicle-to-vehicle (V2V) wireless communication capability, and wherein the executable instructions further cause the one or more processors to: \nreceive telematics data generated or broadcast from other vehicles; and \ngenerate and display alternate routes based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions. \n | 11. A computer-implemented method for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, comprising: \ndetermining, by one or more processors, a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; \nreceiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via wireless communication or data transmission transmitted from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation 
features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip; \ndetermining, by the one or more processors, a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions as determined from processor analysis of the usage data received in the log of usage data; \ndetermining, by the one or more processors, a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, and (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions; \ndetermining, via the one or more processors, types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; and \nadjusting, via the one or more processors, the total risk level associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle. \n | 12. 
The computer-implemented method of claim 11, the method comprising: \nestimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions and with each of the one or more autonomous operation features engaged or disengaged; and \nadjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is operated in each of the plurality of weather and road conditions with each of the one or more autonomous operation features engaged or disengaged indicated by the usage data received in the log of usage data. \n | 13. The computer-implemented method of claim 11, the method comprising: \nestimating future usage or operation of the vehicle, either by time or mileage, during each of the plurality of weather and road conditions; and \nadjusting, via the one or more processors, the total risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the amount of time or miles that the vehicle is expected to be operated in the future in each of the plurality of weather and road conditions for a given time period. \n | 14. 
The computer-implemented method of claim 11, wherein: \nthe risk profile associated with autonomous operation of the vehicle is based at least in part upon test result data generated from test units corresponding to the one or more autonomous operation features; \nthe test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios including vehicle operation with an autonomous feature engaged during each of the plurality of weather and road conditions; and \nthe test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles. \n | 15. The computer-implemented method of claim 11, wherein the risk profile associated with autonomous operation of the vehicle is based at least in part upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in each of the plurality of weather and road conditions.\n | 16. The computer-implemented method of claim 11, the method further comprising: \nadjusting, via the one or more processors, an insurance policy associated with the vehicle. \n | 17. The method of claim 11, further comprising: \nreceiving, at the one or more processors or the associated transceiver, a request for a quote of a premium associated with a vehicle insurance policy via wireless communication transmitted by the customer computing device; \ndetermining, by one or more processors, a premium associated with the vehicle insurance policy based upon the total risk level; and \npresenting, by one or more processors, an option to purchase the vehicle insurance policy to the customer associated with the vehicle. \n | 18. 
The method of claim 11, the method further comprising: \nreceiving, via the one or more processors or the associated transceiver, information regarding a type and version of the one or more autonomous operation features; and \nupdating the total risk level associated with autonomous operation of the vehicle, via the one or more processors, based upon the type and version of the one or more autonomous operation features. \n | 19. The method of claim 11, wherein the autonomous operation feature is a vehicle-to-vehicle (V2V) wireless communication capability, and the method comprises: \nreceiving, via one or more vehicle-mounted processors or associated transceiver, telematics data generated or broadcast from other vehicles; and \ngenerating and displaying alternate routes, via the one or more vehicle-mounted processors, based upon the telematics data received to facilitate safer vehicle travel and avoidance of bad weather, traffic, or road conditions. \n | 20. A computer system for monitoring usage of a vehicle having one or more autonomous operation features for controlling the vehicle, the system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to: \ndetermine a risk profile associated with operation of the vehicle that includes a plurality of risk levels associated with operation of the vehicle (i) under a plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle engaged, and (ii) under the plurality of weather and road operating conditions with the one or more autonomous operation features of the vehicle disengaged; \nreceive a log of usage data regarding previous use of the one or more autonomous operation features of the vehicle by a vehicle operator during the plurality of weather and road conditions, via 
wireless communication or data transmission from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current weather and road conditions during the vehicle trip; \ndetermine from analysis of the usage data received in the log of usage data a plurality of expected use levels of the vehicle during the plurality of weather and road operating conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions; \ndetermine from analysis of the usage data received in the log of usage data, or from analysis of other vehicle or telematics data received from the vehicle or mobile device, an average amount of time or miles that the vehicle operator operates the vehicle during each of the plurality of weather and road operating conditions for a period of time; \ndetermine a total risk level associated with overall operation of the vehicle based at least in part upon (a) the determined risk profile, (b) the determined expected use levels that indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during each of the plurality of weather and road operating conditions, and (c) the average amount of time or miles that the vehicle operator operates the vehicle during 
each of the plurality of weather and road operating conditions for the period of time to facilitate more accurate risk assessment and auto insurance pricing; \ndetermine types of one or more sensors installed in the vehicle based upon sensor data received from the vehicle; and \nadjust the total risk associated with autonomous operation of the vehicle based at least in part upon the types of sensors installed in the vehicle.", "output": "The system has transceivers coupled to processors (162), and program memories (160) coupled to the processors and storing executable instructions that cause the processors to determine a total risk level associated with overall operation of a vehicle based upon a determined risk profile, and the determined expected use levels that indicate whether or not a vehicle operator is expected to engage or disengage autonomous operation features during each of weather and road operating conditions. The processors determine types of sensors installed in the vehicle based on the sensor data associated with the vehicle, and adjust the total risk levels associated with autonomous operation of the vehicle. An INDEPENDENT CLAIM is included for a method for monitoring usage of vehicle. Computer system for monitoring usage of a vehicle i.e. autonomous vehicle. Can also be used for a semi-autonomous vehicle and a driverless vehicle. The risk assessment and premium determination for vehicle insurance policies covering vehicles with autonomous operation features can be facilitated. The driverless operation or accident avoidance can be achieved. The financial protection against physical damage and/or bodily injury resulting from traffic accidents and against liability can be provided. 
The drawing shows the block diagram of an exemplary computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes.100Autonomous vehicle insurance system 104Back-end components 110Mobile device 130Network 140Server 146Database 160Program memory 162Processor 164RAM ", "instruction": "Please summarize the input" }, { "input": "FULLY AUTONOMOUS VEHICLE INSURANCE PRICINGMethods and systems for determining risk associated with operation of fully autonomous vehicles are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including types and version of sensors, control systems, and software. This information may be used to determine a risk profile reflecting risk levels for a plurality of features, which may be based upon test data regarding the features or actual loss data. Expected use levels may further be determined and used with the risk profile to determine a total risk level associated with operation of the vehicle by the autonomous operation features. The expected use levels may indicate expected vehicle use, as well as traffic, weather, or other conditions in which the vehicle is likely to operate. The total risk level may be used to determine or adjust aspects of an insurance policy associated with the vehicle.|1. 
A computer system for monitoring usage of a vehicle having one or more autonomous operation features, comprising one or more processors and one or more program memories storing executable instructions that cause the one or more processors to: \ndetermine a risk profile associated with operation of the vehicle (i) under operating environment conditions with the one or more autonomous operation features engaged, and (ii) under operating environment conditions with the one or more autonomous operation features disengaged; \nreceive a log of usage data regarding previous use of the one or more autonomous operation features by a vehicle operator during the operating environment conditions; \ndetermine a plurality of expected use levels of the vehicle during the operating environment conditions, including whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during the operating environment conditions as determined from processor analysis of the log of usage data; \ndetermine a risk level associated with operation of the vehicle based upon (a) the determined risk profile, and (b) the determined expected use levels; and \ncause the one or more processors to automatically perform an action based upon the determined risk level, wherein the action includes one or more of: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined risk level to a reviewer via a display. \n | 2. 
The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nestimate future usage or operation of the vehicle during the operating environment conditions and with each of the one or more autonomous operation features engaged or disengaged based upon the log of usage data; and \nadjust the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the estimated future usage or operation of the vehicle. \n | 3. The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nestimate future usage or operation of the vehicle, either by time or mileage, during the operating environment conditions; and \nadjust the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is estimated to be operated in the operating environment conditions as indicated by the log of usage data. \n | 4. The method of claim 1, wherein: \nthe risk profile associated with autonomous operation of the vehicle is based upon test result data generated from test units corresponding to the one or more autonomous operation features; \nthe test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios include vehicle operation with an autonomous feature engaged during the operating environment conditions; and \nthe test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles. \n | 5. 
The system of claim 1, wherein the risk profile associated with autonomous operation of the vehicle is based upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in the operating environment conditions.\n | 6. The system of claim 1, wherein receiving the log of usage data comprises receiving the log via wireless communication from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current operating environment conditions during the vehicle trip.\n | 7. The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nreceive a request for a quote of a premium associated with a vehicle insurance policy via wireless communication; \ndetermine a premium associated with the vehicle insurance policy based upon the risk level; and \npresent an option to purchase the vehicle insurance policy to the customer associated with the vehicle. \n | 8. The system of claim 1, wherein the log of usage data includes a version of autonomous operation feature control software that is currently installed on the vehicle or in the autonomous operation feature system mounted on the vehicle.\n | 9. 
The system of claim 1, wherein the executable instructions further cause the one or more processors to: \nreceive information regarding a type and version of the one or more autonomous operation features; and \nupdate the risk level associated with autonomous operation of the vehicle based upon the type and version of the one or more autonomous operation features. \n | 10. The system of claim 1, wherein the one or more autonomous operation features include a vehicle-to-vehicle (V2V) wireless communication capability, and wherein the executable instructions further cause the one or more processors to: \nreceive telematics data from other vehicles; and \ngenerate and display alternate routes based upon the received telematics data. \n | 11. A computer-implemented method for use in connection with a vehicle having one or more autonomous operation features, comprising: \ndetermining, by one or more processors, a risk profile associated with operation of the vehicle (i) under operating environment conditions with the one or more autonomous operation features engaged, and (ii) under the operating environment conditions with the one or more autonomous operation features disengaged; \nreceiving, at the one or more processors or an associated transceiver, a log of usage data regarding previous use of the one or more autonomous operation features by a vehicle operator during the operating environment conditions; \ndetermining, by the one or more processors, a plurality of expected use levels of the vehicle during the operating environment conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during the operating environment conditions as determined from processor analysis of the log of usage data; \ndetermining, by the one or more processors, a risk level associated with operation of the vehicle based upon (a) the determined risk profile, and (b) the determined 
expected use levels; and \ncausing the one or more processors to automatically perform an action based upon the determined total risk level, wherein the actions include one or more of: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined risk level to a reviewer via a display. \n | 12. The computer-implemented method of claim 11, the method comprising: \nestimating future usage or operation of the vehicle during the operating environment conditions and with the one or more autonomous operation features engaged or disengaged; and \nadjusting, via the one or more processors, the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) an amount of time or miles that the vehicle is operated in the operating environment conditions with the one or more autonomous operation features engaged or disengaged indicated by the log of usage data. \n | 13. The computer-implemented method of claim 11, the method comprising: \nestimating future usage or operation of the vehicle, either by time or mileage, during the operating environment conditions; and \nadjusting, via the one or more processors, the risk level for the vehicle based upon (1) the determined risk profile, (2) the determined expected use levels, and (3) the amount of time or miles that the vehicle is expected to be operated in the future in the operating environment conditions. \n | 14. 
The computer-implemented method of claim 11, wherein: \nthe risk profile associated with autonomous operation of the vehicle is based upon test result data generated from test units corresponding to the one or more autonomous operation features; \nthe test result data include responses of the test units to test inputs corresponding to test scenarios, the test scenarios including vehicle operation with an autonomous feature engaged during the operating environment conditions; and \nthe test result data are generated and recorded by the test units disposed within one or more test vehicles in response to sensor data from a plurality of sensors, and video recording devices, within the one or more test vehicles. \n | 15. The computer-implemented method of claim 11, wherein the risk profile associated with autonomous operation of the vehicle is based upon actual losses associated with insurance policies covering a plurality of other vehicles having at least one of the one or more autonomous operation features, the actual losses incurred through vehicle operation in the operating environment conditions.\n | 16. The computer-implemented method of claim 11, wherein receiving the log of usage data comprises receiving the log via wireless communication from a mobile device of the vehicle operator in communication with an on-board computer of the vehicle, wherein the mobile device generates the log of usage data from the data received from the on-board computer during a vehicle trip, the log of usage data including: a timestamp indicating a beginning of the vehicle trip, a timestamp indicating an end of the vehicle trip, one or more timestamps associated with engagement or disengagement of the one or more autonomous operation features, and configuration data associated with the one or more autonomous operation features when engaged, and wherein the log of usage data further includes current operating environment conditions.\n | 17. 
The method of claim 11, further comprising: \nreceiving, at the one or more processors or an associated transceiver, a request for a quote of a premium associated with a vehicle insurance policy via wireless communication; \ndetermining, by one or more processors, a premium associated with the vehicle insurance policy based upon the risk level; and \npresenting, by one or more processors, an option to purchase the vehicle insurance policy to the customer associated with the vehicle. \n | 18. The method of claim 11, the method further comprising: \nreceiving, via the one or more processors or an associated transceiver, information regarding a type and version of the one or more autonomous operation features; and \nupdating the risk level associated with autonomous operation of the vehicle, via the one or more processors, based upon the type and version of the one or more autonomous operation features. \n | 19. The method of claim 11, wherein the autonomous operation feature is a vehicle-to-vehicle (V2V) wireless communication capability, and the method comprises: \nreceiving, via one or more vehicle-mounted processors or associated transceiver, telematics data from other vehicles; and \ngenerating and displaying alternate routes, via the one or more vehicle-mounted processors, based upon the telematics data. \n | 20. 
A computer system for monitoring usage of a vehicle having one or more autonomous operation features, the system comprising one or more processors and one or more program memories storing executable instructions that cause the one or more processors to: \ndetermine a risk profile associated with operation of the vehicle (i) under a plurality of operating environment conditions with the one or more autonomous operation features engaged, and (ii) under the operating environment conditions with the one or more autonomous operation features disengaged; \nreceive a log of usage data regarding previous use of the one or more autonomous operation features during the operating environment conditions; \ndetermine from analysis of the log of usage data a plurality of expected use levels of the vehicle during the operating environment conditions, wherein the expected use levels indicate whether or not the vehicle operator is expected to engage or disengage the one or more autonomous operation features during the operating environment conditions; \ndetermine from analysis of the log of usage data, or from analysis of other vehicle or telematics data received from the vehicle or mobile device, an average amount of time or miles that the vehicle operator operates the vehicle during the operating environment conditions for a period of time; \ndetermine a risk level associated with operation of the vehicle based upon (a) the determined risk profile, (b) the determined expected use levels, and (c) the average amount of time or miles that the vehicle operator operates the vehicle during the operating environment conditions for the period of time; and \ncause the one or more processors to automatically perform an action based upon the determined total risk level, wherein the action includes one or more of: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk 
level to a reviewer via a display.", "output": "The computer system (100) has processors (162) and program memories (160) storing executable instructions that cause the processors to determine expected use levels of a vehicle (108) during operating environment conditions, including whether or not the vehicle operator is expected to engage or disengage the autonomous operation features during operating environment conditions as determined from processor analysis of log of usage data, and determine risk level associated with operation of the vehicle based upon determined risk profile, and determined expected use levels. The processors automatically perform an action based upon determined risk level. The action includes one or more of adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined risk level to a reviewer via a display. An INDEPENDENT CLAIM is included for a method for use in connection with a vehicle having one or more autonomous operation features. Computer system for monitoring usage of vehicle having autonomous operation features such as autonomous vehicle, semi-autonomous vehicle. The risk assessment and premium determination for vehicle insurance policies covering vehicles with autonomous operation features can be facilitated. The driverless operation or accident avoidance can be achieved. The financial protection against physical damage and/or bodily injury resulting from traffic accidents and against liability can be provided. The drawing shows a block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 
100Computer system 108Vehicle 110Mobile device 160Program memory 162Processor ", "instruction": "Please summarize the input" }, { "input": "Accident risk model determination using autonomous vehicle operating dataMethods and systems for evaluating the effectiveness of autonomous operation features of autonomous vehicles using an accident risk model are provided. According to certain aspects, an accident risk model may be determined using effectiveness information regarding autonomous operation features associated with a vehicle. The effectiveness information may indicate a likelihood of an accident for the vehicle and may include test data or actual loss data. Determining the likelihood of an accident may include determining risk factors for the features related to the ability of the features to make control decisions that successfully avoid accidents. The accident risk model may further include information regarding effectiveness of the features relative to location or operating conditions, as well as types and severity of accidents. The accident risk model may further be used to determine or adjust aspects of an insurance policy associated with an autonomous vehicle.What is claimed is: \n | 1. 
A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising: \nimplementing, by one or more processors, the autonomous or semi-autonomous vehicle technology within a virtual test environment configured to simultaneously test multiple autonomous or semi-autonomous vehicle technologies; \npresenting, by the one or more processors, virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \ngenerating, by the one or more processors, test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data; \ngenerating, by the one or more processors, an accident risk model indicating one or more risk levels for vehicle accidents associated with the autonomous or semi-autonomous vehicle technology based upon the test responses; \nreceiving, at the one or more processors, actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment; \nadjusting, by the one or more processors, the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model; \nidentifying, by the one or more processors, a customer vehicle having the autonomous or semi-autonomous vehicle control technology; and \ngenerating or updating, by the one or more processors, an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model. \n | 2. 
The computer-implemented method of claim 1, wherein: \ngenerating the test responses includes generating test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology; and \nthe one or more risk levels of the accident risk model are generated based in part upon compatibility of the test responses of the autonomous or semi-autonomous vehicle technology with the additional test responses of the other autonomous or semi-autonomous vehicle technology. \n | 3. The computer-implemented method of claim 2, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.\n | 4. The computer-implemented method of claim 1, wherein generating the accident risk model includes determining the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.\n | 5. The computer-implemented method of claim 1, further comprising: \ncausing, by the one or more processors, information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle via a display of a customer computing device for review. \n | 6. The computer-implemented method of claim 1, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 7. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology involves at least one of a vehicle self-braking functionality or a vehicle self-steering functionality.\n | 8. The computer-implemented method of claim 1, wherein the operating conditions are associated with one or more of the following: a road type, a time of day, or a weather condition.\n | 9. 
A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising: \none or more processors; \none or more program memories coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors, cause the computer system to: \nimplement the autonomous or semi-autonomous vehicle technology within a virtual test environment configured to simultaneously test multiple autonomous or semi-autonomous vehicle technologies; \npresent virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \ngenerate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data; \ngenerate an accident risk model indicating one or more risk levels for vehicle accidents associated with the autonomous or semi-autonomous vehicle technology based upon the test responses; \nreceive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment; \nadjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model; \nidentify a customer vehicle having the autonomous or semi-autonomous vehicle control technology; and \ngenerate or update an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model. \n | 10. 
The computer system of claim 9, wherein: \nthe executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology; and \nthe one or more risk levels of the accident risk model are generated based in part upon compatibility of the test responses of the autonomous or semi-autonomous vehicle technology with the additional test responses of the other autonomous or semi-autonomous vehicle technology. \n | 11. The computer system of claim 10, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.\n | 12. The computer system of claim 9, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.\n | 13. The computer system of claim 9, wherein the executable instructions further cause the computer system to: \ncommunicate to a customer computing device, via a communication network, information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle for review via a display of the customer computing device. \n | 14. The computer system of claim 9, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.\n | 15. 
A tangible, non-transitory computer-readable medium storing executable instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology that, when executed by at least one processor of a computer system, cause the computer system to: \nimplement the autonomous or semi-autonomous vehicle technology within a virtual test environment configured to simultaneously test multiple autonomous or semi-autonomous vehicle technologies; \npresent virtual test sensor data to the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment, wherein the virtual test sensor data simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \ngenerate test responses of the autonomous or semi-autonomous vehicle technology implemented within the virtual test environment in response to the virtual test sensor data; \ngenerate an accident risk model indicating one or more risk levels for vehicle accidents associated with the autonomous or semi-autonomous vehicle technology based upon the test responses; \nreceive actual accident data associated with accidents involving vehicles using the autonomous or semi-autonomous vehicle technology in a non-test environment; \nadjust the accident risk model based upon the actual accident data by adjusting at least one of the one or more risk levels of the accident risk model; \nidentify a customer vehicle having the autonomous or semi-autonomous vehicle control technology; and \ngenerate or update an insurance policy associated with the customer vehicle based upon the adjusted at least one of the one or more risk levels of the adjusted accident risk model. \n | 16. 
The tangible, non-transitory computer-readable medium of claim 15, wherein: \nthe executable instructions that cause the computer system to generate the test responses cause the computer system to generate test responses relative to additional test responses of another autonomous or semi-autonomous vehicle technology; and \nthe one or more risk levels of the accident risk model are generated based in part upon compatibility of the test responses of the autonomous or semi-autonomous vehicle technology with the additional test responses of the other autonomous or semi-autonomous vehicle technology. \n | 17. The tangible, non-transitory computer-readable medium of claim 16, wherein the compatibility of the test responses and the additional test responses is determined for a plurality of versions of the other autonomous or semi-autonomous vehicle technology.\n | 18. The tangible, non-transitory computer-readable medium of claim 15, wherein the executable instructions that cause the computer system to generate the accident risk model further cause the computer system to determine the one or more risk levels based upon an effectiveness metric associated with the autonomous or semi-autonomous vehicle technology calculated from the test responses.\n | 19. The tangible, non-transitory computer-readable medium of claim 15, further storing executable instructions that, when executed by at least one processor of the computer system, cause the computer system to: \ncause information regarding all or a portion of the insurance policy to be presented to a customer associated with the customer vehicle via a display of a customer computing device for review. \n | 20. 
The tangible, non-transitory computer-readable medium of claim 15, wherein the virtual test sensor data includes virtual test communication data simulating autonomous vehicle-to-vehicle communication data.", "output": "The method involves receiving actual accident data associated with accidents involving vehicles (108) at processors (162) using autonomous or semi-autonomous vehicle technology in a non-test environment, where the autonomous or semi-autonomous vehicle technology includes a vehicle self-braking functionality or vehicle self-steering functionality. An accident risk model is adjusted by the processor based on the actual accident data by adjusting one of the risk levels of the accident risk model. A customer vehicle including the autonomous or semi-autonomous vehicle control technology is identified by the processor. An insurance policy associated with the customer vehicle is generated or updated by the processor based on the adjusted risk level of the adjusted accident risk model. INDEPENDENT CLAIMS are also included for the following: a computer system; and a tangible, non-transitory computer-readable medium comprising a set of instructions for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology by a computer system during urban driving or motorway driving conditions. Method for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology e.g. vehicle self-braking functionality or vehicle self-steering functionality, by a computer system (all claimed) during urban driving or motorway driving conditions. The method enables assisting a vehicle operator to safely or efficiently operate the vehicle or to take full control of vehicle operation by providing autonomous vehicle operation features. 
The method enables monitoring driving experience and/or usage of the autonomous or semi-autonomous vehicle technology over small time-frames in real-time and periodically providing feedback to a driver and an insurance provider and/or to adjust the insurance policies or premiums. The method enables determining vehicle insurance premium by effectively evaluating the vehicle to avoid and/or mitigate crashes and/or extent to which driver's control of the vehicle is enhanced or replaced by vehicle's software and artificial intelligence. The drawing shows a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 102Front end components 104Back-end components 108Accidents involving vehicles 114Client device 162Processor
A computer-implemented method of analyzing abnormal traffic conditions, the method comprising: \ndetermining, via one or more processors, a risk level of an abnormal traffic condition detected in a vehicle operating environment; \ntransmitting data comprising the abnormal traffic condition to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component performs an action based upon a type of anomalous condition to modify the anomalous condition into an altered roadway condition with an adjusted risk level; and \ntransmitting, via the one or more processors, an electronic message to a nearby vehicle via wireless communication to alert the nearby vehicle of the altered roadway condition and to allow the nearby vehicle to determine whether to avoid or approach the altered roadway condition. \n | 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of a vehicle.\n | 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 4. The computer-implemented method of claim 1 further comprising updating, via the one or more processors, a risk averse profile associated with an operator of the nearby vehicle based upon whether the nearby vehicle was operated in a manner to avoid or approach the altered roadway condition.\n | 5. The computer-implemented method of claim 4, wherein the smart infrastructure component comprises a smart traffic light.\n | 6. 
The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 7. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.\n | 8. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 9. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the vehicle operating environment.\n | 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.\n | 11. 
A computer system configured to analyze abnormal traffic conditions, the computer system comprising one or more processors, the one or more processors configured to: \ndetermine, via one or more processors, a risk level of an abnormal traffic condition detected in a vehicle operating environment; \ntransmit data comprising the abnormal traffic condition to a smart infrastructure component within a proximity of the vehicle operating environment, wherein the smart infrastructure component performs an action based upon a type of anomalous condition to modify the anomalous condition into an altered roadway condition with an adjusted risk level; and \ntransmit, via the one or more processors, an electronic message to a nearby vehicle via wireless communication to alert the nearby vehicle of the altered roadway condition and to allow the nearby vehicle to determine whether to avoid or approach the altered roadway condition. \n | 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of a vehicle.\n | 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.\n | 14. The computer system of claim 11, the system further configured to update, via the one or more processors, a risk averse profile associated with an operator of the nearby vehicle based upon whether the nearby vehicle was operated in a manner to avoid or approach the altered roadway condition.\n | 15. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.\n | 16. 
The computer system of claim 11, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.\n | 17. The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.\n | 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 20. The computer system of claim 11, wherein the nearby vehicle travels to the vehicle operating environment.", "output": "The computer-implemented method involves determining a risk level of an abnormal traffic condition detected in a vehicle operating environment through several processors. Data comprising the abnormal traffic condition is transmitted to a smart infrastructure component (208) within a proximity of the vehicle operating environment. The smart infrastructure component performs an action based upon a type of anomalous condition to modify the anomalous condition into an altered roadway condition with an adjusted risk level. An electronic message is transmitted to a nearby vehicle (202a) through wireless communication to alert the nearby vehicle of the altered roadway condition and to allow the nearby vehicle to determine whether to avoid or approach the altered roadway condition through the processors. An INDEPENDENT CLAIM is included for a computer system configured to analyze abnormal traffic conditions. 
Computer-implemented method for analyzing abnormal traffic conditions such as an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or several other vehicles braking ahead of the vehicle. The data collected may be used to generate vehicle-usage profiles that more accurately reflect vehicle risk, or lack thereof, and facilitate more appropriate auto insurance pricing. The electronic message may then be transmitted through the vehicle's transceiver using wireless communication to the nearby vehicle to alert the nearby vehicles of the abnormal traffic condition and to allow the neighboring vehicles to avoid the abnormally occurring traffic condition. The drawing shows a block diagram of the system that collects telematics and/or other data, and uses V2x wireless communication to broadcast the data collected to other vehicles, mobile devices, remote servers, and smart infrastructure. 200Notification system 201Network 202aNearby vehicle 203Direct radio link 208Smart infrastructure component ", "instruction": "Please summarize the input" }, { "input": "Autonomous communication feature useMethods and systems for determining collision risk associated with operation of autonomous vehicles using autonomous communication are provided. According to certain aspects, autonomous operation features associated with a vehicle may be determined, including features associated with autonomous communication between vehicles or with infrastructure. This information may be used to determine collision risk levels for a plurality of features, which may be based upon test data regarding the features or actual collision data. Expected use levels and autonomous communication levels may further be determined and used with the collision risk levels to determine a total collision risk level associated with operation of the vehicle. 
The autonomous communication levels may indicate the types of communications, the levels of communication with other vehicles or infrastructure, or the frequency of autonomous communication.What is claimed is: \n | 1. A computer-implemented method for determining collision risk of one or more autonomous operation features of a vehicle, comprising: \nreceiving, by an autonomous communication feature of the one or more autonomous operation features, autonomous vehicle-to-vehicle communication data from one or more additional vehicles operating within communication range of the vehicle; \ncontrolling, by an on-board computer of the vehicle and the one or more autonomous operation features, operation of the vehicle using the one or more autonomous operation features and the received autonomous vehicle-to-vehicle communication data; \ncommunicating, from the on-board computer of the vehicle to one or more processors of a server via a communication network, information regarding the one or more autonomous operation features of the vehicle, including information regarding the autonomous communication feature of the vehicle and a log of vehicle operation data; \nreceiving, at the one or more processors of the server from the on-board computer of the vehicle via the communication network, the information regarding the one or more autonomous operation features of the vehicle; \ndetermining, by the one or more processors of the server, a plurality of collision risk levels associated with autonomous operation of the vehicle under a plurality of operating environments based upon the information regarding the one or more autonomous operation features; \ndetermining, by the one or more processors of the server, a plurality of expected use levels of the vehicle based upon entries in the log of vehicle operation data, wherein the expected use levels are associated with the plurality of operating environments; \ndetermining, by the one or more processors of the server, a plurality 
of autonomous communication levels within the plurality of operating environments associated with the plurality of expected use levels for the vehicle based upon locations and times associated with the operating environments during prior operation of the vehicle, wherein the autonomous communication levels indicate availability of each of a plurality of types of autonomous communication capability in other vehicles as a proportion of the other vehicles in corresponding operating environments that exhibit the types of autonomous communication capability; \ndetermining, by the one or more processors of the server, a total collision risk level associated with operation of the vehicle based at least in part upon the determined collision risk levels, the determined expected use levels, and the determined autonomous communication levels; and \ncausing, by the one or more processors of the server, one or more of the following actions to be performed based upon the determined total collision risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total collision risk level to a reviewer via a display of a reviewer computing device to verify the determined total collision risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. \n | 2. The method of claim 1, wherein the autonomous communication levels further include information relating to one or more of the following: levels of autonomous communication with infrastructure, or frequency of autonomous communications between the vehicle and the other vehicles.\n | 3. 
The method of claim 1, further comprising receiving, at one or more processors, information regarding previous use of the one or more autonomous operation features of the vehicle, and wherein the plurality of expected use levels are determined, at least in part, based upon the information regarding previous use of the one or more autonomous operation features.\n | 4. The method of claim 3, wherein the information regarding previous use of the autonomous operation features includes information regarding previous use of the autonomous communication feature.\n | 5. The method of claim 1, wherein the information regarding the one or more autonomous operation features of the vehicle is based upon (i) test results for test units corresponding to the one or more autonomous operation features, which test results include responses of the test units to test inputs corresponding to test scenarios, and (ii) actual collision data associated with a plurality of other vehicles having at least one of the one or more autonomous operation features.\n | 6. The method of claim 1, wherein the total collision risk level is determined without reference to factors relating to collision risk associated with a vehicle operator.\n | 7. The method of claim 1, further comprising: \nreceiving, at one or more processors, information regarding a vehicle operator; and \ndetermining, by one or more processors, an operator collision-risk profile associated with vehicle operation by the vehicle operator; \nwherein the total collision risk level is determined, at least in part, based upon the operator collision-risk profile. \n | 8. 
A computer system for determining collision risk of one or more autonomous operation features of a vehicle, comprising: \none or more processors; \nan autonomous communication feature of the one or more autonomous operation features, configured to receive autonomous vehicle-to-vehicle communication data from one or more additional vehicles operating within communication range of the vehicle; \nan on-board computer within the vehicle, configured to control operation of the vehicle using the one or more autonomous operation features and the received autonomous vehicle-to-vehicle communication data; \none or more communication modules adapted to communicate data from the on-board computer to the one or more processors via a communication network; and \na program memory coupled to the one or more processors and storing executable instructions that when executed by the one or more processors cause the computer system to: \nreceive, via the communication network, information regarding the one or more autonomous operation features of the vehicle, including information regarding the autonomous communication feature of the vehicle and a log of vehicle operation data; \ndetermine a plurality of collision risk levels associated with autonomous operation of the vehicle under a plurality of operating environments based upon the information regarding the one or more autonomous operation features; \ndetermine a plurality of expected use levels of the vehicle based upon entries in the log of vehicle operation data, wherein the expected use levels are associated with the plurality of operating environments; \ndetermine a plurality of autonomous communication levels within the plurality of operating environments associated with the plurality of expected use levels for the vehicle based upon locations and times associated with the operating environments during prior operation of the vehicle, wherein the autonomous communication levels indicate availability of each of a plurality of 
types of autonomous communication capability in other vehicles as a proportion of the other vehicles in corresponding operating environments that exhibit the types of autonomous communication capability; \ndetermine a total collision risk level associated with operation of the vehicle based at least in part upon the determined collision risk levels, the determined expected use levels, and the determined autonomous communication levels; and \ncause one or more of the following actions to be performed based upon the determined total collision risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total collision risk level to a reviewer via a display of a reviewer computing device to verify the determined total collision risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. \n | 9. The computer system of claim 8, wherein the autonomous communication levels further include information relating to one or more of the following: levels of autonomous communication with infrastructure, or frequency of autonomous communications between the vehicle and the other vehicles.\n | 10. The computer system of claim 8, wherein the executable instructions further cause the computer system to receive information regarding previous use of the one or more autonomous operation features of the vehicle, and wherein the plurality of expected use levels are determined, at least in part, based upon the information regarding previous use of the one or more autonomous operation features.\n | 11. The computer system of claim 10, wherein the information regarding previous use of the autonomous operation features includes information regarding previous use of the autonomous communication feature.\n | 12. 
The computer system of claim 8, wherein the information regarding the one or more autonomous operation features of the vehicle is based upon (i) test results for test units corresponding to the one or more autonomous operation features, which test results include responses of the test units to test inputs corresponding to test scenarios, and (ii) actual collision data associated with a plurality of other vehicles having at least one of the one or more autonomous operation features.\n | 13. The computer system of claim 8, wherein the total collision risk level is determined without reference to factors relating to collision risks associated with a vehicle operator.\n | 14. The computer system of claim 8, wherein the executable instructions further cause the computer system to: \nreceive information regarding a vehicle operator; and \ndetermine an operator collision-risk profile associated with vehicle operation by the vehicle operator; \nwherein the total collision risk level is determined, at least in part, based upon the operator collision-risk profile. \n | 15. 
A tangible, non-transitory computer-readable medium storing instructions for determining collision risk of one or more autonomous operation features of a vehicle that, when executed by at least one processor of a computer system, cause the computer system to: \nreceive autonomous vehicle-to-vehicle communication data from one or more additional vehicles operating within communication range of the vehicle by an autonomous communication feature of the one or more autonomous operation features; \ncontrol operation of the vehicle using the one or more autonomous operation features and the received autonomous vehicle-to-vehicle communication data by an on-board computer of the vehicle and the one or more autonomous operation features; \ncommunicate information regarding the one or more autonomous operation features of the vehicle, including information regarding the autonomous communication feature of the vehicle and a log of vehicle operation data, from the on-board computer of the vehicle to a server via a communication network; \nreceive the information regarding the one or more autonomous operation features of the vehicle at the server from the on-board computer; \ndetermine a plurality of collision risk levels associated with autonomous operation of the vehicle under a plurality of operating environments based upon the information regarding the one or more autonomous operation features; \ndetermine a plurality of expected use levels of the vehicle based upon entries in the log of vehicle operation data, wherein the expected use levels are associated with the plurality of operating environments; \ndetermine a plurality of autonomous communication levels within the plurality of operating environments associated with the plurality of expected use levels for the vehicle based upon locations and times associated with the operating environments during prior operation of the vehicle, wherein the autonomous communication levels indicate availability of each of a plurality of 
types of autonomous communication capability in other vehicles as a proportion of the other vehicles in corresponding operating environments that exhibit the types of autonomous communication capability; \ndetermine a total collision risk level associated with operation of the vehicle based at least in part upon the determined collision risk levels, the determined expected use levels, and the determined autonomous communication levels; and \ncause one or more of the following actions to be performed based upon the determined total collision risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total collision risk level to a reviewer via a display of a reviewer computing device to verify the determined total collision risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. \n | 16. The tangible, non-transitory computer-readable medium of claim 15, wherein the autonomous communication levels further include information relating to one or more of the following: levels of autonomous communication with infrastructure, or frequency of autonomous communications between the vehicle and the other vehicles.\n | 17. The tangible, non-transitory computer-readable medium of claim 15, further comprising executable instructions that, when executed by at least one processor of a computer system, cause the computer system to receive information regarding previous use of the one or more autonomous operation features of the vehicle, and wherein the plurality of expected use levels are determined, at least in part, based upon the information regarding previous use of the one or more autonomous operation features.\n | 18. 
The tangible, non-transitory computer-readable medium of claim 17, wherein the information regarding previous use of the autonomous operation features includes information regarding previous use of the autonomous communication feature.", "output": "The method involves determining a total collision risk level associated with operation of a vehicle (108) based upon collision risk levels, expected use levels, and autonomous communication levels by processors (162) of a server (140). Following actions are caused to be performed based upon the determined total collision risk level by the processors of the server such that an insurance policy associated with the vehicle is adjusted, a coverage level associated with the insurance policy is determined, information regarding the determined total collision risk level is presented to a reviewer by a display of a reviewer computing device to verify the determined total collision risk level, or determination to a customer by a display of a customer computing device for review of adjustment to the insurance policy associated with the vehicle is presented. INDEPENDENT CLAIMS are also included for the following:a computer system for determining collision risk of autonomous operation features of a vehiclea tangible non-transitory computer-readable medium comprising a set of instructions for determining collision risk of autonomous operation features of a vehicle. Method for determining collision risk of autonomous operation features of a vehicle i.e. smart car. The method enables allowing near real-time uploads and downloads of information and periodic uploads and downloads of information. The method enables providing autonomous vehicle operation features to assist the vehicle operator to safely or efficiently operate the vehicle or take full control of vehicle operation under part or all circumstances. 
The method enables monitoring driving experience and/or usage of the autonomous or semi-autonomous vehicle technology in real time, small timeframes, and/or periodically to provide feedback to the driver, insurance provider, and/or adjust insurance policies or premiums. The method enables determining automobile insurance premium by effectively evaluating the vehicle's ability to avoid and/or mitigate crashes and/or the extent to which driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The drawing shows a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 100Autonomous vehicle insurance system 102Front-end components 104Back-end components 108Vehicle 130Network 140Server 162Processors", "instruction": "Please summarize the input" }, { "input": "Accident fault determination for autonomous vehiclesMethods and systems for determining fault for an accident involving a vehicle having one or more autonomous (and/or semi-autonomous) operation features are provided. According to certain aspects, operating data from sensors within or near the vehicle may be used to determine fault for a vehicle accident, such as a collision. The operating data may include information regarding use of the features at the time of the accident and may further be used to determine an allocation of fault for the accident between a vehicle operator, the autonomous operation features, or a third party. The allocation of fault may be used to determine and/or adjust coverage levels for an insurance policy associated with the vehicle. The allocation of fault may further be used to adjust risk levels or profiles associated with the vehicle operator or with the autonomous operation features.What is claimed is: \n | 1. 
A computer system for reconstructing a vehicle crash, the computer system comprising one or more processors, one or more transceivers coupled to the one or more processors, and one or more program memories coupled to the one or more processors and storing executable instructions that cause the one or more processors to: \nreceive vehicle operating data for a vehicle having one or more autonomous operation features for controlling the vehicle, the vehicle operating data being generated and transmitted by an on-board computer or mobile device using wireless communication or data transmission, wherein the vehicle operating data includes: \n(i) sensor data from one or more vehicle-mounted sensors associated with the one or more autonomous operation features, the sensor data also indicating a configuration or setting of each autonomous operation feature before and during the vehicle crash; and \n(ii) a recorded log of decisions made by the one or more autonomous operation features, and commands sent from the on-board computer to control components to operate the vehicle, before and during the vehicle crash; \nreceive an indication of an accident involving the vehicle, or otherwise determine that the vehicle has been involved in the accident based upon processor analysis of the (i) sensor data and (ii) recorded log received; \ngenerate a crash reconstruction representing a sequence of events involved in the accident by automatically determining, for each of a plurality of times in the sequence of events, each of the following: (i) a location of the vehicle based upon the sensor data, (ii) a location of an obstruction involved in the accident based upon the sensor data, and (iii) a movement of the vehicle based upon the decisions in the recorded log; \ndetermine an allocation of fault for the accident for each of the one or more autonomous operation features based at least in part upon the crash reconstruction; and \ncause one or more of the following actions to be 
performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. \n | 2. The computer system of claim 1, wherein the one or more processors analyze the vehicle operating data received to determine an extent of vehicle damage, or a cost to repair the damage or replace part or all of the vehicle, the vehicle operating data including video or image data.\n | 3. The computer system of claim 1, wherein determining an allocation of fault for the accident for the one or more autonomous operation features further includes the one or more processors analyzing data generated by the vehicle-mounted sensors or cameras depicting a vehicle environment and data from the sensors regarding the response of the vehicle to its environment prior to, or during, the vehicle crash.\n | 4. The computer system of claim 1, wherein determining an allocation of fault for the accident for the one or more autonomous operation features further includes the one or more processors analyzing wireless communications or data transmissions to and from the vehicle, including vehicle-to-vehicle or infrastructure-to-vehicle communications.\n | 5. 
The computer system of claim 1, wherein the one or more vehicle-mounted sensors include one or more of a GPS (Global Positioning System) unit, a radar unit, a LIDAR unit, an ultrasonic sensor, an infrared sensor, a camera, an accelerometer, a tachometer, or a speedometer, and at least one sensor is configured to actively or passively scan a vehicle environment of the vehicle for obstacles, including other vehicles, buildings, and pedestrians.\n | 6. The computer system of claim 1, wherein the one or more vehicle-mounted sensors include one or more of an ignition sensor, an odometer, a system clock, a speedometer, a tachometer, an accelerometer, a gyroscope, a compass, a geolocation or GPS unit, a camera, or a distance sensor.\n | 7. The computer system of claim 1, the one or more processors further configured to: \ndetermine an allocation of fault for the accident based, at least in part, upon whether or not the vehicle was being operated in accordance with optimal use levels for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon road, weather, or traffic conditions at the time of, or prior to, the vehicle crash, the optimal use levels being associated with a lowest risk of vehicle crash. \n | 8. 
The computer system of claim 1, the one or more processors further configured to: \ndetermine an optimal use level for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon current road, weather, or traffic conditions; \ncompare the optimal use level with a current actual use level for the variety of combinations of configurations and settings associated with the one or more autonomous operation features; and \nif the optimal use level differs from the current actual use level, generate and transmit an electronic notification to the vehicle or vehicle operator's mobile device recommending that the optimal use level be used. \n | 9. The computer system of claim 1, wherein determining the allocation of fault further includes determining, by a processor, a point of impact on the vehicle, or an indication of a state of one or more traffic signals before, or during, the vehicle crash.\n | 10. The computer system of claim 1, the one or more processors further configured to receive data indicating engagement of at least one of the one or more autonomous operation features before the vehicle crash; and \ndetermining the allocation of fault for the vehicle crash includes the one or more processors analyzing whether the autonomous operation feature failed to take appropriate control actions or whether control signals were ineffective in controlling the vehicle immediately prior to the vehicle crash. \n | 11. The computer system of claim 1, the one or more processors further configured to receive data indicating engagement or disengagement of the one or more autonomous operation features before the vehicle crash; and \nwherein determining the allocation of fault for the vehicle crash includes the one or more processors analyzing whether the vehicle had time to take action to avoid the accident but that action was not taken. \n | 12. 
The computer system of claim 1, the one or more processors and transceivers further configured to receive data indicating engagement or disengagement of the one or more autonomous operation features before the vehicle crash; and \nwherein determining the allocation of fault for the vehicle crash includes the one or more processors determining that autonomous operation of the vehicle prior to the vehicle crash was no longer feasible due to conditions in a vehicle environment of the vehicle. \n | 13. The computer system of claim 1, the one or more processors and transceivers configured to receive data indicating engagement of the one or more autonomous operation features before the vehicle crash; and \nwherein determining the allocation of fault for the vehicle crash includes the one or more processors determining whether the one or more autonomous operation features attempted to return control of the vehicle to the vehicle operator prior to the vehicle crash and whether or not an adequate period of time for transition was available prior to the vehicle crash. \n | 14. The computer system of claim 1, wherein the vehicle operating data received via wireless communication further includes telematics data indicating vehicle operation before and during a vehicle crash, including vehicle speed, heading, acceleration, and braking; and \nthe one or more processors are configured to (1) determine that the vehicle has been involved in the accident based upon processor analysis of the (i) sensor data, (ii) recorded log received, and (iii) telematics data; and (2) determine an allocation of fault for the accident for the one or more autonomous operation features based at least in part upon the received (i) sensor data, (ii) recorded log received, and (iii) telematics data. \n | 15. 
A computer-implemented method for reconstructing a vehicle crash, comprising: \nreceiving, via one or more processors or an associated transceiver, vehicle operating data for a vehicle having one or more autonomous operation features for controlling the vehicle, the vehicle operating data being generated and transmitted by an on-board computer or mobile device using wireless communication or data transmission, wherein the vehicle operating data includes: \n(i) sensor data from one or more vehicle-mounted sensors associated with the one or more autonomous operation features, the sensor data also indicating a configuration or setting of each autonomous operation feature before and during the vehicle crash; and \n(ii) a recorded log of decisions made by the one or more autonomous operation features, and commands sent from the on-board computer to control components to operate the vehicle, before and during the vehicle crash; \nreceiving, via the one or more processors or associated transceiver, an indication of an accident involving the vehicle, or otherwise determining, via the one or more processors, that the vehicle has been involved in the accident based upon processor analysis of the (i) sensor data, and (ii) recorded log received; \ngenerating, by the one or more processors, a crash reconstruction representing a sequence of events involved in the accident by automatically determining, for each of a plurality of times in the sequence of events, each of the following: (i) a location of the vehicle based upon the sensor data, (ii) a location of an obstruction involved in the accident based upon the sensor data, and (iii) a movement of the vehicle based upon the decisions in the recorded log; \ndetermining, by the one or more processors, an allocation of fault for the accident for the one or more autonomous operation features based at least in part upon the crash reconstruction; and \ncausing, by the one or more processors, one or more of the following 
actions to be performed based upon the determined total risk level: adjust an insurance policy associated with the vehicle, determine a coverage level associated with the insurance policy, present information regarding the determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level, or present the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. \n | 16. The computer-implemented method of claim 15, wherein, the one or more processors analyze the vehicle operating data received to determine an extent of vehicle damage, or cost to repair the damage or replace part or all of the vehicle, the vehicle operating data including video or image data.\n | 17. The computer-implemented method of claim 15, wherein determining, by the one or more processors, an allocation of fault for the accident for the one or more autonomous operation features further includes processor analysis of data generated by the vehicle-mounted sensors or cameras depicting a vehicle environment and data from the sensors regarding the response of the vehicle to its environment prior to, and during, the vehicle crash.\n | 18. The computer-implemented method of claim 15, wherein determining, by the one or more processors, an allocation of fault for the accident for the one or more autonomous operation features further includes processor analysis of wireless communications or data transmissions to and from the vehicle, including vehicle-to-vehicle or infrastructure-to-vehicle communications.\n | 19. 
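The reconstruction step of claim 15, which pairs sensed vehicle and obstruction locations with the decisions in the recorded log at each of a plurality of times, might look like this in outline (all structures and names are hypothetical):

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Event:
    t: float
    vehicle_pos: Tuple[float, float]      # from sensor data
    obstruction_pos: Tuple[float, float]  # from sensor data
    decision: str                         # from the recorded log

def reconstruct(sensor_frames: Dict[float, dict],
                decision_log: Dict[float, str]) -> List[Event]:
    """Build a time-ordered sequence of events by joining sensor
    frames with the on-board decision log at each timestamp."""
    timeline = []
    for t in sorted(sensor_frames):
        frame = sensor_frames[t]
        timeline.append(Event(
            t=t,
            vehicle_pos=frame["vehicle"],
            obstruction_pos=frame["obstruction"],
            decision=decision_log.get(t, "none"),  # log may be sparse
        ))
    return timeline
```

Fault allocation would then analyze this timeline, e.g. checking whether a decision to brake or steer was recorded before the positions converge.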
The computer-implemented method of claim 15, the method comprising: \ndetermining, by the one or more processors, an allocation of fault for the accident based, at least in part, upon whether or not the vehicle was being operated in accordance with optimal use levels for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon road, weather, or traffic conditions at the time of, or prior to, the vehicle crash, the optimal use levels being associated with a lowest risk of vehicle crash. \n | 20. The computer-implemented method of claim 15, the method comprising: \ndetermining, by the one or more processors, an optimal use level for a variety of combinations of configurations and settings associated with the one or more autonomous operation features based upon current road, weather, or traffic conditions; \ncomparing the optimal use level with a current actual use level for the variety of combinations of configurations and settings associated with the one or more autonomous operation features; and if the optimal use level differs from the current actual use level, generating and transmitting a notification to the vehicle or vehicle operator's mobile device recommending that the optimal use level be used.", "output": "The system has a processor for determining an allocation of fault for accident for set of autonomous operation features based on crash reconstruction. The processor causes set of following actions to be performed based on determined total risk level, adjusts an insurance policy associated with a vehicle, determines coverage level associated with insurance policy, presents information regarding determined total risk level to a reviewer via a display of a reviewer computing device to verify the determined total risk level or presents the determination to a customer via a display of a customer computing device for review of an adjustment to the insurance policy associated with the vehicle. 
An INDEPENDENT CLAIM is also included for a method for reconstructing a vehicle crash. Computer system for reconstructing a vehicle crash, e.g. an autonomous vehicle crash. The system reduces risks associated with transferring control of a vehicle to a vehicle operator, utilizes a server to allocate fault for the accident to the set of autonomous operation features, and adjusts risk levels and/or risk profiles associated with the set of autonomous operation features. The system increases autonomous operation feature performance by facilitating near real-time uploads and downloads of information. The drawing is a schematic block diagram of a computer network, a computer server, a mobile device, and an on-board computer for implementing autonomous vehicle operation, monitoring, evaluation, and insurance processes. 100 Autonomous vehicle insurance system; 102 Front-end component; 104 Back-end component; 110 Mobile device; 114 Communication component", "instruction": "Please summarize the input" }, { "input": "Vehicular traffic alerts for avoidance of abnormal traffic conditions. Methods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle. An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle. What is claimed is: \n | 1. 
A computer-implemented method of generating a vehicle traffic alert and updating a vehicle-usage profile, the method comprising: \ndetecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a first vehicle; \ngenerating, via the one or more processors, an electronic message regarding the abnormal traffic condition; \ntransmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; \nreceiving, via the one or more processors, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and \nupdating, via the one or more processors, a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle. \n | 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.\n | 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 4. The computer-implemented method of claim 1, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.\n | 5. 
The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.\n | 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the first vehicle.\n | 9. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component: \nanalyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and \nperforms an action based upon the type of anomalous condition in order to modify the anomalous condition. \n | 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.\n | 11. 
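Claim 10's behavior, in which a nearby vehicle ignores an electronic message whose location information places the abnormal traffic condition beyond a threshold distance, can be illustrated with a toy filter; the haversine formula and the 5 km default are implementation assumptions, not from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class TrafficAlert:
    condition: str  # e.g. "ice", "erratic vehicle", "road construction"
    lat: float      # GPS location of the abnormal traffic condition
    lon: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS coordinates, in km."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_act_on(alert: TrafficAlert, my_lat: float, my_lon: float,
                  threshold_km: float = 5.0) -> bool:
    # Ignore alerts whose condition lies beyond the threshold distance.
    return haversine_km(alert.lat, alert.lon, my_lat, my_lon) <= threshold_km
```

A receiving vehicle would run this check before rerouting or alerting its operator.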
A computer system configured to generate a vehicle traffic alert and update a vehicle-usage profile, the computer system comprising one or more processors, the one or more processors configured to: \ndetect that an abnormal traffic condition exists in an operating environment of a first vehicle; \ngenerate an electronic message regarding the abnormal traffic condition; \ntransmit the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; \nreceive telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and \nupdate a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle. \n | 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.\n | 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.\n | 14. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.\n | 15. The computer system of claim 11, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle.\n | 16. The computer system of claim 11, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.\n | 17. 
The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.\n | 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 20. The computer system of claim 11, wherein the nearby vehicle travels to the operating environment of the first vehicle.", "output": "The method involves detecting (1104) that an abnormal traffic condition exists in an operating environment of a first vehicle through a processor. An electronic message is generated (1106) regarding abnormal traffic condition. The electronic message transmitted to a nearby vehicle. The electronic message is transmitted (1108) through wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The telematics data regarding operation of the nearby vehicle is received after the nearby vehicle received the electronic message. A vehicle-usage profile associated with the nearby vehicle is updated based upon the received telematics data regarding operation of the nearby vehicle. An INDEPENDENT CLAIM is included for a system configured to generate a vehicle traffic alert and update a vehicle-usage profile. Method for generating vehicle traffic alert and updating vehicle-usage profile. The travel recommendations reduce or lower risk and enhance driver or vehicle safety. The insurance policies are adjusted, generated and updated. The drawing shows a flow diagram of a traffic condition broadcast method. 
1100 Method for traffic condition broadcast; 1102 Step for collecting sensor data regarding a vehicle operating environment from sensors; 1104 Step for detecting that an abnormal traffic condition exists in an operating environment; 1106 Step for generating an electronic message regarding abnormal traffic condition; 1108 Step for transmitting the electronic message through wireless communication to alert the nearby vehicle of the abnormal traffic condition", "instruction": "Please summarize the input" }, { "input": "Method and system for enhancing the functionality of a vehicle. Methods and systems for enhancing the functionality of a semi-autonomous vehicle are described herein. The semi-autonomous vehicle may receive a communication from a fully autonomous vehicle within a threshold distance of the semi-autonomous vehicle. If the vehicles are travelling on the same route or the same portion of a route, the semi-autonomous vehicle may navigate to a location behind the fully autonomous vehicle. Then the semi-autonomous vehicle may operate autonomously by replicating one or more functions performed by the fully autonomous vehicle. The functions and/or maneuvers performed by the fully autonomous vehicle may be detected via sensors in the semi-autonomous vehicle and/or may be identified by communicating with the fully autonomous vehicle to receive indications of upcoming maneuvers. In this manner, the semi-autonomous vehicle may act as a fully autonomous vehicle. What is claimed is: \n | 1. 
A computer-implemented method for enhancing the functionality of a vehicle, comprising: \nbroadcasting, via one or more processors and/or associated transceivers of a semi-autonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication; \nreceiving, at the one or more processors and/or associated transceivers of the semi-autonomous vehicle via vehicle-to-vehicle communication, an indication directly from several autonomous vehicles that each autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle; \nselecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and \nfor a portion of the route, causing, by the one or more processors, the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the autonomous vehicle. \n | 2. The computer-implemented method of claim 1, wherein the one or more processors periodically re-verify that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.\n | 3. 
The computer-implemented method of claim 1, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from a vehicle operator to operate.\n | 4. The computer-implemented method of claim 3, wherein the semi-autonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.\n | 5. The computer-implemented method of claim 1, wherein the semi-autonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.\n | 6. The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes: \nreceiving, at the one or more processors, an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and \ncausing, by the one or more processors, the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location. \n | 7. The computer-implemented method of claim 6, further comprising: \nreceiving, at the one or more processors, an indication of a speed at which the selected autonomous vehicle is travelling; and \ncausing, by the one or more processors, the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed. \n | 8. The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes: \ndetecting, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and \ncausing, by the one or more processors, the semi-autonomous vehicle to perform a same maneuver as the detected maneuver. \n | 9. 
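Claims 6 and 8 describe the semi-autonomous vehicle performing an upcoming maneuver at an indicated time or location received from the leading autonomous vehicle. A minimal sketch of that replay loop, with hypothetical names and a distance-along-route trigger, might be:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Maneuver:
    kind: str            # e.g. "turn_left", "brake", "lane_change"
    at_position: float   # distance along the shared route, in metres

class Follower:
    """Queues maneuver indications from the leader and replays each one
    once the follower reaches the indicated position."""

    def __init__(self):
        self.pending: List[Maneuver] = []
        self.performed: List[str] = []

    def on_leader_indication(self, m: Maneuver) -> None:
        self.pending.append(m)

    def update(self, position: float) -> None:
        # Perform every queued maneuver whose trigger position is reached.
        due = [m for m in self.pending if m.at_position <= position]
        for m in due:
            self.performed.append(m.kind)
            self.pending.remove(m)
```

In a real system the `performed` list would instead drive actuation commands, and claim 7's speed margin would throttle the follower below the leader's reported speed.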
The computer-implemented method of claim 1, wherein a vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the autonomous vehicle; and \nwhen the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the method further includes causing, by the one or more processors, the semi-autonomous vehicle to operate without input from a vehicle operator. \n | 10. The computer-implemented method of claim 1, wherein selecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' route, respectively.\n | 11. A computer system configured to enhance the functionality of a vehicle, the computer system comprising one or more local or remote processors, transceivers, and/or sensors configured to: \nbroadcast, via a semi-autonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication; \nreceive, at the semi-autonomous vehicle via vehicle-to-vehicle communication, an indication directly from several fully autonomous or fully operational autonomous vehicles that each fully autonomous or fully operational autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle; \nselect, at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of 
the semi-autonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and \nfor a portion of the route, cause the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the selected autonomous vehicle. \n | 12. The computer system of claim 11, wherein the semi-autonomous vehicle periodically re-verifies that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.\n | 13. The computer system of claim 11, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from a vehicle operator to operate.\n | 14. The computer system of claim 13, wherein the semi-autonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.\n | 15. The computer system of claim 11, wherein the semi-autonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.\n | 16. The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to: \nreceive an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and \ncause the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location. \n | 17. 
The computer system of claim 16, wherein one or more local or remote processors, transceivers, and/or sensors are further configured to: \nreceive an indication of a speed at which the selected autonomous vehicle is travelling; and \ncause the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed. \n | 18. The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to: \ndetect, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and \ncause the semi-autonomous vehicle to perform a same maneuver as the detected maneuver. \n | 19. The computer system of claim 11, wherein a vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the selected autonomous vehicle; and \nwhen the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to cause the semi-autonomous vehicle to operate without input from a vehicle operator. \n | 20. 
The computer system of claim 11, wherein selecting, at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' routes, respectively.", "output": "The method involves broadcasting a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle (108) through vehicle-to-vehicle wireless communication, through processors and/or associated transceivers of a semi-autonomous vehicle having autonomous operation features. An autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle is selected, at the processors of the semi-autonomous vehicle, based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle. For a portion of the route, the semi-autonomous vehicle is caused by the processors to follow the selected autonomous vehicle and mimic each maneuver performed by the autonomous vehicle. An INDEPENDENT CLAIM is included for a computer system configured to enhance the functionality of a vehicle. Computer-based method for enhancing functionality of vehicle by caravanning with fully autonomous vehicles. The data application facilitates data communication between the front-end components and the back-end components, as well as more efficient processing and data storage. The automobile insurance premium may be determined by evaluating how effectively the vehicle may be able to avoid and/or mitigate crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. 
The drawing shows a block diagram of an autonomous vehicle data system for autonomous vehicle operation, monitoring, communication and related functions. 100 Autonomous vehicle data system; 108 Semi-autonomous vehicle; 110 Mobile devices; 120 Sensors; 130 Network", "instruction": "Please summarize the input" }, { "input": "Autonomous vehicle insurance pricing and offering based upon accident risk. Methods and systems for monitoring use, determining risk, and pricing insurance policies for an autonomous vehicle having one or more autonomous operation features are provided. According to certain aspects, accident risk factors may be determined for autonomous operation features of the vehicle using information regarding the autonomous operation features of the vehicle or other accident related factors associated with the vehicle. The accident risk factors may indicate the ability of the autonomous operation features to avoid accidents during operation, particularly without vehicle operator intervention. The accident risk levels determined for a vehicle may further be used to determine or adjust aspects of an insurance policy associated with the vehicle. What is claimed is: \n | 1. 
A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, the method comprising: \ngenerating, by one or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies, test results for the autonomous or semi-autonomous vehicle technology, wherein the computing systems generate the test results as hardware or software responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data that simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \nreceiving, at one or more processors, information regarding the test results; \ndetermining, by one or more processors, an indication of reliability of the autonomous or semi-autonomous vehicle technology based upon the test results, including compatibility of the autonomous or semi-autonomous vehicle technology with the at least one additional autonomous or semi-autonomous vehicle technologies tested; \ndetermining, by one or more processors, an accident risk factor based upon the received information regarding the test results and the indication of reliability by analyzing an effect on a risk associated with a potential vehicle accident of the autonomous or semi-autonomous vehicle technology, wherein the accident risk factor is determined based upon an ability of a version of artificial intelligence of the autonomous or semi-autonomous vehicle technology to avoid collisions without human interaction; \ndetermining, by one or more processors, one or more vehicle insurance policy premiums for one or more vehicles based at least in part upon the determined accident risk factor; and \ncausing, by one or more processors, information regarding the one or more vehicle insurance policies to 
be presented to one or more customers for review. \n | 2. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology includes at least one of a fully autonomous vehicle feature or a limited human driver control feature.\n | 3. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology performs at least one of the following functions: \nsteering; \naccelerating; \nbraking; \nmonitoring blind spots; \npresenting a collision warning; \nadaptive cruise control; or \nparking. \n | 4. The computer-implemented method of claim 1, wherein the autonomous or semi-autonomous vehicle technology is related to at least one of the following: \ndriver alertness monitoring; \ndriver responsiveness monitoring; \npedestrian detection; \nartificial intelligence; \na back-up system; \na navigation system; \na positioning system; \na security system; \nan anti-hacking measure; \na theft prevention system; or \nremote vehicle location determination. \n | 5. The computer-implemented method of claim 1, further comprising receiving, at one or more processors, an accident-related factor, wherein: \nthe accident risk factor is further determined based in part upon the received accident-related factor, and \nthe accident-related factor is related to at least one of the following: \na point of impact; \na type of road; \na time of day; \na weather condition; \na type of a trip; \na length of a trip; \na vehicle style; \na vehicle-to-vehicle communication; or \na vehicle-to-infrastructure communication. \n | 6. 
The computer-implemented method of claim 1, wherein the accident risk factor is further determined for the autonomous or semi-autonomous vehicle technology based upon at least one of the following: (1) a type of the autonomous or semi-autonomous vehicle technology, (2) a version of computer instructions of the autonomous or semi-autonomous vehicle technology, (3) an update to computer instructions of the autonomous or semi-autonomous vehicle technology, or (4) an update to the artificial intelligence associated with the autonomous or semi-autonomous vehicle technology.\n | 7. The computer-implemented method of claim 1, wherein the method further includes determining at least one of a discount, a refund, or a reward associated with the one or more vehicle insurance policies based upon the accident risk factor determined for the autonomous or semi-autonomous vehicle technology.\n | 8. The computer-implemented method of claim 1, wherein the received information further includes at least one of a database or a model of accident risk assessment based upon information regarding past vehicle accident information.\n | 9. The computer-implemented method of claim 1, wherein causing information regarding the one or more vehicle insurance policies to be presented to the one or more customers for review includes communicating to each customer an insurance premium for automobile insurance coverage.\n | 10. 
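The link in claims 1 and 7 between the determined accident risk factor and a premium, discount, refund, or reward could be illustrated with a toy pricing function; the base rate, linear scaling, and discount cap are illustrative assumptions only, not the patent's pricing model:

```python
def premium(base_rate: float, accident_risk_factor: float,
            discount_cap: float = 0.4) -> float:
    """Scale a base premium by an accident risk factor in [0, 1].

    A lower risk factor (i.e. the technology avoids collisions more
    reliably without human interaction) earns a larger discount,
    capped at discount_cap of the base rate.
    """
    if not 0.0 <= accident_risk_factor <= 1.0:
        raise ValueError("risk factor must lie in [0, 1]")
    discount = discount_cap * (1.0 - accident_risk_factor)
    return round(base_rate * (1.0 - discount), 2)
```

With this shape, a technology whose virtual test results yield a low risk factor directly lowers the quoted premium presented to the customer.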
A computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology, comprising: \none or more processors; \none or more communication modules adapted to communicate data; \none or more computing systems configured to evaluate the autonomous or semi-autonomous vehicle technology operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous vehicle technologies to generate test results for the autonomous or semi-autonomous vehicle technology, wherein the computing systems generate the test results as hardware or software responses of the autonomous or semi-autonomous vehicle technology to virtual test sensor data that simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment, and wherein the test results are communicated to the one or more processors via the one or more communication modules; and \na program memory coupled to the one or more processors and storing executable instructions that when executed by the one or more processors cause the computer system to: \nreceive information regarding the test results; \ndetermine an indication of reliability of the autonomous or semi-autonomous vehicle technology based upon the test results, including compatibility of the autonomous or semi-autonomous vehicle technology with the at least one additional autonomous or semi-autonomous vehicle technologies tested; \ndetermine an accident risk factor based upon the received information regarding the test results and the indication of reliability by analyzing an effect on a risk associated with a potential vehicle accident of the autonomous or semi-autonomous vehicle technology, wherein the accident risk factor is determined based upon an ability of a version of artificial intelligence of the autonomous or semi-autonomous vehicle technology to avoid collisions without human interaction; \ndetermine one 
or more vehicle insurance policy premiums for one or more vehicles based at least in part upon the determined accident risk factor; \nand cause information regarding the one or more vehicle insurance policies to be presented to one or more customers for review. \n | 11. The computer system of claim 10, wherein the accident risk factor is further determined for the autonomous or semi-autonomous vehicle technology based upon at least one of the following: (1) a type of the autonomous or semi-autonomous vehicle technology, (2) a version of computer instructions of the autonomous or semi-autonomous vehicle technology, (3) an update to computer instructions of the autonomous or semi-autonomous vehicle technology, or (4) an update to the artificial intelligence associated with the autonomous or semi-autonomous vehicle technology.\n | 12. The computer system of claim 10, wherein the received information further includes at least one of a database or a model of accident risk assessment based upon information regarding past vehicle accident information.\n | 13. The computer system of claim 10, wherein the autonomous or semi-autonomous vehicle technology includes at least one of a fully autonomous vehicle feature or a limited human driver control feature.\n | 14. The computer system of claim 10, wherein the executable instructions that cause the computer system to cause information regarding the one or more vehicle insurance policies to be presented to the one or more customers for review include instructions that cause the computer system to communicate to each customer an insurance premium for automobile insurance coverage.\n | 15. 
A computer-implemented method of evaluating effectiveness of an autonomous or semi-autonomous driving package of computer instructions, the method comprising: \ngenerating, by one or more computing systems configured to evaluate the autonomous or semi-autonomous driving package operating within a virtual test environment configured to simultaneously test at least one additional autonomous or semi-autonomous driving packages of computer instructions, test results for the autonomous or semi-autonomous driving package of computer instructions in the virtual test environment, wherein the computing systems generate the test results as responses of the computer instructions implemented within the virtual test environment to virtual test sensor data that simulates sensor data for operating conditions associated with a plurality of test scenarios within the virtual test environment; \ndetermining, by one or more processors, an indication of reliability of the autonomous or semi-autonomous driving package based upon the test results, including compatibility of the autonomous or semi-autonomous driving package with the at least one additional autonomous or semi-autonomous driving packages tested; \nanalyzing, by one or more processors, loss experience associated with the computer instructions to determine effectiveness in actual driving situations; \ndetermining, by one or more processors, a relative accident risk factor for artificial intelligence of the computer instructions based upon the ability of the computer instructions to make automated or semi-automated driving decisions for a vehicle that avoid collisions using the test results, the indication of reliability, and analysis of loss experience; \ndetermining, by one or more processors, one or more vehicle insurance policy premiums for one or more vehicles based at least in part upon the relative risk factor assigned to the artificial intelligence of the autonomous or semi-autonomous driving package of computer 
instructions; and \ncausing, by one or more processors, information regarding the one or more vehicle insurance policies to be presented to one or more customers for review. \n | 16. The computer-implemented method of claim 15, wherein the autonomous or semi-autonomous driving package of computer instructions are stored on a non-transitory computer readable medium and direct autonomous or semi-autonomous vehicle functionality related to at least one of the following functions: \nsteering; \naccelerating; \nbraking; \nmonitoring blind spots; \npresenting a collision warning; \nadaptive cruise control; or \nparking. \n | 17. The computer-implemented method of claim 15, wherein the autonomous or semi-autonomous driving package of computer instructions are stored on a non-transitory computer readable medium and direct autonomous or semi-autonomous vehicle functionality related to at least one of the following: \ndriver alertness monitoring; \ndriver responsiveness monitoring; \npedestrian detection; \nartificial intelligence; \na back-up system; \na navigation system; \na positioning system; \na security system; \nan anti-hacking measure; \na theft prevention system; or \nremote vehicle location determination. \n | 18. The computer-implemented method of claim 15, wherein the relative accident factor is based upon, at least in part, at least one accident-related factor, including: \na point of impact; \na type of road; \na time of day; \na weather condition; \na type of a trip; \na length of a trip; \na vehicle style; \na vehicle-to-vehicle communication; or \na vehicle-to-infrastructure communication. \n | 19. The computer-implemented method of claim 15, the method further comprising adjusting at least one of an insurance premium, a discount, a refund, or a reward associated with the one or more vehicle insurance policies based upon the relative accident risk factor.\n | 20. 
The computer-implemented method of claim 15, wherein causing information regarding the one or more vehicle insurance policies to be presented to the one or more customers for review by the one or more customers includes communicating to each customer a cost of automobile insurance coverage.", "output": "The method involves generating test results for autonomous or semi-autonomous vehicle technology. The information regarding test results is received (1004). An indication of reliability of autonomous or semi-autonomous vehicle technology is determined (1010). An accident risk factor is determined (1012) based upon received information regarding test results and indication of reliability. The vehicle insurance policy premiums for vehicles are determined (1014). The information regarding vehicle insurance policies is presented to the customers for review. INDEPENDENT CLAIMS are included for the following: a computer system for evaluating effectiveness of an autonomous or semi-autonomous vehicle technology; and a computer-based method of evaluating effectiveness of an autonomous or semi-autonomous driving package of computer instructions. Computer-based method of evaluating effectiveness of autonomous or semi-autonomous vehicle technology. An automobile insurance premium is determined by evaluating how effectively the vehicle is able to avoid and/or mitigate crashes and/or the extent to which the driver's control of the vehicle is enhanced or replaced by the vehicle's software and artificial intelligence. The autonomous vehicle operation features assist the vehicle operator to more safely or efficiently operate a vehicle or take full control of vehicle operation under some or all circumstances.
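The premium-determination step summarized above can be sketched as a short illustration. The function name, base premium, and linear scaling formula below are assumptions for illustration only; the patent does not specify how the accident risk factor maps to a premium.

```python
# Hypothetical sketch (names and formula assumed, not from the patent):
# map an accident risk factor determined for an autonomous or semi-autonomous
# driving package to an adjusted insurance premium.

BASE_PREMIUM = 1000.0  # assumed base annual premium


def adjusted_premium(base: float, accident_risk_factor: float) -> float:
    """Scale a base premium by a relative accident risk factor.

    A factor of 1.0 leaves the premium unchanged; a factor below 1.0
    (the technology avoids collisions better than baseline) yields a discount.
    """
    if accident_risk_factor < 0:
        raise ValueError("risk factor must be non-negative")
    return round(base * accident_risk_factor, 2)


# A technology whose AI avoids collisions 20% better than baseline:
print(adjusted_premium(BASE_PREMIUM, 0.8))  # 800.0
```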
The autonomous or semi-autonomous vehicle technology and/or the autonomous or semi-autonomous driving package of computer instructions can perform functions such as steering, accelerating, braking, monitoring blind spots, presenting a collision warning, adaptive cruise control, and/or parking, and can relate to driver alertness monitoring, driver responsiveness monitoring, pedestrian detection, artificial intelligence, a back-up system, a navigation system, a positioning system, a security system, an anti-hacking measure, a theft prevention system, and/or remote vehicle location determination. The drawing shows a flow diagram depicting an autonomous vehicle insurance pricing method for determining risk and premiums for vehicle insurance policies covering autonomous vehicles with autonomous communication features. 1004 Step for receiving information regarding test results; 1006 Step for determining risk levels associated with autonomous operation; 1010 Step for determining indication of reliability of autonomous or semi-autonomous vehicle technology; 1012 Step for determining accident risk factor; 1014 Step for determining vehicle insurance policy premiums", "instruction": "Please summarize the input" }, { "input": "Method and system for enhancing the functionality of a vehicleMethods and systems for enhancing the functionality of a semi-autonomous vehicle are described herein. The semi-autonomous vehicle may receive a communication from a fully autonomous vehicle within a threshold distance of the semi-autonomous vehicle. If the vehicles are travelling on the same route or the same portion of a route, the semi-autonomous vehicle may navigate to a location behind the fully autonomous vehicle. Then the semi-autonomous vehicle may operate autonomously by replicating one or more functions performed by the fully autonomous vehicle.
The functions and/or maneuvers performed by the fully autonomous vehicle may be detected via sensors in the semi-autonomous vehicle and/or may be identified by communicating with the fully autonomous vehicle to receive indications of upcoming maneuvers. In this manner, the semi-autonomous vehicle may act as a fully autonomous vehicle.What is claimed is: \n | 1. A computer-implemented method for enhancing the functionality of a vehicle, comprising: \nbroadcasting, via one or more processors and/or associated transceivers of a semiautonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication when the semi-autonomous vehicle is operating in a partially autonomous mode of operation with at least some of the control decisions being made by a vehicle operator; \nreceiving, at the one or more processors and/or associated transceivers of the semiautonomous vehicle via vehicle-to-vehicle communication, an indication directly from several autonomous vehicles that each autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle; \nselecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and \nfor a portion of the route, causing, by the one or more processors, the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the autonomous vehicle, such that the 
semi-autonomous vehicle is capable of operating without input from the vehicle operator along the same portion of the route. \n | 2. The computer-implemented method of claim 1, wherein the one or more processors periodically re-verify that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.\n | 3. The computer-implemented method of claim 1, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from the vehicle operator to operate.\n | 4. The computer-implemented method of claim 3, wherein the semi-autonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.\n | 5. The computer-implemented method of claim 1, wherein the semi-autonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.\n | 6. The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes: \nreceiving, at the one or more processors, an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and \ncausing, by the one or more processors, the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location. \n | 7. The computer-implemented method of claim 6, further comprising: \nreceiving, at the one or more processors, an indication of a speed at which the selected autonomous vehicle is travelling; and \ncausing, by the one or more processors, the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed. \n | 8. 
The computer-implemented method of claim 1, wherein causing the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle includes: \ndetecting, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and \ncausing, by the one or more processors, the semi-autonomous vehicle to perform a same maneuver as the detected maneuver. \n | 9. The computer-implemented method of claim 1, wherein the vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the autonomous vehicle; and \nwhen the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the method further includes causing, by the one or more processors, the semi-autonomous vehicle to operate without input from the vehicle operator. \n | 10. The computer-implemented method of claim 1, wherein selecting, at the one or more processors of the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semi-autonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' route, respectively.\n | 11. 
A computer system configured to enhance the functionality of a vehicle, the computer system comprising one or more local or remote processors, transceivers, and/or sensors configured to: \nbroadcast, via a semi-autonomous vehicle having one or more autonomous operation features, a request to follow a fully autonomous vehicle within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication when the semi-autonomous vehicle is operating in a partially autonomous mode of operation with at least some of the control decisions being made by a vehicle operator; \nreceive, at the semi-autonomous vehicle via vehicle-to-vehicle communication, an indication directly from several fully autonomous or fully operational autonomous vehicles that each fully autonomous or fully operational autonomous vehicle is within the predetermined communication range of the semi-autonomous vehicle, wherein each indication includes identification information for the autonomous vehicle for determining a safety rating of the autonomous vehicle; \nselect, at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semiautonomous vehicle based upon the safety rating of each of the several autonomous vehicles as determined according to the identification information for each autonomous vehicle; and \nfor a portion of the route, cause the semi-autonomous vehicle to follow the selected autonomous vehicle and mimic each maneuver performed by the selected autonomous vehicle, such that the semi-autonomous vehicle is capable of operating without input from the vehicle operator. \n | 12. 
The computer system of claim 11, wherein the semiautonomous vehicle periodically re-verifies that the semi-autonomous vehicle remains within a predetermined distance of the selected autonomous vehicle, and when a distance between the vehicles exceeds the predetermined threshold distance, the semi-autonomous vehicle maneuvers to the side of the road and parks.\n | 13. The computer system of claim 11, wherein at least one component in the semi-autonomous vehicle is malfunctioning, such that the semi-autonomous vehicle requires input from the vehicle operator to operate.\n | 14. The computer system of claim 13, wherein the semiautonomous vehicle is damaged in a vehicle collision and the selected autonomous vehicle is a tow service vehicle.\n | 15. The computer system of claim 11, wherein the semiautonomous vehicle includes fewer sensors for autonomous operation than the selected autonomous vehicle.\n | 16. The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to: \nreceive an indication of an upcoming maneuver to be performed by the selected autonomous vehicle and an indication of a time or location at which the upcoming maneuver will be performed; and \ncause the semi-autonomous vehicle to perform the upcoming maneuver at the indicated time or location. \n | 17. The computer system of claim 16, wherein one or more local or remote processors, transceivers, and/or sensors are further configured to: \nreceive an indication of a speed at which the selected autonomous vehicle is travelling; and \ncause the semi-autonomous vehicle to travel slower than the selected autonomous vehicle based upon the received speed. \n | 18. 
The computer system of claim 11, wherein to cause the semi-autonomous vehicle to mimic each maneuver performed by the selected autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to: \ndetect, via one or more sensors within the semi-autonomous vehicle, a maneuver performed by the selected autonomous vehicle; and \ncause the semi-autonomous vehicle to perform a same maneuver as the detected maneuver. \n | 19. The computer system of claim 11, wherein the vehicle operator for the semi-autonomous vehicle provides input to the semi-autonomous vehicle to direct the semi-autonomous vehicle to a location behind the selected autonomous vehicle; and \nwhen the semi-autonomous vehicle detects the selected autonomous vehicle in front of the semi-autonomous vehicle, the one or more local or remote processors, transceivers, and/or sensors are configured to cause the semi-autonomous vehicle to operate without input from the vehicle operator. \n | 20. The computer system of claim 11, wherein selecting at the semi-autonomous vehicle, an autonomous vehicle from among the several autonomous vehicles within the predetermined communication range of the semiautonomous vehicle is based upon a comparison of the current route of the semi-autonomous vehicle with each of the several autonomous vehicles' route, respectively.", "output": "The method involves broadcasting, via a processor and/or associated transceivers of a semi-autonomous vehicle, a request to follow a fully autonomous vehicle (502) within a predetermined communication range of the semi-autonomous vehicle via vehicle-to-vehicle wireless communication. An indication is received directly from a number of autonomous vehicles that each autonomous vehicle is within the predetermined communication range (504). 
An autonomous vehicle is selected from among the autonomous vehicles within the predetermined communication range based upon a safety rating of each of the autonomous vehicles as determined according to identification information for each autonomous vehicle. The semi-autonomous vehicle is caused to follow the selected autonomous vehicle (510) and mimic each maneuver performed by the autonomous vehicle (512), such that the semi-autonomous vehicle is capable of operating without input from the vehicle operator along the same portion of the route. An INDEPENDENT CLAIM is also included for a computer system configured to enhance the functionality of a vehicle. Method for enhancing functionality of semi-autonomous vehicle. The method enables the fully autonomous vehicle to act as a guide to ensure the semi-autonomous vehicle is safe to make a particular maneuver, when the semi-autonomous vehicle does not have the sensor capabilities to detect and/or monitor all of its surroundings. The drawing shows the flow diagram of an autonomous vehicle caravan method for causing a semi-autonomous vehicle to follow a fully autonomous vehicle. 502 Broadcast request to follow fully autonomous vehicle; 504 Receive communication from autonomous vehicle within predetermined communication range; 506 Compare route for fully autonomous vehicle to route for semi-autonomous vehicle; 510 Cause semi-autonomous vehicle to follow selected autonomous vehicle; 512 Mimic each maneuver performed by autonomous vehicle", "instruction": "Please summarize the input" }, { "input": "Vehicular traffic alerts for avoidance of abnormal traffic conditionsMethods and systems are described for generating a vehicle-to-vehicle traffic alert and updating a vehicle-usage profile. Various aspects include detecting, via one or more processors associated with a first vehicle, that an abnormal traffic condition exists in an operating environment of the first vehicle.
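The guide-vehicle selection recited in the preceding claims (choosing, from among the autonomous vehicles that responded within communication range, the one with the highest safety rating on a shared route portion) can be sketched as follows. The data model and function names are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch of the selection step: among responding autonomous
# vehicles, keep those sharing the route portion and pick the highest-rated.

from dataclasses import dataclass
from typing import Iterable, Optional


@dataclass
class AVIndication:
    vehicle_id: str
    safety_rating: float      # assumed to be derived from identification info
    shares_route_portion: bool


def select_guide_vehicle(indications: Iterable[AVIndication]) -> Optional[str]:
    """Return the id of the highest-rated vehicle sharing the route, or None."""
    candidates = [i for i in indications if i.shares_route_portion]
    if not candidates:
        return None
    return max(candidates, key=lambda i: i.safety_rating).vehicle_id


fleet = [
    AVIndication("AV-1", 4.2, True),
    AVIndication("AV-2", 4.9, False),  # best rating, but a different route
    AVIndication("AV-3", 4.6, True),
]
print(select_guide_vehicle(fleet))  # AV-3
```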
An electronic message is generated and transmitted wirelessly, via a vehicle-mounted transceiver associated with the first vehicle, to alert a nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The first vehicle receives telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, and transmits the telematics data to a remote server for updating a vehicle-usage profile associated with the nearby vehicle.What is claimed is: \n | 1. A computer-implemented method of generating a vehicle traffic alert and updating a vehicle-usage profile, the method comprising: \ndetecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a first vehicle; \ngenerating, via the one or more processors, an electronic message regarding the abnormal traffic condition; \ntransmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; \nreceiving, via the one or more processors, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and \nupdating, via the one or more processors, a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle. \n | 2. 
The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.\n | 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 4. The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 5. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.\n | 6. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 7. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the first vehicle.\n | 8. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component: \nanalyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and \nperforms an action based upon the type of anomalous condition in order to modify the anomalous condition. \n | 9. 
The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.\n | 10. A computer system configured to generate a vehicle traffic alert and update a vehicle-usage profile, the computer system comprising one or more processors, the one or more processors configured to: \ndetect that an abnormal traffic condition exists in an operating environment of a first vehicle; \ngenerate an electronic message regarding the abnormal traffic condition; \ntransmit the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; \nreceive telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message; and \nupdate a vehicle-usage profile associated with the nearby vehicle based upon the received telematics data regarding operation of the nearby vehicle, wherein updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle. \n | 11. The computer system of claim 10, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the first vehicle.\n | 12. The computer system of claim 10, wherein the abnormal traffic condition is bad weather, and the electronic message indicates a GPS location of the bad weather.\n | 13. 
The computer system of claim 10, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.\n | 14. The computer system of claim 10, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.\n | 15. The computer system of claim 10, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 16. The computer system of claim 10, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.\n | 17. The computer system of claim 10, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 18. The computer system of claim 10, wherein the nearby vehicle travels to the operating environment of the first vehicle.\n | 19. 
A computer-implemented method of generating a vehicle traffic alert and updating a vehicle-usage profile, the method comprising: \ndetecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a first vehicle; \ngenerating, via the one or more processors, an electronic message regarding the abnormal traffic condition; \ntransmitting, via the one or more processors, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; and \nreceiving, via the one or more processors, telematics data regarding operation of the nearby vehicle after the nearby vehicle received the electronic message, \nwherein the electronic message contains location information of the abnormal traffic condition, and the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.", "output": "The method involves detecting that an abnormal traffic condition exists in an operating environment of a first vehicle through multiple processors. An electronic message regarding the abnormal traffic condition is generated. The electronic message is transmitted to a nearby vehicle, where the electronic message is transmitted through wireless communication to alert a nearby vehicle (108) of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. The telematics data regarding operation of the nearby vehicle is received after the nearby vehicle received the electronic message. 
A vehicle-usage profile associated with the nearby vehicle is updated based upon the received telematics data regarding operation of the nearby vehicle, where updating the vehicle-usage profile causes an insurance premium adjustment to an insurance policy associated with an operator of the nearby vehicle. An INDEPENDENT CLAIM is included for a computer system for generating a vehicle traffic alert and updating a vehicle-usage profile. Method for generating a vehicle traffic alert and updating a usage profile of a vehicle such as a slow-moving vehicle, e.g. farm machinery, construction equipment or an oversized load vehicle, or an emergency vehicle, e.g. an ambulance, fire truck or police vehicle, equipped to transmit an electronic message indicating its presence to a nearby vehicle. The communication unit is configured to conditionally send data, which is particularly advantageous when the computing device is implemented as a mobile computing device, as such conditions help to reduce power usage and prolong battery life. The second computing device ignores the telematics data, thus saving processing power and battery life. The external computing device updates the earlier profile based upon new telematics data, which updates occur periodically or upon occurrence of an event. The drawing shows a block diagram of the telematics collection system. 100Telematics collection system106External computing device108Nearby vehicle110Computing device120Tactile alert system", "instruction": "Please summarize the input" }, { "input": "Vehicular traffic alerts for avoidance of abnormal traffic conditionsMethods and systems are described for generating a vehicle-to-vehicle traffic alert. Various aspects may include detecting that an abnormal traffic condition exists in an operating environment of a vehicle and generating a related electronic message. 
The electronic message may be transmitted via the vehicle's transceiver using a wireless communication to a nearby vehicle to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition.What is claimed is: \n | 1. A computer-implemented method of generating a vehicle-to-vehicle traffic alert, the method comprising: \ndetecting, via one or more processors, that an abnormal traffic condition exists in an operating environment of a vehicle; \ngenerating, via the one or more processors, an electronic message regarding the abnormal traffic condition; \ntransmitting, via a vehicle-mounted transceiver associated with the vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; and \nupdating a risk aversion profile associated with a vehicle operator of the nearby vehicle based upon the electronic message, wherein the risk aversion profile is associated with a travel environment for the nearby vehicle, the travel environment including at least an environment where the nearby vehicle has traveled two or more times. \n | 2. The computer-implemented method of claim 1, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the vehicle.\n | 3. The computer-implemented method of claim 1, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 4. The computer-implemented method of claim 1, the method further comprising generating an auto insurance discount associated with the vehicle.\n | 5. 
The computer-implemented method of claim 1, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 6. The computer-implemented method of claim 1, wherein the transmitting the electronic message to the nearby vehicle requires transmitting the electronic message to one or more remote processors.\n | 7. The computer-implemented method of claim 1, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 8. The computer-implemented method of claim 1, wherein the nearby vehicle travels to the operating environment of the vehicle.\n | 9. The computer-implemented method of claim 1, the method further comprising transmitting the electronic message to a smart infrastructure component, wherein the smart infrastructure component: \nanalyzes the electronic message to determine a type of anomalous condition for the abnormal traffic condition; and \nperforms an action based on the type of anomalous condition in order to modify the anomalous condition. \n | 10. The computer-implemented method of claim 1, wherein the electronic message contains location information of the abnormal traffic condition, and wherein the nearby vehicle ignores the electronic message when the location information indicates that the abnormal traffic condition is beyond a threshold distance from the nearby vehicle.\n | 11. 
A computer system configured to generate a vehicle-to-vehicle traffic alert, the computer system comprising one or more processors, the one or more processors configured to: \ndetect that an abnormal traffic condition exists in an operating environment of a vehicle; \ngenerate an electronic message regarding the abnormal traffic condition; \ntransmit, via a vehicle-mounted transceiver associated with the vehicle, the electronic message to a nearby vehicle, wherein the electronic message is transmitted via wireless communication to alert the nearby vehicle of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition; and \nupdate a risk aversion profile associated with a vehicle operator of the nearby vehicle based upon the electronic message, wherein the risk aversion profile is associated with a travel environment for the nearby vehicle, the travel environment including at least an environment where the nearby vehicle has traveled two or more times. \n | 12. The computer system of claim 11, wherein the abnormal traffic condition is one or more of the following: an erratic vehicle, an erratic driver, road construction, a closed highway exit, slowed or slowing traffic, slowed or slowing vehicular congestion, or one or more other vehicles braking ahead of the vehicle.\n | 13. The computer system of claim 11, wherein the abnormal traffic condition is bad weather and the electronic message indicates a GPS location of the bad weather.\n | 14. The computer system of claim 11, the system further configured to generate an alternate route for the nearby vehicle to take to avoid the abnormal traffic condition.\n | 15. The computer system of claim 11, the system further configured to generate an auto insurance discount associated with the vehicle.\n | 16. The computer system of claim 11, wherein the one or more processors include one or more of the following: vehicle-mounted sensors or vehicle-mounted processors.\n | 17. 
The computer system of claim 11, wherein the nearby vehicle comprises one or more of the following: an autonomous vehicle, a semi-autonomous vehicle or a self-driving vehicle, and wherein the nearby vehicle includes one or more processors for receiving the transmitted electronic message.\n | 18. The computer system of claim 11, wherein the transmission of the electronic message to the nearby vehicle requires transmission of the electronic message to one or more remote processors.\n | 19. The computer system of claim 11, wherein the abnormal traffic condition is detected by analyzing vehicular telematics data.\n | 20. The computer system of claim 11, wherein the nearby vehicle travels to the operating environment of the vehicle.", "output": "The method involves transmitting the electronic message through the wireless communication to alert a nearby vehicle (108) of the abnormal traffic condition and to allow the nearby vehicle to avoid the abnormal traffic condition. A risk aversion profile associated with a vehicle operator (106) of the nearby vehicle is updated based upon the electronic message. The risk aversion profile is associated with a travel environment for the nearby vehicle, the travel environment including an environment where the nearby vehicle has traveled several times. An INDEPENDENT CLAIM is included for a computer system configured to generate a vehicle-to-vehicle traffic alert. Computer-based method for generating a vehicle-to-vehicle traffic alert. Insurance policies, such as vehicle or life insurance policies, can be adjusted, generated, and/or updated based upon an individual's usage and/or acceptance of travel recommendations, such as travel recommendations that reduce or lower risk and/or enhance driver or vehicle safety. Risk can be reduced, for example by reducing road rage through the reporting of negative driving behavior. 
Risk-averse customers can receive insurance discounts or other insurance cost savings based upon data that reflects low-risk driving behavior and/or technology that mitigates or prevents risk to insured assets, such as vehicles or even homes, and/or vehicle operators or passengers. The drawing shows a block diagram of the telematics collection system. 106Vehicle operator108Vehicle114Board computer116Link122Speaker", "instruction": "Please summarize the input" }, { "input": "Shared control for vehicles travelling in formationMethods and apparatus for controlling two or more vehicles travelling in formation. Selected vehicles may be fully or partially autonomously controlled; at least one vehicle is partially controlled by a human driver. Information is collected at each vehicle and from the drivers and it is shared with other vehicles and drivers to create a shared world model. Aspects of the shared world model may be presented to the human driver, who may then respond with a control input. Autonomy systems and the drivers on the vehicles then collaborate to make a collective decision to act or not to act and execute any such action in a coordinated manner.The invention claimed is: \n | 1. 
A method for collaborative control of a platoon of vehicles wherein a first vehicle is at least partially controllable by a human driver and a second vehicle is at least partially controllable by autonomy logic, comprising: \ncollecting information from human driver inputs on the first vehicle; \ncollecting information from sensors on both the first vehicle and the second vehicle; \nsharing information thus collected between the first vehicle and the second vehicle to provide shared information; \neach vehicle using the shared information to maintain a respective local copy of a shared world model, wherein maintaining the respective local copy of the shared world model comprises: \nmaintaining a local model, including by processing, by the first vehicle, at least a subset of the collected information from sensors on both the first vehicle and the second vehicle to derive perception information and situation information, wherein the derived perception information comprises one or more attributes of one or more sensed objects and wherein the derived situation information comprises information about one or more sensed events, \nafter deriving the derived perception information and the derived situation information, inputting the derived perception information and the derived situation information to a first local copy of the shared world model at the first vehicle, \nreceiving, by the second vehicle, the derived perception information and the derived situation information added to the first local copy of the shared world model, and \ninputting the derived perception information and the derived situation information to a second local copy of the shared world model at the second vehicle; \ncollaboratively determining, by both the first and the second vehicle in direct vehicle to vehicle communication, to perform a proposed action based on the first and second local copies of the shared world model; and \nin accordance with determining, by both the first and the second 
vehicle, to perform the proposed action, performing, by either or both the first and the second vehicle, the proposed action. \n | 2. The method of claim 1 wherein the proposed action is to update a state of the world model.\n | 3. The method of claim 1 wherein the proposed action is proposed by the autonomy logic and the proposed action may be vetoed by the human driver.\n | 4. The method of claim 1 wherein the proposed action is proposed by the human driver and the proposed action may be vetoed by the autonomy logic.\n | 5. The method of claim 1 wherein the proposed action includes either the first or second vehicle joining or leaving the platoon.\n | 6. The method of claim 5 wherein the proposed action includes a third vehicle which is at least partially controlled by a human joining the platoon behind the first vehicle, and then the third vehicle entering an autonomous driving mode.\n | 7. The method of claim 5 wherein the proposed action includes the first vehicle leaving an autonomous driving mode and entering a human-controlled mode and exiting the platoon.\n | 8. The method of claim 1 wherein the proposed action includes swapping roles of a leader vehicle and a follower vehicle in the platoon.\n | 9. The method of claim 1 wherein the proposed action includes \neither the first or second vehicle changing lanes; or \neither the first or second vehicle entering or leaving a travel lane; or \neither the first or second vehicle increasing or decreasing speed or distance to another vehicle; or \neither the first or second vehicle maneuvering to park next to another vehicle. \n | 10. The method of claim 1 wherein the human driver inputs include information conveyed visually, via audio, or physically such as by forces on a joystick, or a steering device, or other input device.\n | 11. 
The method of claim 1 wherein determining whether to perform the proposed action includes propagating, between the first vehicle and the second vehicle, constraints imposed on either the first vehicle or the second vehicle.\n | 12. The method of claim 11 wherein the constraints include the autonomy logic discouraging but not preventing the human from making a steering decision.\n | 13. The method of claim 1 wherein the shared information includes \ndata originating outside components of the autonomy logic or human control and derived data; \ndata originating inside the autonomy logic or human control; and/or \nphysical phenomena that is capable of being sensed by the human driver. \n | 14. The method of claim 1, further comprising: \ndisplaying at least a selected portion of the shared information on a display associated with the first vehicle.", "output": "The method involves collecting information from human driver inputs on the first vehicle. Information is collected from sensors (112) on both the first vehicle and the second vehicle. The collected information is shared between the first vehicle and the second vehicle to provide shared information. Each vehicle is enabled to collaboratively engage in a decision (124) with the other vehicle as a unit using a world model (126). A proposed action is proposed by the autonomy logic (122), where the proposed action includes either the first or second vehicle joining or leaving the platoon. An INDEPENDENT CLAIM is included for an interface for enabling collaborative control of a platoon of vehicles. Method for realizing collaborative control of a platoon of vehicles such as commercial vehicles, e.g. long-haul trucks. Can also be used in autonomous vehicles. The method enables the autonomy logic on one or both vehicles to collaborate with each other and with the human driver, improving operation of the platoon. 
The method allows a human to function as a resource for the robot, providing assistance with cognition and perception during task execution and enabling the human to compensate for inadequacies of the autonomy. The drawing shows a schematic diagram of an autonomous and a human-driven truck.112Sensors 114Actuators 122Autonomy 124Decision 126World model ", "instruction": "Please summarize the input" }, { "input": "Systems and methods for simulating GNSS multipath and obscuration with networked autonomous vehiclesThe disclosed technology teaches testing an autonomous vehicle: shielding a GNSS receiving antenna of the vehicle from ambient GNSS signals while the vehicle is under test and supplanting the ambient GNSS signals with simulated GNSS signals. Testing includes using a GNSS signal generating system: receiving the ambient GNSS signals using an antenna of the system and determining a location and acceleration of the vehicle from the GNSS signals, accessing a model of an augmented environment that includes multi-pathing and obscuration of the GNSS signals along a test path, based on the determined location—generating the simulated GNSS signals to feed to the vehicle, in real time—simulating at least one constellation of GNSS satellite sources modified according to the augmented environment, based on the determined location, and feeding the simulated signals to a receiver in the vehicle, thereby supplanting ambient GNSS as the autonomous vehicle travels along the test path.We claim as follows: \n | 1. 
A method of testing an autonomous vehicle, including \nshielding a Global Navigation Satellite System (abbreviated GNSS) receiving antenna of the autonomous vehicle from ambient GNSS signals while the autonomous vehicle is under test and supplanting the ambient GNSS signals with simulated GNSS signals; \nusing a GNSS signal generating system, \nreceiving the ambient GNSS signals using an antenna of the GNSS signal generating system and determining a location and acceleration of the autonomous vehicle from the ambient GNSS signals; \naccessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient GNSS signals along a test path, based on the location determined from the GNSS signals; \ngenerating the simulated GNSS signals to feed to the autonomous vehicle, in real time, simulating at least one constellation of GNSS satellite sources modified according to the augmented environment, based on the location determined from the GNSS signals; and \nfeeding the simulated GNSS signals to a receiver in the autonomous vehicle, thereby supplanting ambient GNSS as the autonomous vehicle travels along the test path. \n | 2. The method of claim 1, further including spoofing by substituting pirate signals for ambient GNSS as the autonomous vehicle travels along the test path.\n | 3. The method of claim 1, further including wireless and conductive feeds of the simulated GNSS signals.\n | 4. The method of claim 1, further including using a Faraday cage to shield intent of the autonomous vehicle.\n | 5. The method of claim 1, further including coupling the received ambient GNSS signals with inertial measurements unit (abbreviated IMU) input to determine the position of the vehicle in real time with reduced latency.\n | 6. The method of claim 1, further including operating the vehicle on a track and simulating buildings.\n | 7. 
The method of claim 1, further including operating the vehicle in an urban environment and combining impaired GNSS signals with object sensors (visual, LIDAR, SONAR, RADAR) used by the vehicle for navigation.\n | 8. The method of claim 1, further including operating the vehicle in an urban environment and combining impaired GNSS signals with vehicle to vehicle (abbreviated V2V) and vehicle to infrastructure (abbreviated V2I) communications used by the vehicle for navigation.\n | 9. A method of testing a connected vehicle that is connected to other vehicles and/or infrastructure, including: \nshielding a cellular receiving antenna of the connected vehicle from ambient cellular signals while the connected vehicle is under test and supplanting the ambient cellular signals with simulated cellular signals; \nusing a cellular signal generating system, \nreceiving the ambient cellular signals and ambient Global Navigation Satellite System (abbreviated GNSS) signals using at least one antenna of the cellular signal generating system and determining a location and acceleration of the connected vehicle from the ambient GNSS signals; \naccessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient cellular signals along a test path, based on the location determined from the cellular signals; \ngenerating the simulated cellular signals to feed to the connected vehicle, in real time, simulating with at least one vehicle and/or infrastructure source modified according to the augmented environment, based on the location determined from the cellular signals; and \nfeeding the simulated cellular signals to a receiver in the connected vehicle, thereby supplanting ambient cellular as the connected vehicle travels along the test path. \n | 10. 
The method of claim 9, wherein the ambient signals include at least one of GNSS, Wi-Fi, 5G and LTE signals that can be manipulated and impaired to test situational awareness of the vehicle in fully controlled and challenging RF environments.\n | 11. A tangible non-transitory computer readable storage media impressed with computer program instructions that, when executed, test an autonomous vehicle, including \nshielding a Global Navigation Satellite System (abbreviated GNSS) receiving antenna of the autonomous vehicle from ambient GNSS signals while the autonomous vehicle is under test and supplanting the ambient GNSS signals with simulated GNSS signals; \nusing a GNSS signal generating system, \nreceiving the ambient GNSS signals using an antenna of the GNSS signal generating system and determining a location and acceleration of the autonomous vehicle from the ambient GNSS signals; \naccessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient GNSS signals along a test path, based on the location determined from the GNSS signals; \ngenerating the simulated GNSS signals to feed to the autonomous vehicle, in real time, simulating at least one constellation of GNSS satellite sources modified according to the augmented environment, based on the location determined from the GNSS signals; and \nfeeding the simulated GNSS signals to a receiver in the autonomous vehicle, thereby supplanting ambient GNSS as the autonomous vehicle travels along the test path. \n | 12. The tangible non-transitory computer readable storage media of claim 11, further including spoofing by substituting pirate signals for ambient GNSS as the autonomous vehicle travels along the test path.\n | 13. The tangible non-transitory computer readable storage media of claim 11, further including wireless and conductive feeds of the simulated GNSS signals.\n | 14. 
The tangible non-transitory computer readable storage media of claim 11, further including using a Faraday cage to shield intent of the autonomous vehicle.\n | 15. The tangible non-transitory computer readable storage media of claim 11, further including coupling the received ambient GNSS signals with inertial measurements unit (abbreviated IMU) input to determine the position of the vehicle in real time with reduced latency.\n | 16. The tangible non-transitory computer readable storage media of claim 11, further including operating the vehicle on a track and simulating buildings.\n | 17. The tangible non-transitory computer readable storage media of claim 11, further including operating the vehicle in an urban environment and combining impaired GNSS signals with object sensors (visual, LIDAR, SONAR, RADAR) used by the vehicle for navigation.\n | 18. A system for testing autonomous vehicles includes one or more processors coupled to memory, the memory loaded with computer instructions, that when executed on the processors, implement the shielding, receiving, accessing, generating and feeding of claim 11.\n | 19. 
A tangible non-transitory computer readable storage media impressed with computer program instructions that, when executed, test a connected vehicle that is connected to other vehicles and/or infrastructure, including \nshielding a cellular receiving antenna of the connected vehicle from ambient cellular signals while the connected vehicle is under test and supplanting the ambient cellular signals with simulated cellular signals; \nusing a cellular signal generating system, \nreceiving the ambient cellular signals and ambient Global Navigation Satellite System (abbreviated GNSS) signals using at least one antenna of the cellular signal generating system and determining a location and acceleration of the connected vehicle from the ambient GNSS signals; \naccessing a model of an augmented environment that includes at least multi-pathing and obscuration of the ambient cellular signals along a test path, based on the location determined from the cellular signals; \ngenerating the simulated cellular signals to feed to the connected vehicle, in real time, simulating with at least one vehicle and/or infrastructure source modified according to the augmented environment, based on the location determined from the cellular signals; and \nfeeding the simulated cellular signals to a receiver in the connected vehicle, thereby supplanting ambient cellular as the connected vehicle travels along the test path. \n | 20. A system for testing a connected vehicle that is connected to other vehicles and/or infrastructure, includes one or more processors coupled to memory, the memory loaded with computer instructions, that when executed on the processors, implement the shielding, receiving, accessing, generating and feeding of claim 19.", "output": "The method involves shielding a GNSS receiving antenna of an autonomous vehicle from ambient GNSS signals while the autonomous vehicle is under test and the ambient GNSS signals are supplanted with simulated GNSS signals. 
A GNSS signal generating system is used. The ambient GNSS signals are received using an antenna of the GNSS signal generating system, and a location and acceleration of the autonomous vehicle are determined from the ambient GNSS signals. The simulated GNSS signals are generated to feed to the autonomous vehicle in real time, and a constellation of GNSS satellite sources, modified according to the augmented environment, is simulated based on the location determined from the GNSS signals. The simulated GNSS signals are fed to a receiver in the autonomous vehicle, thus supplanting ambient GNSS as the autonomous vehicle travels along the test path. INDEPENDENT CLAIMS are included for the following: (1) a method for testing a connected vehicle that is connected to other vehicles and infrastructure; (2) a tangible non-transitory computer readable storage medium storing a program for testing an autonomous vehicle; and (3) a system for testing an autonomous vehicle. Method for testing an autonomous vehicle. The cellular and GNSS testing is enhanced using an inertial measurement unit to improve the accuracy of location determination from GNSS signals, especially under jerk conditions. GNSS correction data is used and additional sensors are integrated into the onboard navigation system, to increase accuracy, availability and integrity. The track and the required environment are first modelled within the three-dimensional (3D) environment model simulation software and then used in real time to calculate the obscuration, multipath and other impairments from the scene. The drawing shows a flow chart of the method for testing an autonomous vehicle. 
400Method for testing autonomous vehicle 428Vehicle antenna 445GNSS simulator 455Three dimensional environment 465Three dimensional module ", "instruction": "Please summarize the input" }, { "input": "Information processing apparatus, information processing method, and mobile body apparatusProvided is an information processing apparatus that creates map information on the basis of sensor information obtained by an on-vehicle sensor. The information processing apparatus includes a creation section that creates a map of a surrounding area of a mobile body on the basis of sensor information acquired by one or more sensors mounted on the mobile body, a request section that issues an information request to an external apparatus on the basis of a state of the map created by the creation section, and a merge section that merges information acquired by the request section from the external apparatus with the created map. The request section issues an information request to the external apparatus on the basis of a condition of a dead angle included in the map created by the creation section.The invention claimed is: \n | 1. 
An information processing apparatus, comprising: \na creation section configured to create a map of a surrounding area of a first mobile body based on sensor information acquired by at least one sensor mounted on the first mobile body, wherein the map includes a first grid map indicating object existence probabilities in respective grids; \na request section configured to: \nissue a first information request to an external apparatus based on a state of the map; and \nacquire information from the external apparatus, wherein the acquired information is a second grid map; \na merge section configured to merge the first grid map with the second grid map; and \na control section configured to control driving of the first mobile body based on one of a merging result or the map created by the creation section, wherein the merging result is based on the merger of the first grid map with the second grid map. \n | 2. The information processing apparatus according to claim 1, wherein the issuance of the first information request to the external apparatus is based on a condition of a dead angle included in the map.\n | 3. The information processing apparatus according to claim 1, wherein the issuance of the first information request to the external apparatus is based on detection of a failure in the at least one sensor.\n | 4. The information processing apparatus according to claim 1, wherein, in a case where autonomous driving of the first mobile body based on the map is discontinued, the request section is further configured to issue a second information request to the external apparatus.\n | 5. The information processing apparatus according to claim 4, wherein, in a case where evacuation of the first mobile body to a safe place is impossible due to a dead angle included in the map created by the creation section, the request section is further configured to issue a third information request to the external apparatus.\n | 6. 
The information processing apparatus according to claim 1, wherein the request section is further configured to issue a second information request to the external apparatus based on a result of comparison of information regarding a current position of the first mobile body with map information.\n | 7. The information processing apparatus according to claim 6, wherein, in a case where the map information indicates that a plurality of dead angles from the current position of the first mobile body exists, the request section is further configured to issue a third information request to the external apparatus.\n | 8. The information processing apparatus according to claim 1, wherein the request section is further configured to issue a request to the external apparatus for one of map information to complement a dead angle included in the map or sensor information that is used to create a specific map to complement the dead angle.\n | 9. The information processing apparatus according to claim 1, wherein the request section is further configured to control issuance of a second information request to the external apparatus, based on the merging result.\n | 10. The information processing apparatus according to claim 9, wherein \nthe information acquired from the external apparatus is merged with the map created at the merge section, and \nthe request section is further configured to continue issuance of a request to the external apparatus until dead angles included in the map become equal to or less than a specific value, or stop the issuance of the request to the external apparatus when the dead angles included in the map become equal to or less than the specific value. \n | 11. The information processing apparatus according to claim 1, wherein the request section is further configured to issue a second information request to a second mobile body.\n | 12. 
The information processing apparatus according to claim 1, wherein \nthe first mobile body includes a first vehicle, and \nthe request section is further configured to issue a second information request to a second vehicle through vehicle-to-vehicle communication. \n | 13. The information processing apparatus according to claim 1, wherein \neach of the creation section, the request section, and the merge section is further configured to perform information processing on the map for each grid. \n | 14. An information processing method, comprising: \ncreating a map of a surrounding area of a mobile body based on sensor information acquired by at least one sensor mounted on the mobile body, wherein the map includes a first grid map indicating object existence probabilities in respective grids; \nissuing an information request to an external apparatus based on a state of the map; \nacquiring information from the external apparatus, wherein the acquired information is a second grid map; \nmerging the first grid map with the second grid map; and \ncontrolling driving of the mobile body based on one of a merging result or the created map, wherein the merging result is based on the merger of the first grid map with the second grid map. \n | 15. An information processing apparatus, comprising: \na creation section configured to create a map of a surrounding area of a first mobile body based on sensor information acquired by at least one sensor mounted on the first mobile body, wherein the map includes a grid map indicating object existence probabilities in respective grids; and \na providing section configured to provide at least partial information of the map created by the creation section, in response to a request from an external apparatus, wherein the external apparatus controls a second mobile body based on the at least partial information of the map created by the creation section. \n | 16. 
The information processing apparatus according to claim 15, wherein the providing section is further configured to: \nreceive the request together with position information of the external apparatus, and \nprovide information of the map to the external apparatus that exists within a specific range from current position information of the first mobile body. \n | 17. The information processing apparatus according to claim 15, wherein \nthe first mobile body includes a first vehicle, and \nthe providing section is further configured to provide information of the map to a second vehicle through vehicle-to-vehicle communication. \n | 18. An information processing method, comprising: \ncreating a map of a surrounding area of a first mobile body based on sensor information acquired by at least one sensor mounted on the first mobile body, wherein the map includes a grid map indicating object existence probabilities in respective grids; and \nproviding, by a request section, at least partial information of the created map, in response to a request from an external apparatus, wherein the external apparatus controls a second mobile body based on the at least partial information of the map created. \n | 19. 
A mobile body apparatus, comprising: \na mobile body comprising a mobile body main part; \nat least one sensor mounted on the mobile body main part; \na creation section configured to create a map of a surrounding area of the mobile body based on sensor information acquired by the at least one sensor, wherein the map includes a first grid map indicating object existence probabilities in respective grids; \na request section configured to: \nissue an information request to an external apparatus based on a state of the map; and \nacquire information from the external apparatus, wherein the acquired information is a second grid map; \na merge section configured to merge the first grid map with the second grid map; and \na control section configured to control driving of the mobile body main part based on one of a merging result or the map created by the creation section, wherein the merging result is based on the merger of the first grid map with the second grid map.", "output": "The information processing apparatus has a preparation unit which produces the map around the mobile object, such as a vehicle (200), based on the sensor information acquired by one or more sensors mounted in the mobile object. The request unit requests information of an external device based on the state of the map produced by the preparation unit. A synthetic unit synthesizes the information obtained from the external device by the request unit with the produced map. The request unit requests the information of the external device based on the condition of the blind spot contained in the map created by the preparation unit. INDEPENDENT CLAIMS are included for the following: the information processing method; and the mobile object apparatus. Information processing apparatus used during processing of sensor information from vehicle-mounted sensors mounted in a vehicle of a vehicle control system.
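The per-grid merge of the own vehicle's grid map with a grid map received from a surrounding vehicle, as described above, can be sketched as follows. This is a minimal illustration only, not the patented method: the independent-evidence fusion rule, the list-of-lists grid representation, and the use of 0.5 for "unknown" cells are all assumptions.

```python
def merge_grids(p_self, p_ext):
    """Fuse two occupancy-grid maps cell by cell.

    Each map is a list of rows holding object-existence
    probabilities in [0, 1]. Cells are combined with the
    independent-evidence fusion rule; 0.5 means "unknown"
    and leaves the other map's estimate unchanged.
    """
    merged = []
    for row_a, row_b in zip(p_self, p_ext):
        merged_row = []
        for a, b in zip(row_a, row_b):
            num = a * b
            den = num + (1.0 - a) * (1.0 - b)
            merged_row.append(num / den if den else 0.5)
        merged.append(merged_row)
    return merged

# A dead-angle cell (0.5, unknown) in the own map takes on the
# surrounding vehicle's estimate after the merge.
own = [[0.5, 0.9], [0.1, 0.5]]
ext = [[0.8, 0.9], [0.1, 0.5]]
print(merge_grids(own, ext))
```

Under this rule a cell that one map marks as unknown simply inherits the other map's probability, which mirrors the dead-angle-complementing behaviour the claims describe.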
Can also be used in processing of sensor information from sensors mounted in robots, ships, aircraft and unmanned aircraft such as drones, and in predetermined working spaces such as homes, offices and factories. The synthetic unit synthesizes the information obtained from the external device by the request unit with the produced map, making it possible to provide an information processing apparatus that complements, with information from an external device, the blind spot contained in the map information built from its own sensor information. The drawing shows an explanatory view illustrating the synthesis of the grid map of the own vehicle and the grid map of the surrounding vehicle. (Drawing includes non-English language text) 200 Vehicle; 201-204, 221, 222 Vehicle-mounted cameras; 210 Bicycle; 220 Surrounding vehicle; 700 Grid map", "instruction": "Please summarize the input" }, { "input": "DEVELOPMENT OF WIRELESS RESOURCE AND COMPUTATION OFFLOADING FOR ENHANCED ENERGY EFFICIENCY IN THE INTERNET OF VEHICLES (IOV)The advent of the Internet of Vehicles (IoV) has brought about a paradigm shift in the field of computing, leading to enhanced vehicle intelligence and improved computational services for applications that require high processing power and minimal latency. These applications include autonomous driving, vehicular virtual reality, and real-time traffic control. The Internet of Vehicles (IoV) holds significant potential for extensive development due to the rapid advancements in vehicle wireless connection technologies. Security applications are of paramount importance within the realm of the Internet of Vehicles (IoV) due to their direct impact on vehicle safety. The concept of Vehicle-to-Vehicle (V2V) communication has garnered significant academic interest within the field of intelligent transportation systems (ITS). This technology is recognised for its potential to fulfil the stringent latency and reliability criteria necessary for safety applications.
The Internet of Vehicles (IoV) is a nascent concept that is anticipated to play a crucial role in future mobile networks beyond the fifth and sixth generations. Nevertheless, the computational demands and stringent time limitations of Internet of Vehicles (IoV) applications provide a formidable obstacle for vehicle processing units. In order to achieve this objective, multi-access edge computing (MEC) has the potential to utilise the computing resources located at the periphery of the network in order to fulfil the high computational requirements. However, the allocation of computer resources in an optimal manner is a significant challenge due to the presence of multiple parameters, including the quantity of cars, the availability of resources, and the specific demands associated with each individual activity. This study examines a network comprising several vehicles linked to roadside units (RSUs) equipped with mobile edge computing (MEC) capabilities. We present a methodology that aims to minimise the overall energy consumption of the system by concurrently optimising the decision-making process for task offloading, power and bandwidth allocation, and task assignment to MEC-enabled RSUs. In order to address the inherent complexity of the original problem, we employ a strategy of decoupling it into smaller subproblems. To iteratively optimise these subproblems, we utilise the block coordinate descent approach. The numerical findings provide evidence that the suggested system is capable of significantly reducing overall energy usage across different quantities of cars and MEC nodes, all the while ensuring a minimal likelihood of service disruption.|1. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) provides ground work for future research.\n | 2. 
Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein said that the Internet of Things (IoT) refers to a network of physical items that possess the ability to interact, communicate, and exchange data with each other and the surrounding environment through a network, without requiring human interaction.\n | 3. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein said that, with the rapid expansion of the Internet of Things (IoT) across all domains, the computational requirements posed by growing automotive applications have presented a significant problem within the context of the Internet of Vehicles (IoVs).\n | 4. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein said that it is anticipated that forthcoming wireless networks will possess the capacity to deliver data and voice services to a substantial quantity of mobile devices (MDs), while also enabling the integration of computational and artificial intelligence (AI) functionalities within these MDs.\n | 5. Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein said that, in this paper, we analysed and discussed various aspects.\n | 6.
Development Of Wireless Resource And Computation Offloading For Enhanced Energy Efficiency In The Internet Of Vehicles (Iov) wherein said that, additionally, the emergence of Smart Internet of Vehicles (IoV) as a promising application within the realm of Internet of Things (IoT) can be attributed to the advancements in fifth generation mobile connectivity.", "output": "The method involves providing groundwork for future research, where an Internet of Things (IoT) refers to a network of physical items that possess an ability to interact, communicate, and exchange data with each other and a surrounding environment through a network without requiring human interaction. Data and voice services are delivered to a quantity of mobile devices, while enabling integration of computational and artificial intelligence (AI) functionalities within the mobile devices. Wireless resource and computation offloading method for the Internet of Vehicles (IoV). The method enables minimizing the overall energy consumption of the IoV by concurrently optimizing the decision-making process for task offloading, power and bandwidth allocation, and task assignment to mobile edge computing (MEC)-enabled roadside units (RSUs). The method enables efficient utilization of computing resources for facilitating resource sharing.", "instruction": "Please summarize the input" }, { "input": "A PROCESS OF MONITORING AND EVALUATING HUMAN PHYSICAL HEALTH PARAMETERS AND METHOD OF USE BY DASH-CAMThe present invention relates to monitoring, evaluating and reporting physical health parameters of humans by a modular dash-cam. Further, it evaluates the driving skills of the driver using a dash-cam with unique hardware and software capabilities. The dash-cam comprises a modular rotatable thermal camera with improved Field of View (FOV) that monitors and evaluates a person's physical health parameters such as temperature, cough, mask detection, sanitization, oxygen level measurement, etc.
and an optical camera monitors security surveillance inside the vehicle and, upon flipping, starts analyzing the exterior and route. The driver and auto vehicle driving skill results are communicated to multiple surrounding vehicles and pedestrians using similar V2V communication, to alert them about potential near risky encounters. The design is scalable to multiple applications like oil/gas leakage, parking assistance, fire detection in the vehicle, video game production in low light, AI based movie review, drill automation, security monitoring, smart agriculture and integration to a PPE smart jacket.|1. A system of determining the health parameters of the driver and/or co-passengers entering into the vehicle utilizing edge and/or cloud computing comprises: a modular dash cam attached to the interior of the windshield, comprising: a rotatable thermal camera designed to capture the temperature and other health parameters inside the vehicle; and a flipable or pivoted optical camera to record the activities inside and outside of the vehicle; a processing unit connected to the memory; a plurality of sensors to detect the health parameters; a rechargeable battery assembly; wherein the collected health data is stored in an unstructured distributed way in an artificial intelligence engine and analysed by a processing unit to provide feedback in real-time to the driver.\n | 2. A modular dash camera of claim 1, which sends alerts to the in-vehicle display, driver and/or any emergency contact.\n | 3. A modular dash camera of claim 1, wherein the health parameters are such as but not limited to temperature and other body vitals.\n | 4. A modular dash camera of claim 1, which stores the identity of the driver and co-passenger with a unique identifier to protect privacy.\n | 5. A modular dash camera of claim 1, where the thermal and optical camera arrangement is modular in nature and interchangeable.\n | 6.
A modular dash camera of claim 1, wherein the sensors are for detecting oxygen levels, temperature, humidity, blood pressure, heart rate and other parameters.\n | 7. A modular dash camera of claim 1, which is part of an artificial intelligence engine, where the artificial intelligence engine is trained for distributed computing.\n | 8. A method of determining the health parameters of the drivers and/or co-passengers entering into the vehicle utilizing edge and cloud computing comprises the steps of: obtaining a complete picture of the region of interest, with an overlapping field of vision with the thermal camera; initiating a voice interaction with a microphone and speakers embedded within; concealing the identity of the driver and co-passenger by providing a unique identifier; automatically flipping the optical camera for a surround view of the interior or exterior of a place or a vehicle; determining the oxygen levels in the area of concern with an oximeter installed with the camera; and reporting via live stream on identification of a region and event of interest.\n | 9. The method of claim 8, wherein the health parameters may be temperature, cough and sneezing, blood pressure, heart rate and other parameters detectable by the dash cam.\n | 10. The method of claim 8, sending alerts to the in-vehicle display, driver and/or any emergency contact.\n | 11. The method of claim 8, wherein the health parameters are such as but not limited to temperature and other body vitals.\n | 12. The method of claim 8, storing the identity of the driver and co-passenger with a unique identifier to protect privacy.\n | 13. The method of claim 8, wherein the sensors are for detecting oxygen levels, temperature, humidity, blood pressure, heart rate and other parameters.\n | 14.
A method of determining the driving skills of the driver using a modular dash cam comprising the steps of: activating the optical camera with an inbuilt AI processing unit with machine learning software, capable of identifying the road region in front and understanding scene complexity, traffic signs, the speed of the car and various other parameters of analysis; comparing with artificial intelligence algorithms previously trained by collecting data in similar situations while driving with a skilled driver/instructor with precision driving skills; simultaneously processing thermal images from the thermal camera in the software application to derive various driver health analytics, such as whether the driver is drowsy, the driver's anxiety levels during critical situations, the precision of right or left turns or roundabouts, or whether speed limits were maintained; creating the skill test report of the driver based on the parameters of analysis and comparison; sending the report to the driver and others to improve or rate the driving skills of the driver.\n | 15. The method of claim 14, which is used by insurance companies to allow or deny the insurance in case of any accident due to the driver's mistake.\n | 16. The method of claim 14, wherein the driver can designate the controls or take the controls in an autonomous vehicle if the skills of the driver are not perfect or perfect, respectively.\n | 17. The method of claim 14, wherein the driver and auto vehicle driving skill, with the current circumstance, is communicated to multiple surrounding vehicles through V2V (vehicle to vehicle) communication technology, which can potentially alert multiple other vehicles about potential near risky encounters.", "output": "The system has a modular dash cam (810) that is attached to an interior of a windshield. A rotatable thermal camera captures temperature and health parameters inside a vehicle. A flipable or pivoted optical camera records activities inside and outside of the vehicle. 
Sensors detect the health parameters. A processing unit is connected to a memory. Collected health data is stored in an unstructured distributed way in an artificial intelligence engine and analyzed by the processing unit to provide feedback in real-time to a driver. The sensors detect oxygen levels, temperature, humidity, blood pressure, heart rate and other parameters. INDEPENDENT CLAIMS are included for the following: a method of determining the health parameters of the drivers and/or co-passengers entering into the vehicle; and a method of determining the driving skills of the driver using a modular dash cam. System for determining health parameters of the driver and/or co-passengers entering a vehicle, particularly an aeroplane, utilizing edge and cloud computing. The method of dynamic pairing between electronic devices, based on the time and proximity of the devices, reduces the possibility of unintentional communications. The system for determining the health parameters of the driver and/or co-passengers entering into the vehicle utilizing edge and cloud computing comprises a modular dash cam attached to the interior of the windshield, comprising a rotatable thermal camera designed to capture the temperature and other health parameters inside the vehicle, and a flipable or pivoted optical camera to record the activities inside and outside of the vehicle. The modular dash camera has a thermal and optical camera arrangement that is modular in nature and interchangeable. The drawing shows a schematic view of the dash-cams set up in an aeroplane. 
810 Modular dash cam; 812 Seat", "instruction": "Please summarize the input" }, { "input": "Domain controller and automatic driving vehicleThe utility model discloses a domain controller and an automatic driving vehicle, wherein the domain controller comprises: a slave SOC, for performing signal processing on the image detection signal output by the multi-path high-definition camera and the radar data signal output by the multi-path vehicular Ethernet, and outputting the corresponding environment processing signal; and a main SOC, the main SOC being connected with the secondary SOC and used for performing signal processing according to the environment processing signal and outputting the corresponding driving planning signal. The technical solution of the utility model improves the computing power and calculation precision of the domain controller of the automatic driving automobile so as to improve the driving safety of the automatic driving automobile.|1. A domain controller, applied to an automatic driving automobile, the automatic driving automobile comprising a plurality of high-definition cameras, a plurality of laser radars and a vehicle-mounted Ethernet, wherein the domain controller comprises: a slave SOC, the slave SOC being used for respectively accessing a multi-path high-definition camera and a multi-path vehicle-mounted Ethernet, for performing signal processing on the image detection signal output by the multi-path high-definition camera and the radar data signal output by the multi-path vehicle-mounted Ethernet, and outputting the corresponding environment processing signal, wherein the number of secondary SOCs is at least two; and a main SOC, the main SOC being connected with the output end of the secondary SOC and used for performing signal processing according to the environment processing signal and outputting the corresponding driving planning signal, so as to control the function modules of the automatic driving automobile to work.\n | 
2. The domain controller according to claim 1, wherein the number of secondary SOCs is two, which are respectively a first secondary SOC and a second secondary SOC; the first secondary SOC and the second secondary SOC are each electrically connected with the main SOC, the multi-path high-definition camera and the multi-path vehicle Ethernet; the first secondary SOC is used for performing signal processing on the received multi-path image detection signal and multi-path radar data signal, and outputting a corresponding first environment processing signal; the second secondary SOC is used for performing signal processing on the received multiple paths of image detection signals and multiple paths of radar data signals, and outputting a corresponding second environment processing signal; the main SOC is used for performing signal processing on the received first environment processing signal and/or second environment processing signal, and outputting the corresponding driving planning signal.\n | 3. The domain controller according to claim 2, wherein the number of secondary SOCs is four, which are respectively a first secondary SOC, a second secondary SOC, a third secondary SOC and a fourth secondary SOC; the first secondary SOC, the second secondary SOC, the third secondary SOC and the fourth secondary SOC are each electrically connected with the main SOC; the first secondary SOC is used for outputting the received multi-path image detection signal and multi-path radar data signal to the third secondary SOC and/or the fourth secondary SOC through the main SOC; the second secondary SOC is used for outputting the received multi-path image detection signal and multi-path radar data signal to the third secondary SOC and/or the fourth secondary SOC through the main SOC; the third secondary SOC and the fourth secondary SOC are respectively used for processing the received multi-path image detection signal and multi-path radar data signal, and outputting the corresponding driving planning signal through the main SOC.\n | 4. The domain controller according to claim 1, wherein the automatic driving vehicle further comprises a driving component, and the domain controller further comprises: a functional safety MCU, the functional safety MCU being electrically connected with the main SOC and used for performing signal processing on the received driving planning signals and outputting the corresponding driving control signal to the driving component, so as to control the driving route and driving speed of the driving component.\n | 5. The domain controller according to claim 4, wherein the domain controller further comprises: a CANFD interface, the CANFD interface being electrically connected with the functional safety MCU, for accessing one or more of a millimetre wave radar, an ultrasonic radar or a vehicle control ECU.\n | 6. The domain controller according to claim 4, wherein the domain controller further comprises: a FlexRay interface electrically connected with the functional safety MCU and used for accessing one or more of a laser radar, a V2X communication module or an EIMU detection system.\n | 7.
The domain controller according to claim 1, wherein the domain controller further comprises: a FAKRA interface, the FAKRA interface being used for electrically connecting with the multi-path high-definition camera and accessing the image detection signal output by the multi-path high-definition camera; and a de-serializing chip electrically connected with the FAKRA interface and the secondary SOC, respectively, for decoding the received image detection signal and outputting the image detection signal to the secondary SOC for signal processing so as to output the corresponding environment processing signal; the main SOC is used for processing the received environment processing signal and outputting the corresponding driving planning signal.\n | 8. The domain controller according to claim 1, wherein the domain controller further comprises: a plurality of storage modules, the storage modules being respectively electrically connected with the main SOC and the secondary SOC and respectively used for storing the corresponding temporary data.\n | 9. The domain controller according to claim 1, wherein the domain controller further comprises: a plurality of power supply management modules, the power supply management modules being respectively electrically connected with the main SOC and the secondary SOC, and respectively used for accessing the direct current power supply and controlling the direct current power supply to access/stop accessing the main SOC and/or the secondary SOC.\n | 10.
An automatic driving vehicle, comprising multiple high-definition cameras, a vehicle Ethernet and the domain controller according to any one of claims 1 to 9.", "output": "The controller has a slave system-on-chip (SOC) (10) which performs data processing on the image detection signal output by the multi-channel high-definition camera and the radar data signal output by the multi-channel vehicle Ethernet, and outputs the corresponding environment processing signal. A master SOC (20) is connected to the slave SOC. The master SOC performs signal processing according to the environment processing signal, and outputs the corresponding driving planning signal, to control the operation of the functional modules of the autonomous driving vehicle. Domain controller for automatic driving automobile (claimed). The computing power and the calculation accuracy of the domain controller of the automatic driving automobile are improved, thus improving the driving safety of the automatic driving automobile. The drawing shows a block diagram of the domain controller. (Drawing includes non-English language text) 10 Slave SOC; 11 First Slave SOC; 12 Second Slave SOC; 13 Third Slave SOC; 20 Master SOC; 30 Functional safety MCU; 50 Deserialization chip; 60 Storage Module; 70 Power management module", "instruction": "Please summarize the input" }, { "input": "TRAFFIC CONTROL USING SOUND SIGNALSMethods for vehicle to vehicle communication, vehicle detection, and vehicle to traffic sign communication are devised. Such methods can involve the use of one or a plurality of speakers to emit artificial sound signals, as well as the use of one or a plurality of sound detectors to record artificial or natural sound signals emitted by nearby vehicles or traffic signs. The use of an active sonar system will also allow autonomous vehicles to detect nearby surroundings. The Doppler Effect can also be used to determine the speeds of moving vehicles.
These methods allow autonomous vehicles to drive and respond to their surroundings, and also allow traffic signs to respond to various traffic situations by detecting the presence of nearby vehicles.|1. A method for automobiles for detecting nearby traffic conditions that comprises the following steps: \nrecord sound signals measured by one or a plurality of sound detectors in the automobile, \nuse the signal processing capabilities of the automobile to analyze the recorded sound signals to identify sound signals emitted by nearby traffic signs or vehicles, \nuse the sound signals emitted by nearby traffic signs or vehicles to assess surrounding traffic conditions, and \nprovide traffic information to direct the driving of the automobile. \n | 2. The method in claim 1 wherein the step of recording sound signals comprises the step of recording sound signals measured by two or more microphones in the automobile.\n | 3. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of comparing the recorded sound signals to a database of already known vehicle noise patterns to determine the types of nearby vehicles.\n | 4. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing sound signals coming from different vehicles in order to estimate the number of nearby vehicles.\n | 5. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of using the Doppler Effect to determine the relative speeds of nearby vehicles.\n | 6. The method in claim 1 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing sound signals that are in a pre-defined format coming from nearby vehicles.\n | 7. 
The method in claim 6 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing amplitude modulated sound signals coming from nearby vehicles.\n | 8. The method in claim 6 wherein the step of using the signal processing capabilities of the automobile to analyze recorded sound signals comprises a step of distinguishing frequency modulated sound signals coming from nearby vehicles.\n | 9. The method in claim 1 further comprises a step that uses active sonar to transmit a sound signal and detect the echo of the transmitted sound in order to detect the surroundings of the automobile.\n | 10. The method in claim 1 further comprises a step of transmitting a sound signal that is in a pre-defined format for communicating with nearby vehicles or traffic signs.\n | 11. The method in claim 10 comprises a step of transmitting an amplitude modulated sound signal that is in a pre-defined format in order to communicate with nearby vehicles or traffic signs.\n | 12. The method in claim 10 comprises a step of transmitting a frequency modulated sound signal that is in a pre-defined format for communicating with nearby vehicles or traffic signs.\n | 13. The method in claim 1 is implemented on an autonomous automobile.\n | 14. The method in claim 1 further comprises a step of receiving sound signals transmitted by traffic signs.\n | 15. The method in claim 14 comprises a step of receiving amplitude modulated sound signals transmitted by traffic signs.\n | 16. The method in claim 14 comprises a step of receiving frequency modulated sound signals transmitted by traffic signs.\n | 17. 
A method for detecting nearby traffic conditions for an automobile that comprises the following steps: \ntransmit sound signals by one or a plurality of sound transmitting devices, \nrecord echoes of said transmitted sound signals measured by one or a plurality of sound detectors in the automobile, \nuse the signal processing capabilities of the automobile to analyze the recorded echoed sound signals to assess the surroundings of the automobile. \n | 18. The method in claim 17 wherein the step of recording sound signals comprises the step of recording sound signals measured by two or more microphones in the automobile.\n | 19. The method in claim 17 wherein the step of transmitting sound signals comprises a step of including identification information in the transmitted sound signals.\n | 20. The method in claim 17 wherein the step of using the signal processing capabilities of the automobile to analyze echoed sound signals comprises a step of using the Doppler Effect to determine the relative speeds of nearby vehicles.", "output": "The method involves recording sound signals measured by multiple sound detectors in the automobile. The signal processing capabilities of the automobile are used to analyze the recorded sound signals to identify sound signals emitted by nearby traffic signs (504) or vehicles (505). The sound signals emitted by nearby traffic signs or vehicles are used for assessing surrounding traffic conditions. The traffic information is provided to direct the driving of the automobile. The recorded sound signals are compared to a database of already known vehicle noise patterns to determine the types of nearby vehicles. The Doppler effect is used for determining the relative speeds of nearby vehicles. An INDEPENDENT CLAIM is included for a method for detecting nearby traffic conditions for an automobile. Method for automobiles for detecting nearby traffic conditions by using sound signals with the help of the Doppler effect.
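The Doppler-based relative-speed determination of claim 20 can be illustrated with a small sketch. This is an illustration only, not the claimed method: the known source frequency, the stationary-observer geometry, and the speed-of-sound constant are assumptions.

```python
SPEED_OF_SOUND = 343.0  # m/s in dry air at 20 degrees C (assumed constant)

def relative_speed(f_source, f_observed, c=SPEED_OF_SOUND):
    """Estimate how fast a sound source moves straight toward a
    stationary observer (negative result: moving away), from the
    shift between the emitted and the observed frequency.

    Classic Doppler relation for a moving source:
        f_obs = f_src * c / (c - v)
    which rearranges to:
        v = c * (1 - f_src / f_obs)
    """
    return c * (1.0 - f_source / f_observed)

# A 1000 Hz engine tone heard at 1030 Hz: the source is closing
# at roughly 10 m/s (about 36 km/h).
print(relative_speed(1000.0, 1030.0))
```

In practice the source frequency would come from the pre-defined signal format or the noise-pattern database mentioned above, since a raw engine tone's emitted frequency is not known in advance.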
By knowing the type, speed, distance, and direction of each nearby vehicle, the mobile phone is able to rank the level of potential danger that each vehicle poses and provides warnings for the user. The drawing shows a symbolic diagram that shows the traffic conditions near an intersection. 100Pedestrians cell phone109Earphones504Traffic signs505Vehicles581Pedestrian", "instruction": "Please summarize the input" }, { "input": "Mine automatic driving vehicle coordination planning method based on vehicle road cooperationThe invention claims a coordinated planning method of automatic driving vehicle under mine based on vehicle road cooperation, comprising the following steps: automatically driving the vehicle road to obtain a high precision map under mine provided by a cloud platform, planning a global smooth navigation path based on the road centre line in the high precision map, and realizing smooth reference line; the automatic driving vehicle bottom planner performs path and speed decision planning, and sends the track information output by the bottom planner as output to the control module; automatically driving the vehicle to run normally according to the planning track of the bottom layer of the vehicle, and performing advanced planning according to the specific condition; respectively planning by path and speed decoupling, iteratively solving the feasible self-vehicle track; the self-vehicle track of each automatic driving vehicle is input to the control module for executing the transverse and longitudinal control of the automatic driving vehicle to finish the vehicle meeting action through the mine crossing. The invention effectively plans the collision-free track of multiple automatic driving vehicles and improves the running efficiency of the automatic driving vehicle under the interactive scene.|1. 
A coordinated planning method of automatic driving vehicle under mine based on vehicle road cooperation, wherein it comprises the following steps: the automatic driving vehicle road obtains the high precision map under the mine provided by the cloud platform, based on the central line of the road in the high precision map, planning the global smooth navigation path, realizing the complete reference line smoothing; according to the planned smooth reference line, the automatic driving vehicle bottom planner performs path and speed decision planning, the track information output by the bottom planner is used as output and sent to the control module; automatically driving the vehicle to run normally according to the planning track of the bottom layer of the vehicle, triggering the coordination node when meeting the narrow tunnel meeting and the crossing meeting scene, and performing the advanced planning according to the specific condition; the planning result of the high-level planning period replaces the original reference line of the automatic driving vehicle, the path and the speed decoupling are used for planning respectively, and the feasible self-vehicle track is solved iteratively; The vehicle track of each automatic driving vehicle is input to the control module, the control module executes the transverse and longitudinal control of the automatic driving vehicle to finish the vehicle meeting action through the mine crossing.\n | 2. 
The coordinated planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 1, wherein the automatic driving vehicle road obtains the high precision map under mine provided by the cloud platform, based on the road central line in the high precision map, for planning the global smooth navigation path, realizing complete reference line smoothing, comprising the following steps: the automatic driving vehicle road obtains the high precision map under the mine provided by the cloud platform, based on the road central line in the high precision map, firstly planning the global smooth navigation path; wherein the road central line is a discrete point set for smoothing processing as the reference line, the discrete point set of the road central line is connected by cubic polynomials, with uniform sampling used to densify the central line discrete points; a cubic polynomial connects adjacent discrete points (xi, yi) and (xi + 1, yi + 1): y = f(x) = a0 + a1x + a2x^2 + a3x^3, wherein a0, a1, a2 and a3 respectively represent 0-order term coefficient, 1-order term coefficient, 2-order term coefficient and 3-order term coefficient of the cubic polynomial; planning the smooth reference line, searching the self-vehicle projection point in the discrete point set in a planning period, segmenting based on the projection point, taking the path after segmenting as the path section to be smoothed; converting the reference line smoothing problem into a quadratic programming problem based on the sectioned path section composed of the densified discrete points, solving according to the cost function and the constraint condition of the sectioned central line point set smoothing to obtain the smooth reference line point set; finally, splicing the reference line segments with different periods so as to realize smooth reference line.\n | 3. 
The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 2, wherein the cost function of the centre line point set smoothing is as follows: wherein w1, w2, w3 are the weight of each item in the cost function, xi, yi and xref, yref are the horizontal and vertical coordinates of the reference line and the density central line, respectively.\n | 4. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 3, wherein the high precision map under mine stores the road data and the fixed alignment information of the tunnel under mine as structured data, in the process of performing reference line smoothing processing on the road central line, the projection point of the vehicle on the road central line in each automatic driving vehicle planning period is used as the starting point, the point set in a certain range before and after the smooth starting point, and the point set after smooth is used as the reference line.\n | 5. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 1, wherein the automatic driving vehicle bottom planner performs path and speed decision planning according to the planned smooth reference line. 
The track information output by the bottom planner is used as an output and is sent to the control module, which comprises the following steps: according to the planned smooth reference line, the automatic driving vehicle bottom planner performs path and speed decision planning based on Frenet coordinate system taking the navigation path as coordinate axis, and sends the track information output by the bottom planner as output to the control module; The bottom planner adopts the SLT dimension reduction method to decide the planning process as follows: (1) using SLT dimensionality reduction method to divide into SL layer and ST layer for planning, then constructing path and speed planning problem in SL coordinate system and ST coordinate system: wherein l represents the transverse offset of the automatic driving vehicle path relative to the central line of the road, s represents the longitudinal offset of the automatic driving vehicle path along the central line of the road; t represents the moment corresponding to the longitudinal offset in the speed plan; (2) based on static and low-speed obstacle projection, establishing SL image and discretizing the state space, adopting heuristic search method and numerical optimization method for path decision planning; (3) based on dynamic obstacle track prediction, establishing ST image and discretizing the state space, adopting heuristic search method and numerical optimization method for speed decision planning.\n | 6. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 5, wherein the Cartesian coordinate system is converted into Frenet coordinate system in the planning process of the bottom planner. before the track information is sent to the control module, the Frenet coordinate system is converted into the global Cartesian coordinate system.\n | 7. 
The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 5, wherein the path planning in the SL diagram and the speed planning in the ST diagram set different non-uniform sampling scales according to the tunnel scene, firstly performing state space discretization, distributing the cost value of each discrete point according to the cost function, adopting the improved A* algorithm heuristic search to quickly obtain the initial solution; the initial solution is used as the decision solution to open the safe space, the original problem based on the safe space is converted into the convex optimized problem, the optimal track solution is obtained by the convex optimized solving method under the constraint condition; In the numerical optimization process of path planning in the SL diagram, the cost function is: wherein w1, w2, w3, w4, w5 are the weight of each item in the cost function, li, lcentre respectively represent the transverse offsets of the path and the reference line in the SL image; In the numerical optimization process of the speed planning in the ST diagram, the cost function is: wherein w1, w2, w3, w4 are the weights of each item in the cost function, si, vref respectively represent the longitudinal displacement of the path and the reference speed in the ST image.\n | 8. 
The coordination planning method of automatic driving vehicle under mine based on vehicle-road cooperation according to claim 7, wherein when meeting narrow tunnel meeting vehicle and road junction meeting vehicle scene, triggering the coordination node, and performing advanced planning according to the specific condition, comprising the following steps: automatically driving the vehicle to drive normally according to the planning track of the bottom layer of the vehicle, triggering the coordination node when meeting the narrow tunnel meeting and the intersection meeting scene, firstly judging whether the original track of the automatic driving vehicle is conflicted, if there is no conflict, driving according to the original track; if there is conflict, forming a conflict area in the vehicle interaction area, forming a buffer coordination area in front of the conflict area of the mine narrow tunnel meeting and the road junction meeting scene; one or more automatic driving vehicles in the mine tunnel drive into the buffer coordination area to reduce speed or stop, and wait for coordination in turn; the coordination node receives the driving maneuvering state of all automatic driving vehicles in the intersection buffer coordination area through V2I communication, wherein the high-level planner performs coordination planning for the vehicles in all buffer coordination areas, generating the coordination reference track of the traffic conflict area, and the coordination reference track only considers the automatic driving vehicle in the buffer coordination area, and does not consider other static or dynamic obstacles; wherein the path generation of the advanced planner adopts a smooth optimization method based on straight line and circular arc, firstly sampling the straight line and circle: Knots: ((xk, m, yk, m, sk, m) m = 0, 1, ..., nk) Anchor points: ((xa, j, ya, j, sa, j) j = 0, 1, ..., na) wherein Knots and Anchor points represent the node and anchor point of the 
divided straight line and circle, m and j represent the number of the corresponding node and anchor point, (xk, m, yk, m, sk, m) respectively represent the transverse and longitudinal coordinates of the node and the total length of the divided straight line, (xa, j, ya, j, sa, j) respectively represent the transverse and longitudinal coordinates of the anchor point and the total length of the divided circle; a reference path between every two adjacent nodes is connected by a quintic polynomial, and then a smooth feasible path is searched near a straight line and a circular path by an optimization method; the advanced planner performs speed planning on all automatic driving vehicles in the range in the ST graph based on each vehicle path planning result, firstly, the interaction between each vehicle and the conflict area is projected into the ST graph, then the state space discretization is performed and the automatic driving vehicle passing sequence is determined, orderly performing the initial solution search and optimization of the speed; and the speed planning of the high-level planner satisfies the constraint condition that the conflict area is only occupied by the same vehicle at the same time; the automatic driving vehicle enters the buffer coordination area, the coordination node realizes V2I communication with the automatic driving vehicle in the area through the PC5 direct connection communication interface of C-V2X, the road side sensing and vehicle-mounted sensing are connected together by using V2I communication technology, realizing low time delay of data transmission, high reliable requirement, establishing reliable information transmission channel, realizing multi-dimensional, all-aspect sensing information sharing and cooperative scheduling control; when the coordination node judges that there is conflict relation between the automatic driving vehicle track of each intersection and the track of other automatic driving vehicles, the high-level 
planner determines the planning starting point based on the current mobile state of the automatic driving vehicle, and replans and coordinates all vehicles in the buffer coordination area.\n | 9. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 8, wherein the planning result of the high-level planning period replaces the original reference line of the automatic driving vehicle, adopts path and speed decoupling to respectively plan, and iteratively solves the feasible self-vehicle track, The method comprises the following steps: the planning result of the advanced planning period replaces the original reference line of the automatic driving vehicle, each automatic driving vehicle establishes a Frenet coordinate system according to the coordinated reference line, the intersection obstacle information sensed by the vehicle-mounted sensor is projected into the SL image and ST image, the vehicle bottom planner performs replanning, respectively planning by path and speed decoupling, and iteratively solving the feasible self-vehicle track; The time for each automatic driving vehicle output by the advanced planner to enter and exit the conflict area is used as the limit area of the ST diagram in the bottom planning process, so as to ensure that there is no conflict between the output track of the automatic vehicle re-planning and the output track of other automatic driving vehicles: wherein tsl, tel represents the time domain boundary of the passing conflict area of the self-vehicle in the advanced planner speed planning result, tin, tout represents the interaction time of the self-vehicle bottom planner speed planning result and the conflict area; The passing speed of the conflict area should satisfy the speed constraint condition: wherein represents the speed planning result of the bottom planner, and is less than the speed limiting v1 of the conflict area; the high-level planner plans each 
automatic driving vehicle coordination reference track under the corresponding scene, each automatic driving vehicle bottom planner uses the coordination reference track as input to perform self-vehicle replanning, so as to avoid various obstacles when passing through the conflict area, the high-level planner and the bottom planner ensure that each vehicle passes through the conflict area safely and harmoniously in turn.\n | 10. The coordination planning method of automatic driving vehicle under mine based on vehicle road cooperation according to claim 1, wherein the self-vehicle track of each automatic driving vehicle is input to the control module; the control module executes the transverse and longitudinal control of the automatic driving vehicle to finish the vehicle meeting action through the mine crossing, the self-vehicle track of the automatic driving vehicle is converted by the coordinate system to be input to the control module, wherein the transverse control uses the model prediction control method, the longitudinal control uses the PID control method.", "output": "The method involves obtaining a high-precision map of the underground mine road by an automatic driving vehicle. A global smooth navigation path is planned based on a central line of the road in the high precision map. A planned smooth reference line is realized. A path and speed decision planning process is performed according to the planning track of the bottom layer of the vehicle. A vehicle track of each automatic vehicle is input to a control module. The control module executes transverse and longitudinal control of the automatic vehicle to complete vehicle meeting action through a mine crossing. Coordinated planning method of automatic driving vehicle under mine based on vehicle road cooperation. 
The collision-free track of multiple automatic driving vehicles is effectively planned and the running efficiency of the automatic driving vehicle under the interactive scene is improved. The drawing shows a flow diagram of a planning method. (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "REDUNDANT COMMUNICATION METHOD, APPARATUS AND SYSTEM FOR COOPERATIVE AUTONOMOUS DRIVING PLATOONINGThe present disclosure relates to Internet of Vehicles technology, and provides a method, an apparatus, and system for redundant communication for platooning. The method includes: transmitting application data to be transmitted to at least two V2V devices; and controlling the at least two V2V devices that have received the application data to transmit the application data to a predetermined air interface, such that a receiving apparatus obtains the application data from the air interface. With the redundant configuration of the V2V devices, the problem caused by communication failure of one single V2V device can be avoided, so as to ensure stability of V2V communication and guarantee safe operation for platooning.|1-25. (canceled)\n | 26. A transmitting apparatus, comprising a first processing device and at least two V2V devices, wherein \nthe first processing device is configured to transmit application data to the at least two V2V devices, and \nthe at least two V2V devices are configured to transmit the application data to a predetermined air interface, such that a receiving apparatus obtains the application data from the air interface. \n | 27. The transmitting apparatus of claim 26, wherein the first processing device is further configured to: \nconvert the application data into an Ethernet message, and \ntransmit the Ethernet message to the at least two V2V devices. \n | 28. 
The transmitting apparatus of claim 27, wherein the at least two V2V devices are further configured to: \npacketize the Ethernet message into a V2X message; and \ntransmit their respectively packetized V2X messages using different frequency bands to air interfaces corresponding to the different frequency bands. \n | 29. The transmitting apparatus of claim 28, wherein each of the at least two V2V devices comprises a plurality of antennas, and the at least two V2V devices are further configured to: \ntransmit their respectively packetized V2X messages using the different frequency bands to the air interfaces corresponding to the different frequency bands via the plurality of antennas provided at each of the at least two V2V devices, wherein each V2V device occupies one frequency band, and the plurality of antennas of each V2V device occupy a same frequency band. \n | 30. A receiving apparatus, comprising a second processing device and at least two V2V devices, wherein \neach of the at least two V2V devices is configured to obtain application data from an air interface, and \nthe second processing device is configured to obtain, from each of the at least two V2V devices, the application data corresponding to the V2X device, and fuse and verify the application data to obtain valid data. \n | 31. The receiving apparatus of claim 30, wherein: \nthe air interfaces correspond to a plurality of frequency bands; \neach of the at least two V2V devices occupies different one of the plurality of frequency bands and comprises a plurality of antennas; \nthe plurality of antennas of each V2V device occupy a same frequency band; and \neach of the at least two V2V devices is further configured to: \nreceive V2X messages from air interfaces corresponding to different frequency bands via a plurality of antennas; and \nperform signal fusion on the V2X messages received via the plurality of antennas of the V2V device, to form application data information corresponding to the V2V device. 
\n | 32. The receiving apparatus of claim 31, the second processing device is further configured to: \ncontrol each of the at least two V2V devices to decode the application data information corresponding to the V2V device, and packetize the decoded application data information into an Ethernet message; and \nreceive, from each of the at least two V2V devices, the Ethernet message corresponding to the V2V device. \n | 33. The receiving apparatus of claim 32, wherein the second processing device is further configured to: \ndetermine, at an end of a current detection period, one or more V2V devices corresponding to the Ethernet message received in the current detection period, the detection period being a predetermined message communication period; \nperform, when only one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the only one V2V device as received in the current detection period to form a first detection result; \ndetermine whether the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data or invalid data based on the first detection result. \n | 34. The receiving apparatus of claim 33, wherein the second processing device is further configured to: \nperform, when more than one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the more than one V2V device as received in the current detection period to form a second detection result; \ndetermine an Ethernet message to be discarded and an Ethernet message to be verified based on the second detection result; and \ndiscard the Ethernet message to be discarded, and verify the Ethernet message to be verified to obtain valid data or invalid data. \n | 35. 
The receiving apparatus of claim 33, wherein the second processing device is further configured to: \ndetermine whether an identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as an expected message identity known in advance; \nset a status flag corresponding to the only one V2V device to a first flag indicating same identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as the expected message identity known in advance; and \nmaintain a flag corresponding to the only one V2V device as an initial flag to indicate different identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is different from the expected message identity known in advance. \n | 36. The receiving apparatus of claim 35, wherein the second processing device is further configured to: \ndetermine whether the flag corresponding to the only one V2V device is the initial flag or the first flag; \ndetermine that the Ethernet message corresponding to the only one V2V device as received in the current detection period is invalid data when the flag corresponding to the only one V2V device is the initial flag; and \ndetermine that the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data when the flag corresponding to the only one V2V device is the first flag. \n | 37. 
The receiving apparatus of claim 33, wherein the second processing device is further configured to: \ndetermine whether an identity of the Ethernet message corresponding to each V2V device as received in the current detection period is same as an expected message identity known in advance; \nset a status flag corresponding to each V2V device to a first flag indicating same identity when the identity of the Ethernet message corresponding to the V2V device as received in the current detection period is same as the expected message identity known in advance; and \nmaintain a flag corresponding to each V2V device as an initial flag to indicate different identity when the identity of the Ethernet message corresponding to the V2V device as received in the current detection period is different from the expected message identity known in advance. \n | 38. The receiving apparatus of claim 37, the second processing device is further configured to: \ndetermine whether the flag corresponding to each V2V device is the initial flag or the first flag; \ndetermine that the Ethernet message corresponding to each V2V device as received in the current detection period is an Ethernet message to be discarded when the flag corresponding to the V2V device is the initial flag; and \ndetermine that the Ethernet message corresponding to each V2V device as received in the current detection period is an Ethernet message to be verified when the flag corresponding to the V2V device is the first flag. \n | 39. 
The receiving apparatus of claim 37, the second processing device is further configured to: \ncalculate data bits in the Ethernet message to be verified corresponding to each V2V device in accordance with a predetermined algorithm to obtain a calculation result corresponding to the V2V device, the predetermined algorithm comprising addition, multiplication, MD5 message digest algorithm; \ncompare the calculation results; \ndetermine the Ethernet message to be verified corresponding to each V2V device to be same, and determine the same Ethernet messages to be verified corresponding to the V2V device as valid data, when the calculation results are same; and \ndetermine the Ethernet message to be verified corresponding to each V2V device as invalid data, when different calculation results exist in the calculation results. \n | 40. A method for redundant communication for platooning, comprising: \ncontrolling at least two V2V devices to obtain application data from an air interface; and \nobtaining, from the at least two V2V devices, the application data corresponding to the V2X device; and \nfusing and verifying the application data to obtain valid data. \n | 41. 
The method of claim 40, wherein: \nthe air interfaces correspond to a plurality of frequency bands; \neach of the at least two V2V devices occupies different one of the plurality of frequency bands and comprises a plurality of antennas; \nthe plurality of antennas of each V2V device occupy a same frequency band; and \nsaid controlling the at least two V2V devices to obtain the application data from the air interface comprises: \ncontrolling the at least two V2V devices to receive V2X messages from air interfaces corresponding to different frequency bands via a plurality of antennas of each V2V device; and \ncontrolling each of the at least two V2V devices to perform signal fusion on the V2X messages received via the plurality of antennas of the V2V device, to form application data information corresponding to the V2V device. \n | 42. The method of claim 41, wherein said obtaining, from the at least two V2V devices, the application data corresponding to the V2X device comprises: \ncontrolling each of the at least two V2V devices to decode the application data information corresponding to the V2V device, and packetize the decoded application data information into an Ethernet message; and \nreceiving, from each of the at least two V2V devices, the Ethernet message corresponding to the V2V device via a router or a switch. \n | 43. 
The method of claim 42, wherein said fusing and verifying the application data to obtain the valid data comprises: \ndetermining, at an end of a current detection period, one or more V2V devices corresponding to the Ethernet message received in the current detection period, the detection period being a predetermined message communication period; \nperforming, when only one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the only one V2V device as received in the current detection period to form a first detection result; \ndetermining whether the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data or invalid data based on the first detection result; \nperforming, when more than one V2V device corresponds to the Ethernet message received in the current detection period, message identity detection on the Ethernet message corresponding to the more than one V2V device as received in the current detection period to form a second detection result; \ndetermining an Ethernet message to be discarded and an Ethernet message to be verified based on the second detection result; and \ndiscarding the Ethernet message to be discarded, and verifying the Ethernet message to be verified to obtain valid data or invalid data. \n | 44. 
The method of claim 43, wherein \nsaid performing the message identity detection on the Ethernet message corresponding to the only one V2V device as received in the current detection period to form the first detection result comprises: \ndetermining whether an identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as an expected message identity known in advance; \nsetting a status flag corresponding to the only one V2V device to a first flag indicating same identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is same as the expected message identity known in advance; and \nmaintaining a flag corresponding to the only one V2V device as an initial flag to indicate different identity when the identity of the Ethernet message corresponding to the only one V2V device as received in the current detection period is different from the expected message identity known in advance, and \nsaid determining whether the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data or invalid data based on the first detection result comprises: \ndetermining whether the flag corresponding to the only one V2V device is the initial flag or the first flag; \ndetermining that the Ethernet message corresponding to the only one V2V device as received in the current detection period is invalid data when the flag corresponding to the only one V2V device is the initial flag; and \ndetermining that the Ethernet message corresponding to the only one V2V device as received in the current detection period is valid data when the flag corresponding to the only one V2V device is the first flag. \n | 45. 
A non-transitory computer readable storage medium, having a computer program stored thereon, the program comprising code configured to perform a method for redundant communication for platooning of claim 40.", "output": "The method involves sending application data to at least two vehicle-to-vehicle (V2V) devices (101). The at least two V2V devices are controlled to receive the application data and apply it (102). The data is sent to a preset air interface so that the receiving device obtains the application data from the air interface. To send the application data to the at least two V2V devices, the application data is converted into application data Ethernet messages, which a router or switch forwards to the at least two V2V devices. INDEPENDENT CLAIMS are included for the following: a sending end device; a receiving end device; a computer readable storage medium on which a computer program is stored; and a redundant communication system for cooperative autonomous driving vehicles. Redundant communication method for a collaborative autonomous driving fleet. The method ensures stability of V2V communication. The drawing shows the flow chart of the method. 
101Involves sending application data to at least two Vehicle-to-vehicle devices102At least two Vehicle-to-vehicle devices are controlled to receive the application data and apply it", "instruction": "Please summarize the input" }, { "input": "Road cloud cooperative automatic driving control method of road end main controlThe invention claims a road cloud cooperative automatic driving control method and method for road end main control, comprising the following steps: S1, building a cloud platform database: S1.3, an ID is allocated for each vehicle in the static road environment model and a corresponding file library is established according to the ID, the vehicle comprises an automatic driving vehicle and a non-automatic driving vehicle, the automatic driving vehicle is allocated with a permanent ID, the non-automatic driving vehicle is allocated with a temporary ID, when the non-automatic driving vehicle drives out of the whole control area for 1 week, the cloud platform automatically deletes the temporary ID of the non-automatic driving vehicle and the corresponding file base; S2, determining the local path planning of the controlled vehicle; The invention endows the automatic driving vehicle and the non-automatic driving vehicle with the ID, and establishes a matched file library, and the data of the non-automatic driving vehicle is periodically deleted according to the rule, which ensures that the database will not store excessive and unused information, which is convenient for the rapid management and application between the edge computing centre and the cloud platform, The transmission is updated.|1. 
A vehicle road cloud cooperative automatic driving control method for road end main control, wherein The control architecture comprises: S1, constructing a cloud platform database: S1.1, the cloud platform collects the high-precision map through the road side terminal, extracts the content related to the driving and removes the unrelated information; S1.2, establishing a dimension-reducing static road environment model according to the content related to the driving; S1.3, distributing ID for each vehicle in the static road environment model and establishing corresponding file library according to the ID, the vehicle comprises an automatic driving vehicle and a non-automatic driving vehicle, the automatic driving vehicle is distributed with permanent ID, the non-automatic driving vehicle is distributed with temporary ID, when the non-automatic driving vehicle drives out of the whole control area for 1 week, the cloud platform automatically deletes the temporary ID of the non-automatic driving vehicle and the corresponding file base; S1.4, the information of the static road environment model updated by S1.3 is recorded to the database in real time and shared to each edge computing centre according to the area fragment, and returned to S1.2 for updating by timing or event trigger; S2, determining the local path planning of the controlled vehicle: S2.1, the cloud platform determines the controlled vehicle according to the application requirement, and determines the vehicle scheduling instruction and the overall path planning according to the static road environment model; S2.2, the edge computing centre receives the road environment model, the vehicle scheduling instruction and the overall path planning; establishing a real-time dynamic traffic environment model according to the road environment model and the real-time collected dynamic data; S2.3, the edge computing centre according to the real-time dynamic traffic environment model, according to the vehicle 
scheduling instruction and total path planning for each automatic driving vehicle for local path planning; S3, the controlled vehicle executes the local path planning.\n | 2. The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S1.3, the archive of the auto-driving vehicle includes basic information and dynamic information of the vehicle, and the archive of the non-auto-driving vehicle includes vehicle information sensed by the roadside device.\n | 3. The road cloud cooperative automatic driving control method for road end main control according to claim 2, wherein the basic information comprises vehicle type, size, power parameter, braking parameter, steering ability, real-time electric quantity, fault state, history state and maintenance record; the dynamic information is the self state data uploaded by the vehicle in real time.\n | 4. The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S1.4, the event is an important road event reported by an edge computing centre, a vehicle or a person.\n | 5. The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S2.3, the local path planning is updated at a frequency of 50 Hz, the local path planning comprises ten planning path points, each planning path point carries a coordinate point and a time to reach the path point, and the adjacent distance of the ten planning path points is inversely proportional to the vehicle speed.\n | 6. 
The road cloud cooperative automatic driving control method for road end main control according to claim 1, wherein in S1-S3, the information with high time delay requirement of the controlled vehicle realizes V2I direct connection communication through the PC5 interface of the 5G-OBU module and the edge computing centre, the information with low time delay requirement of the controlled vehicle realizes V2N communication through the Uu interface of the 5G-OBU and the cloud platform; each edge computing centre communicates with the cloud platform through the optical fibre Ethernet.", "output": "The vehicle road cloud cooperative automatic driving control method involves constructing a cloud platform database. A cloud platform collects a high-precision map through a road side terminal. A real-time dynamic traffic environment model is established according to a road environment model and real time collected dynamic data. A local path planning of a controlled vehicle is determined. The cloud platform determines the controlled vehicle according to an application requirement. A vehicle scheduling instruction and an overall path planning are determined according to the static road environmental model. The edge computing center receives the road environment models, the vehicle scheduling instructions and the overall path plans. Vehicle road cloud cooperative automatic driving control method for road end main control of vehicle, such as automobile. The data of the non-automatic driving vehicle is periodically deleted according to the rule, which ensures that the database will not store excessive and unused information, which is convenient for the rapid management and application between the edge computing center and the cloud platform. The drawing shows a flow chart of the vehicle road cloud cooperative automatic driving control method. 
(Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Method of using a multi-input and multi-output antenna (MIMO) array for high-resolution radar imaging and wireless communication for advanced driver assistance systems (ADAS) and autonomous drivingA method of using a multi-input multi-output (MIMO) antenna array for high-resolution radar imaging and wireless communication for advanced driver assistance systems (ADAS) utilizes a MIMO radar and at least one base station. The MIMO radar establishes wireless communication with the base station via an uplink signal. Likewise, the base station sends a downlink signal to the MIMO radar. Further, unlike conventional vehicle-to-everything (V2X) systems that filter the reflected uplink signal, the MIMO radar uses the reflected uplink signal to detect a plurality of targets. Accordingly, the MIMO radar derives spatial positioning data for each target from the reflected uplink signal.What is claimed is: \n | 1. 
A method of using a multi-input and multi-output (MIMO) antenna array for radar imaging and wireless communication for advanced driver assistance systems (ADAS) and autonomous driving, the method comprises the steps of: \n(A) providing a multi-input and multi-output (MIMO) radar and at least one base station; \n(B) transmitting an uplink signal from the MIMO radar to the at least one base station; \n(C) receiving a downlink signal from the at least one base station with the MIMO radar; \n(D) receiving a reflected uplink signal with the MIMO radar, wherein the reflected signal is reflected off objects surrounding the MIMO radar; \nproviding a plurality of transmitters, a plurality of receivers, and a RF controller for the MIMO radar; \nproviding a PN-code regulator managed by the MIMO radar, wherein the reflected uplink signal is encoded with a spread spectrum coding scheme; \nreceiving an ambient signal with the MIMO radar; \ncancelling a cross-talk portion of the ambient signal with the RF controller during step (D), wherein the cross-talk portion is generated from direct communication between the plurality of transmitters and the plurality of receivers; \nfiltering the reflected uplink signal from the ambient signal with the RF controller during step (D); \ndespreading the reflected uplink signal through the PN-code regulator with the RF controller; \nestimating a detection time delay for the spatial positioning data for each target with the RF controller; \n(E) processing communication data from the downlink signal with the MIMO radar; \n(F) detecting a plurality of targets within the reflected uplink signal with the MIMO radar; and \n(G) deriving spatial positioning data for each target from the reflected uplink signal with the MIMO radar. \n | 2. 
The method as claimed in claim 1 further comprises the steps of: \nproviding a pseudo-noise (PN) generator managed by the MIMO radar; and \nencoding the uplink signal through the PN generator with the MIMO radar during step (B), wherein a spread spectrum coding scheme is applied to the uplink signal by the PN generator. \n | 3. The method as claimed in claim 1 comprises: \nproviding a RF controller for the MIMO radar; \nreceiving an ambient signal with the MIMO radar; and \nfiltering the downlink signal from the ambient signal with the RF controller during step (C). \n | 4. The method as claimed in claim 1 further comprises the steps of: \nproviding an adaptive noise canceller for the MIMO radar; and \ncapturing the cross-talk portion of the ambient signal with the adaptive noise canceller. \n | 5. The method as claimed in claim 1 further comprises the steps of: \nexecuting a plurality of iterations for steps (B) through (G); \ntransmitting an omni-directional uplink signal during step (B) of an initial iteration, wherein the initial iteration is from the plurality of iterations; \nreceiving a reflected omni-directional uplink signal during step (D) of the initial iteration; and \ndetecting a plurality of targets during step (F) of the initial iteration. \n | 6. 
The method as claimed in claim 1 further comprises the steps of: \nexecuting a plurality of iterations for steps (B) through (G); \nbeamforming a uni-directional uplink signal towards each target detected in a previous iteration during step (B) of an arbitrary iteration, wherein the arbitrary iteration is any iteration from the plurality of iterations, and wherein the previous iteration precedes the arbitrary iteration in the plurality of iterations; \nreceiving a uni-directional reflected uplink signal for each target detected in the previous iteration during step (D) of the arbitrary iteration; and \ndetecting a plurality of targets during step (F) of the arbitrary iteration, wherein each target corresponds to the uni-directional reflected uplink signal for each target detected in the previous iteration.", "output": "The method involves transmitting an uplink signal from a Multi-Input Multi-Output (MIMO) radar to at least one base station, and receiving a downlink signal from the at least one base station with the MIMO radar. A reflected uplink signal reflected off objects surrounding the MIMO radar is received with the MIMO radar. Communication data from the downlink signal is processed with the MIMO radar. Multiple targets within the reflected uplink signal are detected with the MIMO radar. Spatial positioning data for each target is derived from the reflected uplink signal with the MIMO radar. Method of using a Multi-Input Multi-Output (MIMO) antenna array for high-resolution radar imaging and communication for Advanced Driver Assistance Systems (ADAS) and autonomous driving of vehicle. By encoding the reflected uplink signal with a spread spectrum coding scheme, the bandwidth of the uplink signal is spread and the uplink signal is made more resistant to jamming and noise. 
Beamforming is made possible by transmitting in-phase signals through each antenna in the antenna array which allows the transmittance of the high-energy uni-directional uplink signal towards each target. The drawing is a schematic diagram of a vehicle communication and radar sensing system.", "instruction": "Please summarize the input" }, { "input": "AUTONOMOUS VEHICLE ACTIVE INTERACTION WITH SURROUNDING ENVIRONMENTAn automated vehicle (AV) which automatically interacts with objects in a surrounding environment based on the objects determined intention and predicted actions determined based on their intention. Data is collected from an external environment by cameras, sensors, and optionally other devices on an AV. The data is processed to identify objects and a state for each object, and an interaction scenario is identified. For objects within the interaction scenario, an intention for each object is determined, and the action of the object is predicted. The AV generates a decision to perform an action to communicate the AV's action to one or more objects. Commands are generated to execute the decision, and the intention of the AV is implemented by executing the commands using one or more output mechanisms (horn, turn signal, display, and/or other mechanisms) for the \n\nAV.|1. 
An autonomous vehicle system for automatically interacting with a surrounding environment, the system comprising: \na data processing system comprising one or more processors, a memory, a planning module, and a control module, the data processing system to: \ndetect, from received sensor data, an object in an interaction scenario in an external environment; \nmonitor the object in response to the detecting the interaction scenario; \ndetermine an intention for the object within the external environment based on the monitoring, wherein the intention is determined based on detected object gestures and a detected object state; \ngenerating an object prediction based on the determined object intention and the detected object state; and \ngenerate one or more commands to indicate an intention of the autonomous vehicle in response to the generated prediction of the object. \n | 2. The system of claim 1, the data processing system further to: \npredict an action of the object based on the determined intention of the object; and \ndetermine an action to indicate the intention of the autonomous vehicle to the object, the one or more commands generated to implement the action. \n | 3. The system of claim 1, the data processing system further to: \ndetect the interaction scenario based on the received sensor data; and \nmonitor an activity of the object within the interaction scenario. \n | 4. The system of claim 1, wherein the object includes a pedestrian or a vehicle.\n | 5. The system of claim 1, wherein the intention of the object is determined at least in part based on gestures performed by the object and detected by the data processing system.\n | 6. The system of claim 1, wherein the received sensor data include semantic information to describe the object.\n | 7. The system of claim 1, wherein the received sensor data includes one or more of the following: a vehicle location, a vehicle action, a pedestrian location, and a pedestrian action.\n | 8. 
The system of claim 1, wherein the data processing system is further configured to select the intention of the autonomous vehicle based on the current object state.\n | 9. The system of claim 1, wherein the autonomous vehicle is configured to signal the intention via at least one of the following: a visual indication, an audio indication, and a Vehicle-to-everything (V2X) communication.\n | 10. The system of claim 1, wherein the interaction of the autonomous vehicle is determined at least in part on policies associated with traffic rules.\n | 11. The system of claim 1, wherein the intention of the object is determined at least in part based on gestures performed by the object and detected by the data processing system.\n | 12. A method for automatically interacting with a surrounding environment by an autonomous vehicle, the method comprising: \ndetecting, by a data processing system from received sensor data, an object in an interaction scenario in an external environment; \nmonitoring the object in response to the detecting the interaction scenario; \ndetermine an intention for the object within the external environment based on the monitoring, wherein the intention is determined based on detected object gestures and a detected object state; \ngenerating an object prediction based on the determined object intention and the detected object state; and \ngenerate one or more commands to indicate an intention of the autonomous vehicle in response to the generated prediction of the object. \n | 13. The method of claim 12, the data processing system further to: \npredict an action of the object based on the determined intention of the object; and \ndetermine an action to indicate the intention of the autonomous vehicle to the object, the one or more commands generated to implement the action. \n | 14. 
The method of claim 12, the data processing system further to: \ndetect the interaction scenario based on the received sensor data; and \nmonitor an activity of the object within the interaction scenario. \n | 15. The method of claim 12, wherein the object includes a pedestrian or a vehicle.\n | 16. The method of claim 12, wherein the intention of the object is determined at least in part based on gestures performed by the object and detected by the data processing system.\n | 17. The method of claim 12, wherein the received sensor data include semantic information to describe the object.\n | 18. The method of claim 12, wherein the received sensor data includes one or more of the following: a vehicle location, a vehicle action, a pedestrian location, and a pedestrian action.\n | 20. A non-transitory computer readable storage medium having embodied thereon a program, the program being executable by a processor to perform a method for automatically interacting with a surrounding environment by an autonomous vehicle, the method comprising: \ndetecting, by a data processing system from received sensor data, an object in an interaction scenario in an external environment; \nmonitoring the object in response to the detecting the interaction scenario; \ndetermine an intention for the object within the external environment based on the monitoring, wherein the intention is determined based on detected object gestures and a detected object state; \ngenerating an object prediction based on the determined object intention and the detected object state; and \ngenerate one or more commands to indicate an intention of the autonomous vehicle in response to the generated prediction of the object.", "output": "The autonomous vehicle system comprises a data processing system having multiple processors, a memory, a planning module (412), and a control module (414). 
The data processing system detects an object in an interaction scenario in an external environment (510) from received sensor data. The object is monitored in response to the detecting the interaction scenario. An intention is determined for the object within the external environment based on the monitoring. An object prediction is generated based on the determined object intention and the detected object state. The commands are generated to indicate an intention of the autonomous vehicle in response to the generated prediction of the object. INDEPENDENT CLAIMS are included for the following:a method for automatically interacting with a surrounding environment by an autonomous vehicle; anda non-transitory computer readable storage medium. Autonomous vehicle system for automatically interacting with a surrounding environment. Enhances the safety and efficiency of the automated vehicle at intersections. The drawing shows a block representation of a system for automatically interacting with a surrounding environment by an autonomous vehicle.412Planning module414Control module420Perception Module510External environment530Monitoring Module", "instruction": "Please summarize the input" }, { "input": "Surface Detection Via a Directed Autonomous VehicleA number of illustrative variations may include the steps of providing a first vehicle including at least one sensor, a controller configured to process sensor data, and a vehicle communication system; providing a driving surface having an actual coefficient of friction; determining at least one estimated driving surface coefficient of friction; communicating the at least one estimated driving surface coefficient from the first vehicle to the vehicle communication system; and communicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle directly or indirectly.What is claimed is: \n | 1. 
A method comprising: \nproviding a first vehicle comprising at least one sensor, a controller configured to process sensor data, and a vehicle communication system; \ndriving the first vehicle on a driving surface having an actual coefficient of friction; \ndetermining at least one estimated driving surface coefficient of friction; \ncommunicating the at least one estimated driving surface coefficient from the first vehicle to the vehicle communication system; \ncommunicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle. \n | 2. A method as set forth in claim 1 wherein the first vehicle further comprises a braking system configured to manipulate a brake set, a steering system configured to adjust a roadwheel direction, and a propulsion system configured to deliver driving power to the road wheels; and \nwherein determining at least one estimated driving surface coefficient of friction comprises manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle. \n | 3. A method as set forth in claim 1 wherein determining at least one estimated driving surface coefficient of friction is accomplished via the at least one sensor.\n | 4. A method as set forth in claim 1 wherein the vehicle communication system is a cloud-based vehicle-to-vehicle communication system.\n | 5. A method as set forth in claim 1 wherein the vehicle communication system is a vehicle-to-everything communication system.\n | 6. A method as set forth in claim 1 wherein the first vehicle is an unmanned ground vehicle.\n | 7. A method as set forth in claim 1 wherein the first vehicle is an unmanned aerial vehicle.\n | 8. A method as set forth in claim 1, further comprising using the at least one estimated driving surface coefficient of friction to manipulate at least one of a braking system, a steering system, or a propulsion system of the at least one other vehicle.\n | 9. 
A method comprising: \nproviding an unmanned ground vehicle comprising at least one sensor, a controller configured to process sensor data, and a vehicle communication system; \ndriving the first vehicle on a driving surface having an actual coefficient of friction; \ndetermining at least one estimated driving surface coefficient of friction via the at least one sensor; and \ncommunicating the at least one estimated driving surface coefficient from the unmanned ground vehicle to the vehicle communication system. \n | 10. A method as set forth in claim 9, further comprising communicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle.\n | 11. A method as set forth in claim 10 wherein the at least one other vehicle comprises a braking system configured to manipulate a brake set, a steering system configured to adjust a roadwheel direction, and a propulsion system configured to deliver driving power to the road wheels; and \nfurther comprising using the at least one estimated driving surface coefficient of friction to manipulate at least one of the braking system, steering system, and propulsion system of the other vehicle. \n | 12. A method as set forth in claim 9 wherein the vehicle communication system is a cloud-based vehicle-to-vehicle communication system.\n | 13. A method as set forth in claim 9 wherein the vehicle communication system is a vehicle-to-everything communication system.\n | 14. A method as set forth in claim 9 wherein determining at least one estimated driving surface coefficient of friction via the at least one sensor additionally comprises performing unmanned ground vehicle maneuvers comprising manipulating at least one of vehicle speed, acceleration, direction, or braking.\n | 15. 
A method comprising: \nproviding an unmanned aerial vehicle comprising at least one sensor, a controller configured to process sensor data, and a vehicle communication system; \ndetermining at least one estimated driving surface coefficient of friction of a driving surface via the at least one sensor; and \ncommunicating the at least one estimated driving surface coefficient from the unmanned aerial vehicle to the vehicle communication system. \n | 16. A method as set forth in claim 15, further comprising communicating the at least one estimated driving surface coefficient from the vehicle communication system to at least one other vehicle.\n | 17. A method as set forth in claim 16 wherein the at least one other vehicle comprises a braking system configured to manipulate a brake set, a steering system configured to adjust a roadwheel direction, and a propulsion system configured to deliver driving power to the road wheels; and \nfurther comprising using the at least one estimated driving surface coefficient of friction to manipulate at least one of the braking system, steering system, or propulsion system of the at least one other vehicle. \n | 18. A method as set forth in claim 16 wherein the vehicle communication system is a cloud-based vehicle-to-vehicle communication system.\n | 19. A method as set forth in claim 16 wherein the vehicle communication system is a vehicle-to-everything communication system.\n | 20. A method as set forth in claim 2 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed at the maximum capability of the first vehicle.\n | 21. A method as set forth in claim 2 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without a passenger in the vehicle at a capability of the first vehicle that would otherwise result in injury to a passenger in the vehicle.\n | 22. 
A method as set forth in claim 2 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without cargo in the vehicle at a capability of the first vehicle that would otherwise result in damage to a cargo in the vehicle.\n | 23. A method as set forth in claim 11 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed at the maximum capability of the first vehicle.\n | 24. A method as set forth in claim 11 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without a passenger in the vehicle at a capability of the first vehicle that would otherwise result in injury to a passenger in the vehicle.\n | 25. A method as set forth in claim 11 wherein the manipulating at least one of the braking system, steering system, or propulsion system of the first vehicle is performed without cargo in the vehicle at a capability of the first vehicle that would otherwise result in damage to a cargo in the vehicle.", "output": "The method involves providing a vehicle (14) with a sensor (16), where a control unit (18) is set up to process sensor data (20). The vehicle communication system (22) is provided, where the vehicle is driven (24) on a roadway (26) with an actual coefficient of friction (28). The estimated road surface friction coefficient (32) is determined (30), where the estimated road surface coefficient is communicated (34) from the vehicle to the vehicle communication system. The estimated road surface coefficient is communicated (36) from the vehicle communication system to another vehicle (38). The vehicle is provided with a braking system (40) to actuate a set of brakes (42). Method for performing surface detection by a guided autonomous vehicle. 
The control unit is set up to process sensor data while the vehicle is driven on a roadway with an actual coefficient of friction, which prevents unintended imbalances in the driving force transferred from each wheel to the vehicle and enables effective surface detection by a guided autonomous vehicle. The drawing shows a flowchart of a method for performing surface detection by a guided autonomous vehicle. 12Providing a vehicle with a sensor14,38Vehicles16Sensor18Control unit20Sensor data22Vehicle communication system24Driving a vehicle on a roadway with an actual coefficient of friction26Roadway28Actual coefficient of friction30Determining an estimated road surface friction coefficient32Estimated road surface friction coefficient34Communicating an estimated road surface coefficient from a vehicle to a vehicle communication system36Communicating a road surface coefficient from a vehicle communication system to another vehicle40Braking system42Brakes
A safety method, performed by a control device for a vehicle assembled from a set of modules, the vehicle comprising one or more of at least two modules, including: \nat least one drive module; and \nat least one functional module; \n wherein the control device is comprised in any one or more of the at least two modules and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated; \nthe method comprising: \ndetecting, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle; \ntransmitting, by a transmitter, information about the detected emergency situation to a control center; and \ncontrolling the module associated with the emergency situation to physically disconnect from the assembled vehicle. \n | 2. The method according to claim 1, further comprising: \ncontrolling the disconnected module to move away from the at least one remaining module of the assembled vehicle and/or controlling the at least one remaining module of the assembled vehicle to move away from the disconnected module. \n | 3. The method according to claim 1, wherein, after transmitting information about the detected emergency situation to a control center, and before controlling the module associated with the emergency situation, physically disconnecting the module from the assembled vehicle; \nthe method further comprising: \nreceiving, from the control center, a command to physically disconnect the at least one drive module from the assembled vehicle. \n | 4. The method according to claim 1, wherein, before controlling the module associated with the emergency situation to physically disconnect from the assembled vehicle, the method further comprises: \nactivating an alarm informing about the emergency situation. \n | 5. 
The method according to claim 1, further comprising: \nidentifying a safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment; and \ncontrolling the assembled vehicle to move to the identified safe space prior to physically disconnecting the module. \n | 6. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is performed by means of a second sensor element comprising a radar, a lidar or a camera.\n | 7. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on information from the control center via 4G, 5G, V2I, Wi-Fi or any other wireless communication means.\n | 8. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on a type of the at least one functional module.\n | 9. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on a type of load in the at least one functional module.\n | 10. The method according to claim 5, wherein identifying the safe space where the emergency situation in the assembled vehicle will have a reduced impact on the environment is based on a type of emergency situation in the assembled vehicle.\n | 11. The method according to claim 1, wherein controlling the module associated with the emergency situation to physically disconnect from the assembled vehicle also comprises controlling the module to electrically disconnect from the assembled vehicle.\n | 12. 
The method according to claim 1, wherein the assembled vehicle comprises two drive modules and the at least one functional module, and wherein one of the drive modules is configured to operate as a master and the other drive module is configured to operate as a slave; \nthe method further comprises, when an emergency situation is detected in the master drive module: \ncontrolling the drive module configured to operate as a slave to operate as master. \n | 13. The method according to claim 12, further comprising, when an emergency situation is detected in the at least one functional module: \ncontrolling both drive modules to physically disconnect from the assembled vehicle. \n | 14. The method according to claim 1, further comprising detecting an emergency situation by means of the first sensor element including a temperature sensor, a pressure sensor, a smoke sensor, a particle sensor, a gas sensor and/or a camera arranged on the assembled vehicle.\n | 15. A computer memory storing program instructions which, when the program instructions are executed by a computer, causes the computer to carry out a method performed by the computer for a vehicle assembled from a set of modules, wherein the vehicle comprises one or more of at least two modules including: at least one drive module and at least one functional module, wherein the computer is comprised in any one or more of the at least two modules and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated, wherein the method comprises: \ndetecting, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle; \ntransmitting, by a transmitter, information about the detected emergency situation to a control center; and \ncontrolling the module associated with the emergency situation to physically disconnect from the assembled vehicle. \n | 16. 
A control device of a vehicle assembled from a set of modules, the vehicle comprising one or more of at least two modules, including: \nat least one drive module; and \nat least one functional module; \n wherein the control device is comprised in any one or more of the at least two modules, and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated; \nthe control device being configured to: \ndetect, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle; \ntransmit, by a transmitter, information about the detected emergency situation to a control center; and \ncontrol the module associated with the detected emergency situation to physically disconnect from the assembled vehicle. \n | 17. A vehicle assembled from a set of modules, wherein the vehicle comprises at least one control device, wherein the set of modules comprises one or more of at least two modules including: at least one drive module, and at least one functional module; and wherein the control device is comprised in any one or more of the at least two modules; and wherein the at least one drive module comprises a pair of wheels and is configured to be autonomously operated; and the control device being configured to: \ndetect, by a first sensor element, an emergency situation in any one or more of the at least two modules of the assembled vehicle; \ntransmit, by a transmitter, information about the detected emergency situation to a control center; and \ncontrol the module associated with the detected emergency situation to physically disconnect from the assembled vehicle.", "output": "The method involves detecting (s101) an emergency situation in any of two modules of an assembled vehicle. The information about the detected emergency situation is transmitted (s102) to a control centre. 
The module associated with the emergency situation is controlled (s103) to physically disconnect from the assembled vehicle. The disconnected module is controlled to move away from the remaining module of the assembled vehicle, and/or the remaining module of the assembled vehicle is controlled to move away from the disconnected module. A command is received to physically disconnect the drive module from the assembled vehicle. INDEPENDENT CLAIMS are included for the following: a computer program for a vehicle assembled from a set of modules; a control device of a vehicle assembled from a set of modules; and a vehicle assembled from a set of modules. Safety method for a vehicle such as a bus or truck assembled from a set of modules. The assembled vehicle is quickly and easily disassembled without manual work. The safe distance to the surrounding objects is maintained, and accidents are avoided. The assembled vehicle is controlled to move to the identified safe space prior to physically disconnecting the module. The drawing shows a flowchart illustrating the safety process. s101Step for detecting emergency situation in any of two modules of assembled vehicles102Step for transmitting information about detected emergency situation to control centres103Step for controlling module associated with the emergency situation to physically disconnect from assembled vehicle
Also provided is a processing device (10) adapted to receiving said friction signal (8) and to generating a slipperiness information signal (12) comprising said friction value. The friction monitoring system (2) comprises also a first communication device (14) situated in the first vehicle (4) and adapted to receiving said slipperiness information signal (12) and to transmitting a processed slipperiness information signal (15) wirelessly in a format such that one or more other vehicles (16) can receive the processed signal (15), process it and, where necessary, activate at least one skid protection system (17) in said other vehicle on the basis of the information contained in the processed signal (15), said slipperiness information signal (12) being arranged to be passed on and, where necessary, to activate at least one skid protection system (22) of the first vehicle (4) in accordance with a set of dynamic activation rules.|1. A friction monitoring system (2) for vehicles (4, 16) which comprises a slipperiness detection device (6) suited to making measurements of at least one parameter related to slipperiness of a roadway close to a first vehicle (4), to determining at least one friction value on the basis of the measurement and to generating a friction signal (8) comprising said friction value determined, a processing device (10) adapted to receiving said friction signal (8) and to generating a slipperiness information signal (12) comprising said friction value, a first communication device (14) situated in said first vehicle (4) and adapted to receiving said slipperiness information signal (12) and to transmitting a processed slipperiness information signal (15) wirelessly in a format such that one or more other vehicles (16) can receive the processed signal (15), process it and, where necessary, activate at least one skid protection system (17) in said other vehicle on the basis of the information in the processed signal received (15), said slipperiness 
information signal (12) is arranged to be passed on and, where necessary, to activate at least one skid protection system (22) of the first vehicle (4) in accordance with a set of dynamic activation rules characterised in that said set of dynamic activation rules comprises parameters related to nearby vehicles.\n | 2. The friction monitoring system (2) according to claim 1, in which said nearby vehicles are part of a vehicle train.\n | 3. The friction monitoring system (2) according to claim 2, in which said parameters comprise the length of the vehicle train.\n | 4. The friction monitoring system (2) according to any one of claims 1-3, which comprises a location determination device (19) adapted to determining the location of the first vehicle, to determining a location value on the basis of the location determined and to generating a location signal (20) on the basis of said location value determined, said processing device (10) being adapted to receiving said location signal (20) and to generating said slipperiness information signal (12) comprising coordinated friction values and location values.\n | 5. The friction monitoring system (2) according to claim 4, in which the processing device (10) is adapted to relating each friction value to a location value so that a specific friction value unambiguously indicates how slippery the roadway is at a given location.\n | 6. The friction monitoring system (2) according to any one of claims 1-5, in which at least one of said first and second vehicles (4,16) is part of a vehicle train.\n | 7. The friction monitoring system (2) according to any one of claims 1-6, in which at least one of said first and second vehicles (4,16) is an autonomous vehicle in a vehicle train.\n | 8. The friction monitoring system (2) according to any one of the foregoing claims, in which said format for the processed slipperiness information signal (15) is suited to vehicle-to-vehicle transmission.\n | 9. 
The friction monitoring system (2) according to any one of the foregoing claims, in which said format for the processed slipperiness information signal (15) is suited to vehicle-to-infrastructure transmission.\n | 10. A method for a skid protection system for vehicles, which method comprises \n * - making measurements of at least one parameter related to slipperiness of a roadway close to a first vehicle,\n * - determining at least one friction value on the basis of the measurement,\n * - generating a friction signal comprising said friction value determined,\n * - receiving said friction signal in a processing device and generating a slipperiness information signal comprising said friction value,\n * - receiving said slipperiness information signal in a communication device in said first vehicle,\n * - sending a processed slipperiness information signal out wirelessly in a format such that one or more other vehicles can receive the signal,\n * - processing the processed slipperiness information signal received and, where necessary, activating at least one skid protection system in said other vehicle on the basis of the information in the slipperiness information signal received, and\n * - acting upon and, where necessary, activating at least one skid protection system of the first vehicle in accordance with a set of dynamic activation rules and characterised in that said set of dynamic activation rules comprises parameters related to nearby vehicles.\n | 11. The method according to claim 10, in which said nearby vehicles are part of a vehicle train.\n | 12. The method according to claim 11, in which said parameters comprise the length of the vehicle train.\n | 13. 
The method according to any one of claims 10-12, which comprises \n * - determining the location of the first vehicle in a location measuring device,\n * - determining a location value on the basis of the location determined,\n * - generating a location signal on the basis of said location value,\n * - receiving said location signal in said processing device,\n * - generating said slipperiness information signal comprising coordinated friction values and location values.\n | 14. The method according to claim 13, in which the processing device is adapted to relating each friction value to a location value so that a specific friction value unambiguously indicates how slippery the roadway is at a given location.\n | 15. The method according to any one of claims 10-14, in which at least one of said first and second vehicles is part of a vehicle train.\n | 16. The method according to any one of claims 10-15, in which at least one of said first and second vehicles is an autonomous vehicle in a vehicle train.\n | 17. The method according to any one of claims 10-16, in which said format for the processed slipperiness information signal is suited to vehicle-to-vehicle transmission.\n | 18. The method according to any one of claims 10-17, in which said format for the processed slipperiness information signal is suited to vehicle-to-infrastructure transmission.\n | 19. A computer programme (P) for vehicles, which programme (P) comprises programme code for causing a processing device (10; 500) or another computer (500) connected to the processing device (10; 500) to perform steps of the method according to any one of claims 10-18.\n | 20. 
A computer programme product comprising a programme code stored on a computer-readable medium for performing method steps according to any one of claims 10-18 when said programme code is run on a processing device (10; 500) or another computer (500) connected to the processing device (10; 500).", "output": "The friction monitoring system has a communication device (14) in a host vehicle (4) to receive a slipperiness information signal (12) and transmit wirelessly a processed slipperiness information signal (15) to other vehicles (16), activating the skid protection system (17) of the other vehicle. The slipperiness information signal is also used to activate the skid protection system (22) of the host vehicle in accordance with a set of dynamic activation rules. INDEPENDENT CLAIMS are included for the following: a method for a skid protection system for vehicles; a computer program for vehicles; and a computer program product. Friction monitoring system for vehicles, such as autonomous vehicles. Slipperiness information can be disseminated to other vehicles included in a vehicle train, improving traffic safety as well as safety of the vehicle. The drawing shows the block diagram of a friction monitoring system. 4Host vehicle12Slipperiness information signal14Communication device15Processed slipperiness information signal16Other vehicles17Skid protection system of other vehicles22Skid protection system of host vehicle", "instruction": "Please summarize the input" }, { "input": "Positioning quality filter for the V2X technologiesThis provides methods and systems for V2X applications, such as forward collision warning, electronic emergency brake light, left turn assist, work zone warning, signal phase timing, and others, mainly relying on a GNSS positioning solution transmitted via the Dedicated Short-Range Communications (DSRC) to/from the roadside units and onboard units in other V2X-enabled vehicles. 
However, the positioning solution from a GNSS may be deteriorated by noise and/or bias due to various error sources, e.g., time delay, atmospheric effect, ephemeris effect, and multipath effect. This offers a novel quality filter that can detect noise and the onset of drift in GNSS signals by evaluating up to four metrics that compare the qualities of kinematic variables, speed, heading angle change, curvature, and lateral displacement, obtained directly or derived from GNSS and onboard vehicle sensors. This is used for autonomous cars and vehicle safety, with various examples/variations.The invention claimed is: \n | 1. A method for positioning quality filter for a global navigation system for a vehicle, said method comprising: \na central computer receiving global positioning system location data; \nsaid central computer receiving sensors data from vehicle sensors; \nsaid sensors data from said vehicle sensors comprises data from a vehicle speed sensor, a vehicle direction sensor, and a vehicle yaw rate sensor on said vehicle; \nsaid central computer calculating a first metric value based on said sensors data from said vehicle sensors, based on said vehicle speed, said vehicle direction, and said vehicle yaw rate; \na processor receiving a first threshold; \nsaid processor comparing said first metric value with said first threshold; \nafter determining that said first metric value is larger than or equal to said first threshold, said processor receiving a second threshold; \na) said central computer calculating a second metric value; \nb) said processor comparing said second metric value with said second threshold; \nc) after determining that said second metric value is smaller than said second threshold, said processor receiving a third threshold; \na. said central computer calculating a third metric value; \nb. said processor comparing said third metric value with said third threshold; \nc. 
after determining that said third metric value is larger than or equal to said third threshold, said processor receiving a fourth threshold; \na. said central computer calculating a fourth metric value; \nb. said processor comparing said fourth metric value with said fourth threshold; \nc. after determining that said fourth metric value is smaller than said fourth threshold, said processor setting said global navigation system value as valid; \nsaid central computer validating said global positioning system location data using said global navigation system value, for safety, operation, or navigation of said vehicle;\nsaid central computer sending a notice to a vehicle warning device; \nsaid central computer correcting a navigation of said vehicle; said central computer adjusting direction of said vehicle. \n | 2. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, wherein said first threshold is not greater than 1.\n | 3. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning an operator.\n | 4. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning a driver.\n | 5. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning headquarters.\n | 6. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning a central server.\n | 7. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning another driver.\n | 8. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: warning another car.\n | 9. 
The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with pedestrians.\n | 10. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with cloud.\n | 11. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with server farms.\n | 12. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with police.\n | 13. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: communicating with a grid, a secured network, or outside car sensors.\n | 14. The method for positioning quality filter for a global navigation system for a vehicle, as recited in claim 1, said method comprises: resolving conflict between sensors and/or received data.", "output": "The positioning method involves use of processor for comparing primary metric value with primary threshold. The processor compares fourth metric value with fourth threshold. The processor sets the global navigation system value as valid in case fourth metric value is smaller than fourth threshold. The processor sets the global navigation system value as invalid when fourth metric value is larger than or equal to fourth threshold. A central computer validates global positioning system location data using global navigation system value, for safety, operation, or navigation of vehicle. An INDEPENDENT CLAIM is also included for a method for positioning quality filter for a positioning system for an automated or autonomous vehicle. Positioning method for quality filter of global navigation system for vehicle e.g. automated or autonomous vehicle. 
A weighted-averaging process, based on the redundancies between the coverage of different units, averages the data for more accurate results, with more weight for the more reliable units or sources, or higher weight for the results that are closer to the center of the curve representing the distribution of values, eliminating or reducing the fringe results or erroneous data. Such estimates and statistics on patterns or behaviors of people are very valuable for marketing and sales people who want to predict and plan ahead. The drawing shows a representation of development of fully automated vehicles, in stages.", "instruction": "Please summarize the input" }, { "input": "Methods and systems for V2X congestion control using directional antennas, and determining OBU transmission power based on the weather data received from vehicle CANSelf-driving and autonomous vehicles are very popular these days for scientific, technological, social, and economic reasons. In one aspect of this technology, one of the main concerns for an implementation of any V2X technology on a large scale is the issue of congestion control. In large cities and on crowded highways during rush hours, each host vehicle can get messages from over 200 other vehicles and several roadside units, all working on the same channel and trying to send and receive messages at the same time. With respect to the weather effect on the signal, signal path loss occurs whenever there is moderate (or moderate plus) rain, and because of that, the OBU communication packets are prone to get lost, or the communication coverage region gets diminished, depending upon the intensity, speed, angle and temperature of the rainfall/snowfall droplets. We have provided the solutions for these two problems, with variations.The invention claimed is: \n | 1. 
A method for an autonomous or automated vehicle operation, said method comprising: \na central computer receiving a vehicle state for said vehicle; \na communication channel for said vehicle transmitting in an omni-directional radiation pattern; \nsaid communication channel for said vehicle receiving messages from all directions; \nin case said vehicle state indicating a congestion on said communication channel for said vehicle, said communication channel for said vehicle changing to transmission in a specific directional pattern, and said communication channel for said vehicle continuing receiving messages from all directions. \n | 2. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nin case channel congestion is detected, said communication channel for said vehicle changing to transmission in a directional pattern toward front and back of said vehicle, and said communication channel for said vehicle continuing receiving messages from all directions. \n | 3. The method for an autonomous or automated vehicle operation, as recited in claim 2, said method comprises: \nsaid communication channel for said vehicle switching between transmitting in an omni-directional radiation pattern and transmitting in a directional pattern toward said front and back of said vehicle, based on channel congestion. \n | 4. The method for an autonomous or automated vehicle operation, as recited in claim 2, said method comprises: \nsaid communication channel for said vehicle switching between transmitting in an omni-directional radiation pattern and transmitting in a directional pattern toward said front and back of said vehicle, based on driving speed. \n | 5. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nmonitoring said state for said vehicle, or \nmonitoring said congestion on said communication channel for said vehicle. \n | 6. 
The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nsaid communication channel for said vehicle switching to transmitting in an omni-directional radiation pattern. \n | 7. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nmonitoring a threshold value for said congestion on said communication channel for said vehicle. \n | 8. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nmonitoring a threshold value based on number of messages, or rate of messages for said congestion on said communication channel for said vehicle, or \nmonitoring a threshold value based on number of cars for said congestion on said communication channel for said vehicle. \n | 9. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nmonitoring a threshold value based on bandwidth capacity of said communication channel for said congestion on said communication channel for said vehicle. \n | 10. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing multiple of directional transmission schemes or patterns. \n | 11. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing map of a road for switching on transmission methods. \n | 12. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing map and elevation data of a road for optimization of switching on transmission methods. \n | 13. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing a combination of directional transmission and non-directional transmission simultaneously. \n | 14. 
The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing 2 of directional transmission schemes or patterns. \n | 15. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing antenna arrays, a group of antennas, Cassegrain antenna, or parabolic antenna. \n | 16. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing map and intersections data as potential danger points for a road for optimization of switching on transmission methods. \n | 17. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nusing a transition mode of transmission between and for optimization of switching on transmission methods. \n | 18. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \nindependently optimizing transmission and listening modes or methods. \n | 19. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \ncombining optimization of transmission and listening modes or methods. \n | 20. The method for an autonomous or automated vehicle operation, as recited in claim 1, said method comprises: \noptimizing of transmission based on multiple thresholds, conditions, or triggers.", "output": "The method involves a central computer receiving a vehicle state for the vehicle. A communication channel for the vehicle transmits in an omni-directional radiation pattern. The communication channel for the vehicle receives messages from all directions. The communication channel for the vehicle changes to transmission in a specific directional pattern, and continues receiving messages from all directions, in case the vehicle state indicates congestion on the communication channel for the vehicle. 
Method for operation of autonomous or automated vehicle such as car, sedan, truck, bus, pickup truck, sport utility vehicle (SUV), tractor, agricultural machinery, entertainment vehicles, motorcycle, bike, bicycle, and hybrid. The overall number of messages each target gets is reduced and the channel congestion is avoided. The drawing shows a block diagram for a system with monitoring congestion with multiple modes of transmission.", "instruction": "Please summarize the input" }, { "input": "Automatic driving method and systemThe embodiment of the invention claims an automatic driving system, comprising: obtaining the image data of the peripheral vehicle of the automatic driving vehicle by an image processing unit, and processing the image data to obtain the first driving state information of the peripheral vehicle; communicating with the peripheral vehicle in the V2X communication distance by the V2X communication unit to obtain the second driving state information of the peripheral vehicle; through the decision unit, according to the first driving state information of the peripheral vehicle, or the first driving state information and the second driving state information, determining the driving action of the automatic driving vehicle, The invention solves the problem that the traffic safety is influenced by the wrong judgement of the surrounding vehicle caused by the special condition of the automatic driving vehicle in the related technology, and improves the automatic driving safety.|1. 
An automatic driving system, wherein it is set on the automatic driving vehicle, comprising: an image processing unit for obtaining the image data of the peripheral vehicle of the automatic driving vehicle and processing the image data to obtain the first driving state information of the peripheral vehicle; a vehicle network V2X communication unit for obtaining the second driving state information of the peripheral vehicle by communicating with the peripheral vehicle in the V2X communication distance; a decision unit for determining the driving action of the automatic driving vehicle according to the first driving state information of the peripheral vehicle or the first driving state information and the second driving state information.\n | 2. The system according to claim 1, wherein the first driving state information of the peripheral vehicle comprises at least one of the following: vehicle profile information; vehicle orientation information.\n | 3. The system according to claim 1, wherein the second driving state information of the peripheral vehicle comprises at least one of: vehicle identification information; vehicle latitude and longitude information; vehicle driving speed information; vehicle driving direction information.\n | 4. The system according to claim 1, wherein the driving action of the automatic driving vehicle comprises at least one of the following: running at uniform speed; accelerating to drive; reducing speed to drive; lane change.\n | 5. The system according to claim 1, wherein the decision unit further comprises: a distance outer decision sub-unit for calculating the driving speed and driving direction of the peripheral vehicle according to the first driving state information outside the V2X communication distance, and determining the driving action of the automatic driving vehicle according to the driving speed and driving direction of the peripheral vehicle.\n | 6. 
The system according to claim 1, wherein the decision unit further comprises: a distance inner decision sub-unit, for in the V2X communication distance, judging whether the first driving state information and the second driving state information are the information of the same peripheral vehicle, if so, then according to the first driving state information and the second driving state information, determining the driving speed and driving direction of the peripheral vehicle, if not, then according to the second driving state information, determining the driving speed and driving direction of the peripheral vehicle, and determining the driving action of the automatic driving vehicle according to the driving speed and the driving direction of the peripheral vehicle.\n | 7. An automatic driving method, wherein it is used for automatic driving vehicle, comprising: obtaining the image data of the peripheral vehicle of the automatic driving vehicle, and processing the image data to obtain the first driving state information of the peripheral vehicle; obtaining second driving state information of the peripheral vehicle by communicating with the peripheral vehicle within a V2X communication distance; determining the driving action of the automatic driving vehicle according to the first driving state information of the peripheral vehicle, or the first driving state information and the second driving state information.\n | 8. The method according to claim 7, wherein the step of determining the driving action of the automatic driving vehicle according to the first driving state information comprises: out of the V2X communication distance, calculating the driving speed and driving direction of the peripheral vehicle according to the first driving state information; and determining the driving action of the automatic driving vehicle according to the driving speed and the driving direction of the peripheral vehicle.\n | 9. 
The method according to claim 7, wherein the step of determining the driving action of the automatic driving vehicle according to the first driving state information and the second driving state information comprises: in the V2X communication distance, judging whether the first driving state information and the second driving state information are the information of the same peripheral vehicle, if so, determining the driving speed and driving direction of the peripheral vehicle according to the first driving state information and the second driving state information, if not, determining the driving speed and driving direction of the peripheral vehicle according to the second driving state information; and determining the driving action of the automatic driving vehicle according to the driving speed and the driving direction of the peripheral vehicle.\n | 10. A computer readable storage medium, wherein the computer readable storage medium is stored with a computer program, wherein the computer program is executed by a processor to realize the method according to any one of claims.\n | 11. An electronic device, comprising a memory, a processor and a computer program stored on the memory and capable of being operated on the processor, wherein the method according to any one of claims is realized when the computer program is executed by the processor.", "output": "The system (20) has an image processing unit (210) that is provided for obtaining the image data of the peripheral vehicle of the automatic driving vehicle and processes the image data to obtain the first driving state information of the peripheral vehicle. A vehicle network vehicle to vehicle (V2V) communication unit (220) is provided for obtaining the second driving state information of the peripheral vehicle by communicating with the peripheral vehicle in the communication distance. 
A decision unit (230) is provided for determining the driving action of the automatic driving vehicle according to the first driving state information of the peripheral vehicle or the first driving state information and the second driving state information. INDEPENDENT CLAIMS are included for the following: an automatic driving method applied to an automatic driving vehicle; a computer readable storage medium storing a program for providing the automatic driving of the vehicle; and an electronic device. Automatic driving system for automatic driving vehicle. The system solves the defect that the peripheral vehicle state information in the existing technology is not accurate, and the different system modules are used at different vehicle distances to improve the precision of the obtained peripheral vehicle state information, so as to make the automatic driving vehicle perform a better reaction action and achieve a safer driving target. The drawing shows a block diagram of the automatic driving system. (Drawing includes non-English language text) 20Automatic driving system210Image processing unit220Vehicle to vehicle communication unit230Decision unit", "instruction": "Please summarize the input" }, { "input": "WAFER TRANSFERRING AUTOMATION SYSTEM AND OPERATING METHOD THEREOFThe wafer transport automation system according to the present invention loads at least one airtight container including a plurality of Front Opening Unified Pods (FOUPs) and performs vehicle-to-vehicle communication. 
An autonomous robot transfers or loads the at least one airtight container and moves the loaded airtight container so as to closely dock with a partition wall dividing a clean room and a non-clean room. An interface device transfers each of a plurality of FOUPs between the at least one airtight container loaded on the autonomous robot and a conveyor in the clean room, and a transfer automation server performs wireless communication with the autonomous vehicles, the autonomous robot, and the interface device to perform wafer transfer automation.|1. Autonomous vehicles that load at least one airtight container including a plurality of Front Opening Unified Pods (FOUPs) and perform vehicle-to-vehicle communication;\nan autonomous robot that transfers or loads the at least one airtight container from each of the autonomous vehicles and moves the loaded airtight container so that the airtight container is closely docked with a bulkhead dividing a clean room and a non-clean room; \nan interface device for transferring each of a plurality of FOUPs between the at least one airtight container loaded in the autonomous robot and the conveyor of the clean room; and a transfer automation server that performs wireless communication with the self-driving vehicles, the self-driving robot, and the interface device to perform wafer transfer automation. \n | 2. The wafer transport automation system according to claim 1, wherein the at least one sealing container includes a sealing unit performing an internal/external blocking function.\n | 3. The wafer transport automation system of claim 1, wherein the at least one sealed container includes at least one damped vibration unit performing a low/no vibration function.\n | 4. The wafer transport automation system according to claim 1, wherein the at least one airtight container inputs and outputs nitrogen gas to maintain constant temperature/humidity/low moisture.\n | 5. 
The wafer transport automation system according to claim 1, wherein the at least one sealed container includes a Radio Frequency Identification (RFID) tag for computerized lot management.\n | 6. The wafer transport automation system according to claim 1, wherein each of the self-driving vehicles performs a function of automatically opening and closing a loading box door for inputting/dispensing of the airtight container, a function of maintaining a constant temperature/humidity inside the vehicle body, or a low-vibration/no-vibration maintenance function while driving.\n | 7. The wafer transport automation system according to claim 1, wherein each of the autonomous vehicles performs parallel input output (PIO) communication with an autonomous robot or wireless communication with the transport automation server.\n | 8. The wafer transport automation system according to claim 1, wherein the self-driving robot supplies nitrogen gas to the at least one airtight container, performs PIO (Parallel Input Output) communication with each of the self-driving vehicles, or performs wireless communication with the transfer automation server.\n | 9. The wafer transport automation system according to claim 1, wherein the interface device opens and closes a door of the bulkhead, opens and closes a door of the at least one airtight container, performs PIO (Parallel Input Output) communication with the autonomous robot, or performs wireless communication with the transfer automation server.\n | 10. 
A method of operating a wafer transport automation system, comprising: moving an airtight container including a plurality of Front Opening Unified Pods (FOUPs) from a first factory using an autonomous vehicle;\ntransferring the airtight container of the self-driving vehicle to the self-driving robot; \nclosely docking the airtight container to a partition wall between a clean room and a non-clean room of a second factory; \nopening the airtight container by an interface device and dispensing each of the plurality of FOUPs to a conveyor of the clean room; and transferring the ejected FOUP to the conveyor.", "output": "The system has an airtight container (130) including multiple front opening unified pods (FOUPs) for performing vehicle-to-vehicle communication. An autonomous robot (200) transfers or loads the container from each of autonomous vehicles (100) and moves the loaded container such that the container is closely docked with a bulkhead dividing a clean room and a non-clean room. An interface device (300) transfers each of the FOUPs between the container loaded in the autonomous robot and a conveyor of the clean room. A transfer automation server (400) performs wireless communication with the autonomous vehicles. An INDEPENDENT CLAIM is also included for a method for operating an automation wafer transfer system for use during semiconductor manufacturing process. Automation wafer transfer system for use during semiconductor manufacturing process. The system achieves complete automation of wafer transport by transferring an airtight container having wafer between factories through an autonomous vehicle between factories, reduces the transport waiting time by eliminating manual work outside logistics, and minimizes the safety hazards to workers. 
The drawing shows a schematic view of an automation wafer transfer system for use during semiconductor manufacturing process. (Drawing includes non-English language text).100Autonomous vehicle130Airtight container200Autonomous robot300Interface device400Transfer automation server", "instruction": "Please summarize the input" }, { "input": "Electronic device for supporting wireless mobile communication for vehicle and operation method of the sameProvided are an electronic device for supporting vehicle-to-everything (V2X) communication on which an autonomous driving vehicle technology, a cooperative-intelligent transport systems (C-ITS) technology, etc. are based and an operation method of the electronic device. The electronic device mounted to a vehicle to support wireless mobile communication for the vehicle includes: a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology; a cellular V2X (C-V2X) module configured to perform wireless communication by using C-V2X technology; an antenna; and a processor configured to control a switch to connect one of the DSRC module and the C-V2X module to the antenna.What is claimed is: \n | 1. 
An electronic device mounted to a vehicle to support wireless mobile communication for the vehicle, the electronic device comprising: \na dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology; \na cellular vehicle-to-everything (C-V2X) module configured to perform wireless communication by using C-V2X technology; \nan antenna; \na switch; and \na processor configured to control the switch to connect the DSRC module or the C-V2X module to the antenna, \nwherein the antenna includes a pair of sub-antennas for diversity transmission and diversity reception, and \nwherein the switch is configured to receive two output signals from the DSRC module for the pair of sub-antennas or two output signals from the C-V2X module for the pair of sub-antennas and output two signals to the pair of sub-antennas. \n | 2. The electronic device of claim 1, further comprising a telematics control unit (TCU).\n | 3. The electronic device of claim 1, wherein each of the DSRC module and the C-V2X module includes a V2X modem and a radio frequency (RF) transceiver.\n | 4. The electronic device of claim 1, wherein the processor is further configured to: \nselect a module from the DSRC module and the C-V2X module based on location information of the vehicle; and \ncontrol the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna. \n | 5. The electronic device of claim 4, wherein the location information of the vehicle includes a global positioning system (GPS) signal for the vehicle, \nwherein the module selected from the DSRC module and the C-V2X module is selected based on information about a V2X communication technology corresponding to a location of the vehicle and the location information of the vehicle, and \nwherein the module selected from the DSRC module and the C-V2X module supports a V2X communication technology corresponding to a current location of the vehicle. \n | 6. 
The electronic device of claim 1, wherein the processor is further configured to: \nselect a module from the DSRC module and the C-V2X module based on information about a base station that is performing cellular communication with the electronic device; and \ncontrol the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna. \n | 7. The electronic device of claim 6, wherein the module selected from the DSRC module and the C-V2X module is selected based on information about a V2X communication technology corresponding to the base station and the information about the base station, and \nwherein the module selected from the DSRC module and the C-V2X module supports the V2X communication technology corresponding to the base station. \n | 8. The electronic device of claim 1, wherein the processor is further configured to: \nselect a module from the DSRC module and the C-V2X module by periodically comparing a DSRC signal received via the DSRC module with a C-V2X signal received via the C-V2X module; and \ncontrol the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna. \n | 9. The electronic device of claim 8, wherein the module selected from the DSRC module and the C-V2X module is selected based on comparing packet error rate (PER), packet reception rate (PRR), latency, and/or strength of the DSRC signal and the C-V2X signal to each other.\n | 10. 
The electronic device of claim 9, wherein the processor is further configured to: \ncontrol the switch to perform diversity communication by using one of the DSRC module and the C-V2X module and the pair of sub-antennas included in the antenna; \ncontrol the switch to receive the DSRC signal via a first sub-antenna of the pair of sub-antennas and the C-V2X signal via a second sub-antenna at preset time periods; and \ndetermine whether to change V2X communication technology based on a result of the comparing the DSRC signal with the C-V2X signal. \n | 11. The electronic device of claim 1, wherein the processor is further configured to: \nobtain surrounding environment information and select a module from the DSRC module and the C-V2X module based on the obtained surrounding environment information; and \ncontrol the switch to connect the module selected from the DSRC module and the C-V2X module to the antenna. \n | 12. The electronic device of claim 1, wherein the processor is further configured to: \nobtain a captured image of the vehicle's surrounding environment as surrounding environment information; and \nidentify, in the captured image, an entity supporting vehicle-to-infrastructure (V2I) communication with the vehicle and select a module from the DSRC module and the C-V2X module as a module supporting a V2X communication technology corresponding to the identified entity. \n | 13. 
An operation method of an electronic device mounted to a vehicle to support wireless mobile communication for the vehicle, the operation method comprising: \nselecting a module from a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology and a cellular vehicle-to-everything (C-V2X) module configured to perform wireless communication by using C-V2X technology; \ncontrolling a switch to connect the selected module to an antenna; and \nperforming V2X communication via the selected module, \nwherein the antenna includes a pair of sub-antennas for diversity transmission and diversity reception, and \nwherein the method further comprises: \nreceiving two output signals from the DSRC module for the pair of sub-antennas or two output signals from the C-V2X module for the pair of sub-antennas; and \noutputting two signals to the pair of sub-antennas. \n | 14. The operation method of claim 13, wherein location information of the vehicle includes a global positioning system (GPS) signal for the vehicle, \nwherein the selecting of the module further comprises: \nobtaining the location information of the vehicle; and \nselecting the module from the DSRC module and the C-V2X module based on information about a V2X communication technology corresponding to a location of the vehicle and the location information of the vehicle, and \nwhere the module selected from the DSRC module and the C-V2X module supports a V2X communication technology corresponding to a current location of the vehicle. \n | 15. 
The operation method of claim 13, wherein the selecting of the module further comprises: \nobtaining information about a base station that is performing cellular communication with the electronic device; and \nselecting the module from the DSRC module and the C-V2X module based on information about a V2X communication technology corresponding to the base station and the information about the base station, \nwherein the module selected from the DSRC module and the C-V2X module supports the V2X communication technology corresponding to the base station. \n | 16. The operation method of claim 13, wherein the module selected from the DSRC module and the C-V2X module is selected based on periodically comparing a DSRC signal received via the DSRC module with a C-V2X signal received via the C-V2X module.\n | 17. The operation method of claim 13, wherein the selecting of the module further comprises: \nobtaining surrounding environment information of the vehicle; and \nselecting the module from the DSRC module and the C-V2X module based on the obtained surrounding environment information. \n | 18. 
A non-transitory computer-readable recording medium having stored therein a program for performing an operation method of an electronic device mounted to a vehicle to support wireless mobile communication for the vehicle, the operation method comprising: \nselecting a module from a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology and a cellular vehicle-to-everything (C-V2X) module configured to perform wireless communication by using C-V2X technology; \ncontrolling a switch to connect the selected module to an antenna; and \nperforming V2X communication via the selected module, \nwherein the antenna includes a pair of sub-antennas for diversity transmission and diversity reception, and \nwherein the operation method further comprises: \nreceiving two output signals from the DSRC module for the pair of sub-antennas or two output signals from the C-V2X module for the pair of sub-antennas; and \noutputting two signals to the pair of sub-antennas.", "output": "The electronic device comprises a dedicated short range communication (DSRC) module configured to perform wireless communication by using DSRC technology. A cellular vehicle-to-everything (C-V2X) module is configured to perform wireless communication by using C-V2X technology. A processor is provided to control a switch to connect the DSRC module or the C-V2X module to the antenna. The DSRC module and the C-V2X module include a V2X modem and a radio frequency (RF) transceiver. The antenna includes a pair of sub-antennas for diversity transmission and diversity reception. A module is selected from the DSRC module and the C-V2X module based on location information of the vehicle (30,100). 
INDEPENDENT CLAIMS are included for the following: a method for an electronic device mounted for supporting wireless mobile communication for vehicle through wired or wireless networks; and a computer-readable recording medium having stored instructions for implementing the method for supporting wireless mobile communication for vehicle through wired or wireless networks. Electronic device for supporting wireless mobile communication for vehicle, such as car through wired or wireless networks, such as third generation, fourth generation and fifth generation networks. Electronic device can select the V2X module, which achieves better performance by comparing packet error rate (PER), packet reception rate (PRR), latency, or the strength of the DSRC signal and the C-V2X signal. Multiple weight values can be modified to reduce or minimize a loss or cost value obtained by the AI model during a training process. The drawing shows a perspective view of electronic device for supporting wireless mobile communication for vehicle. 10Network20Infrastructure30,100Vehicle40Pedestrian", "instruction": "Please summarize the input" }, { "input": "Method and apparatus for operating autonomous driving controller of vehicleProvided is a method and apparatus for operating an autonomous driving controller, the method including generating route information for the vehicle based on a rule, transitioning from an autonomous driving mode to an autonomous driving disable mode, in response to the driving route information not being generated for an amount of time greater than or equal to a threshold, tracking at least one neighboring vehicle based on data sensed by a sensor, and generating temporary driving route information based on a movement of the at least one neighboring vehicle.What is claimed is: \n | 1. 
A method of driving a vehicle, the method comprising: \ngenerating driving route information for the vehicle based on a rule; and \nin response to the driving route information not being generated based on the rule for an amount of time greater than or equal to a threshold: \ndisabling an autonomous driving mode using the driving route information; \ntracking at least one neighboring vehicle based on data sensed by a sensor; \ngenerating temporary driving route information based on a movement of the at least one neighboring vehicle; and \ndriving the vehicle based on the temporary driving route information. \n | 2. The method of claim 1, wherein the generating of the driving route information comprises: \nrecognizing a surrounding environment of the vehicle based on the data sensed by the sensor; and \ngenerating the driving route information based on the recognized surrounding environment and the rule. \n | 3. The method of claim 1, wherein the tracking of the movement of the at least one neighboring vehicle comprises: \nperiodically determining a location of the at least one neighboring vehicle; and \ntracking the movement of the at least one neighboring vehicle based on a change of the location of the at least one neighboring vehicle. \n | 4. The method of claim 1, wherein the generating of the temporary driving route information comprises: \ndetermining a difference in movement between a first neighboring vehicle and a second neighboring vehicle, in response to movements of two neighboring vehicles being tracked; and \ngenerating the temporary driving route information based on the difference in movement between the first neighboring vehicle and the second neighboring vehicle. \n | 5. 
The method of claim 4, wherein the determining of the difference comprises determining the difference by comparing a first surrounding environment in which the first neighboring vehicle moves to a second surrounding environment in which the second neighboring vehicle does not move, and \nthe generating of the temporary driving route information comprises generating the temporary driving route information based on the first surrounding environment including the difference. \n | 6. The method of claim 1, wherein the generating of the temporary driving route information comprises generating the temporary driving route information to move the vehicle based on a change of a location of the at least one neighboring vehicle.\n | 7. The method of claim 1, wherein the at least one neighboring vehicle comprises a vehicle moving in a direction identical to a direction of the vehicle.\n | 8. The method of claim 1, wherein the at least one sensor comprises any one or any combination of a camera, a lidar, and a radar.\n | 9. The method of claim 1, wherein the generating of the temporary driving route information comprises generating the temporary driving route information in response to an absence of a movement of a vehicle in a direction different from a direction of the vehicle.\n | 10. The method of claim 1, further comprising, in response to the driving route information not being generated based on the rule for the amount of time greater than or equal to the threshold, updating log information stored in a memory.\n | 11. The method of claim 10, further comprising: \nsearching the memory for the log information corresponding to a current circumstance, in response to a route generating mode being transitioned to an autonomous driving disable mode; and \ngenerating the temporary driving route information based on the log information corresponding to the current circumstance. \n | 12. 
The method of claim 11, wherein the current circumstance comprises any one or any combination of a type of an obstacle, a size of the obstacle, weather conditions, type of a road, road conditions, a size of a lane, and a number of lanes, and \nthe searching comprises searching the memory for the log information having greatest similarity with the current circumstance. \n | 13. The method of claim 1, further comprising: \nreceiving the temporary driving route information through wireless communication or vehicle to vehicle (V2V) communication with the at least one neighboring vehicle, in \nresponse to a route generating mode being transitioned to an autonomous driving disable mode. \n | 14. A non-transitory computer-readable medium storing instructions that, when executed by a processor, causes the processor to perform the method of claim 1.\n | 15. An autonomous driving controller comprising: \na memory configured to store instructions; and \na processor configured to execute the instructions to generate route information for a vehicle based on a rule, and in response to the driving route information not being generated based on the rule for an amount of time greater than or equal to a threshold, configured to transition from an autonomous driving mode to an autonomous driving disable mode, track at least one neighboring vehicle based on data sensed by a sensor, generate temporary driving route information based on a movement of the at least one neighboring vehicle and control the vehicle based on the temporary driving route information. \n | 16. 
A method of controlling autonomous driving controller, the method comprising: \ngenerating route information for a vehicle based on a rule; \nin response to the driving route information not being generated based on the rule for an amount of time greater than or equal to a preset time: \ntransitioning from an autonomous driving mode to an autonomous driving disable mode; \nsearching a memory for log information having greatest similarity with a current circumstance, in response to the transitioning to the autonomous driving disable mode; \ngenerating temporary driving route information based on the log information; and \ncontrolling the vehicle based on the temporary driving route information. \n | 17. A method of driving a vehicle, the method comprising: \ngenerating route information for the vehicle being driven in an autonomous driving mode; and \nin response to the route information violating a rule for a time period greater than a threshold: \ndisabling the autonomous driving mode; \ntracking a change of a location of at least one neighboring vehicle based on data sensed by a sensor; \ngenerating temporary driving route information based on the change of the location of the at least one neighboring vehicle; and \ndriving the vehicle based on the temporary driving route information. \n | 18. The method of claim 17, wherein the generating of the temporary driving route information comprises generating a temporary driving route based on the change of the location of the at least one neighboring vehicle and a surrounding environment in which the at least one neighboring vehicle moves.\n | 19. The method of claim 17, further comprising: \nupdating log data, in a memory, in response to the generating of the temporary driving route information. \n | 20. 
The method of claim 19, wherein the generating of the temporary driving route information comprises generating a temporary driving route based on the change of the location of the at least one neighboring vehicle and log entry, stored in the memory, corresponding to a surrounding environment of the vehicle.", "output": "The method involves generating (710) the route information for the vehicle based on a rule. A neighboring vehicle is tracked based on the data sensed by a sensor. The temporary driving route information is generated (760) based on a movement of the neighboring vehicle. A surrounding environment of the vehicle is recognized based on the data sensed by the sensor. The sensor is combination of a camera, light detection and ranging and a radar. A memory is searched for the log information corresponding to a current circumstance. INDEPENDENT CLAIMS are included for the following:a non-transitory computer-readable medium for storing instructions; andan autonomous driving controller has a memory. Method for driving a vehicle. The method involves generating the route information for the vehicle based on a rule, where neighboring vehicle is tracked based on the data sensed by a sensor and the temporary driving route information is generated based on a movement of the neighboring vehicle, and thus enables to recognize surrounding objects and generates a driving route that meets traffic regulations and avoids contact with the surrounding objects. The drawing shows a flowchart of a method for operating an autonomous driving controller. 
710Generating the route information for the vehicle720Verifying whether the driving route information is generated based on the rule730Transitioning a route generating mode from an autonomous driving mode to an autonomous driving disable mode740Tracking the movement of a neighboring vehicle based on the data sensed by the sensor760Generating the temporary driving route information", "instruction": "Please summarize the input" }, { "input": "Non-invasive handling of sleep apnea, snoring and emergency situationsA monitoring non-invasive device for handling of sleep apnea, snoring and emergency situations operates for breathing assistance by means of transdermal stimulation of muscle groups including the pectoralis majoris, the serratus anterior, and the abdominal muscles. A wrist mounted version may alarm drivers or others requiring focus or concentration when they fall asleep and may alert a medical center. The invention may have a pulse oximeter on a person's wrist/finger to monitor their breathing while asleep, and in the event of a serious snoring or sleep apnea episode, activate the breathing assistance pulses.What is claimed is: \n | 1. 
A sleep apnea, snoring, emergency situations and breath assistance device configured for use by a person having a body, skin, a mouth, airways, first, second, third, and fourth pairs of abdominal muscles, and four chest muscles including first and second pectoral muscles and first and second serratus anterior muscles, the sleep apnea, snoring, emergency situations and breath assistance device comprising: \na control module having operative electrical connections to a plurality of dermal electrodes configured to be attached to such skin of such person, whereby the control module is in communication with the dermal electrodes, the control module configured so as to be worn on such person's body; \na first one of the plurality of dermal electrodes configured to be disposed on such skin of such person at one such chest muscle; \na second one of the plurality of dermal electrodes configured to be disposed on such skin of such person at one such abdominal muscle; \neach of the dermal electrodes configured to deliver a plurality of pulse trains to one such respective muscle; \nthe control module having a stimulation module operative to send a first pulse train to such chest muscle and a second pulse train to such abdominal muscle; \nthe first pulse train operative to stimulate such chest muscle so as to cause a first contraction of such chest muscle; \nthe second pulse train operative to stimulate such abdominal muscle so as to cause a second contraction of such abdominal muscle; \nwhereby at least one breath is stimulated. \n | 2. The sleep apnea, snoring, emergency situations and breath assistance device of claim 1, configured for use with a garment worn on such body by such person, wherein: \nthe control module, the dermal electrodes and the operative electrical connections are configured so as to be worn on such body concealed within such garment. \n | 3. 
The sleep apnea, snoring, emergency situations and breath assistance device of claim 2, further comprising: \na third one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a second such abdominal muscle; \na fourth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a third such abdominal muscle; \na fifth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a fourth such abdominal muscle; \nthe control module further operative to send the second pulse train to such second, third and fourth abdominal muscles. \n | 4. The sleep apnea, snoring, emergency situations and breath assistance device of claim 3, further comprising: \na sixth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a second such chest muscle; \na seventh one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a third such chest muscle; \nan eighth one of the plurality of dermal electrodes configured to be disposed on such skin of such person at a fourth such chest muscle; \nthe control module further operative to send the first pulse train to such second, third and fourth chest muscles. \n | 5. The sleep apnea, snoring, emergency situations and breath assistance device of claim 4, the pulse train further comprising: \na group of pulses consisting of a plurality of individual pulses increasing in amplitude with time, the group of pulses having a duration of 500 ms to 900 ms; \na second time out period of 2 to 3 seconds during which no pulses are sent; \nrepetitions of the group of pulses and the second time out period for a breath assist time period defined to last either until an autonomic breath occurs or for a period of time of no more than 3 seconds. \n | 6. 
The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising: \nat least one pulse oximeter configured to be attached to such user; \nthe pulse oximeter sensor in operative communication with the control module; \nthe control module further comprising an analysis module operative to receive a data from the pulse oximeter sensor and analyze the data to determine if such person is exhibiting an autonomic breath and if such person is not exhibiting an autonomic breath for a period of 3 seconds, the control module further operative to send the pulse trains. \n | 7. The sleep apnea, snoring, emergency situations and breath assistance device of claim 6, wherein the pulse oximeter sensor is further operative to alert such person by means of a signal when it sends such pulse trains.\n | 8. The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising: \nat least one breath sensor in operative communication with the control module, the breath sensor configured to be disposed on such skin of such person; \n the control module further comprising an analysis module operative to receive a data from the breath sensor and analyze the data to determine if such person is exhibiting an autonomic breath and if such person is not exhibiting an autonomic breath, the control module further operative to send the pulse trains. \n | 9. 
The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising: \nat least one blood oxygen level sensor, the blood oxygen level sensor in operative communication with the control module, the blood oxygen level sensor configured to be disposed on such skin of such person; \nthe control module further comprising an analysis module operative to receive a data from the blood oxygen level sensor and analyze the data to determine if such person is exhibiting an oxygen level indicative of a normal breathing pattern and if such person is not, the control module further operative to send the pulse trains. \n | 10. The sleep apnea, snoring, emergency situations and breath assistance device of claim 9, further comprising: \nan RF communication module; \nthe control module having a non-volatile memory and a central processor unit, the analysis module stored in the non-volatile memory, the control module having a start button operative to activate the sleep apnea, snoring, emergency situations and breath assistance device to begin an operating cycle, using a first set of preset operating parameters also stored in the non-volatile memory; \na mobile device having an operative RF connection to the RF communication module of the control module and further having a touch screen operative to display a set of data collected by the device and enable control of the secretion clearance and cough assistance device; \nthe start button further operative to establish the operative RF connection to the mobile device; \nthe mobile device having a module operative to provide wireless control of the operation of the control module; \nthe mobile device operative to collect data, provide for wireless setup and wireless maintenance of the breath assistance device. \n | 11. 
The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, wherein the mobile device is operative to provide control of the control module by one mode selected from the group consisting of: manual control input to the mobile device and the control module, manual control input to the mobile device and from the mobile device to the control module, adaptive heuristic control by an artificial intelligence module loaded in the mobile device and the control module, adaptive heuristic control by an artificial intelligence module loaded in the mobile device and from the mobile device to the control module, remote control from a remote location via communication with the mobile device and from the mobile device to the control module, and combinations thereof.\n | 12. The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, wherein the control module is further operative to alert such person by means of a signal from such mobile device when it sends such pulse trains.\n | 13. The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, configured for use with a vehicle being driven by such person, such vehicle having autonomous driving capability, wherein the control module further comprises: \na communication protocol allowing the control module to control such vehicle; \nthe control module operative to assume control of such vehicle when it sends such pulse trains. \n | 14. The sleep apnea, snoring, emergency situations and breath assistance device of claim 13, wherein the communication protocol further comprises one member selected from the group consisting of: V2X, Bluetooth, WiFi, and combinations thereof.\n | 15. The sleep apnea, snoring, emergency situations and breath assistance device of claim 10, configured for use by such person in a job requiring attention and focus.\n | 16. 
The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising: \nat least one blood pressure sensor, the blood pressure sensor in operative communication with the control module, the blood pressure sensor configured to be disposed on such skin of such person; \nthe control module further comprising an analysis module operative to receive a data from the blood pressure sensor and analyze the data to determine if such person is exhibiting normal autonomic breathing and if such person is not exhibiting normal autonomic breathing, the control module further operative to send the pulse trains. \n | 17. The sleep apnea, snoring, emergency situations and breath assistance device of claim 5, further comprising: \nat least one heart rate sensor, the heart rate sensor in operative communication with the control module, the heart rate sensor configured to be disposed on such skin of such person; \nthe control module further comprising an analysis module operative to receive a data from the heart rate sensor and analyze the data to determine if such person is exhibiting normal autonomic breathing and if such person is not exhibiting normal autonomic breathing, the control module further operative to send the pulse trains.", "output": "The device has a main portion having a shape dimensioned and configured to be worn on such arm. A control module (1514) includes a CPU within the device main portion, and has operative electrical connections to a first electrode. The first electrode is in contact with arm. A control module comprises an analysis module operative to receive data from the sensor and analyze the data to determine if such a person (1500) exhibits autonomic breath and if such person is not exhibiting such autonomic breath. 
The control module is operative to carry out tasks related to sending of a first pulse train to the first electrode, making an alert noise, alert vibration, and communicating with a vehicle and a first preferred remote terminal through an RF communication module (1516). An INDEPENDENT CLAIM is included for a method of breath assistance for use by a person. Sleep apnea, snoring, emergency situations and breath assistance device for use by person in autonomic breath. The control module is operative to alert person by audible/vibration/transmitted signal when it sends such pulse trains. The device is efficient in that the stimulation can happen concurrently with the breathing or even after it, or in multiple rounds of stimulation for each breath, and so on. The device analyzes the monitored data, examines the stimulation history, and then optimizes the parameters of the stimulation, thus providing a unique and optimized stimulation from moment to moment or from breath to breath. The communication protocol is selected from a group consisting of Wireless Fidelity (Wi-Fi), Bluetooth standards. The drawing shows a front view of the sleep apnea, snoring, emergency situations and breath assistance device. 1500Person1514Control module1516RF communication module1518Control device1520Sensor", "instruction": "Please summarize the input" }, { "input": "SYSTEMS AND METHODS FOR IMPROVED OPERATION OF A WORKING VEHICLEVarious apparatus and procedures for improved operation of a working vehicle are provided. One embodiment provides for vehicle-to-vehicle communications using cellular modems to provide information from one vehicle to another vehicle that has lost internet connectivity. Another embodiment provides a method for improving safety of a work area where an autonomous or remotely controlled vehicle is operating by scanning for unknown Bluetooth modules in the vicinity of a working vehicle. 
Another embodiment provides for intercepting and modifying signals from vehicle controls and passing the modified signals to a control unit of the vehicle.|1. A system for performing a work operation in a work area comprising: \na plurality of vehicles wherein each vehicle is equipped with a GNSS unit and a modem, and wherein the modem of each vehicle is configured to receive location corrections from an RTK network; \na processor connected to each vehicle, wherein each processor is configured to receive location information from its respective GNSS unit and location corrections from its respective modem; \nwherein each processor is further configured to run mission plan software for controlling operation of its respective vehicle; and \nwherein each processor is further configured to detect a loss of connection to the RTK network, connect to a local wireless network, and query any other vehicle of the plurality of vehicles for location corrections. \n | 2. The system of claim 1 wherein real-time collision avoidance information is communicated in addition to the location corrections.\n | 3. The system of claim 1 wherein dynamic job optimization information is communicated in addition to the location corrections.\n | 4. The system of claim 1 wherein the plurality of vehicles comprise autonomous vehicles.\n | 5. A system for improving safety in a work area comprising: \none or more vehicles wherein each vehicle is equipped with a Bluetooth module configured to send and receive signals from other Bluetooth modules; \na processor connected to each vehicle, wherein each processor is configured to communicate with its respective Bluetooth module; and \nwherein each processor is further configured to shut down its respective vehicle if a signal transmitted by an unknown Bluetooth module is detected by the vehicle's Bluetooth module. \n | 6. The method of claim 5 wherein the one or more vehicles is autonomous.\n | 7. 
The method of claim 5 wherein the one or more vehicles is remotely controlled.\n | 8. A method for improving safety in a work area comprising: \noperating one or more vehicles in the work area, wherein each vehicle is equipped with a vehicle Bluetooth module configured to send and receive signals from other Bluetooth modules, and each vehicle is equipped with a processor configured to communicate with its respective vehicle Bluetooth module; \nscanning for Bluetooth signals transmitted by one or more other Bluetooth modules; and \nshutting down the one or more vehicles if its associated vehicle Bluetooth module receives a signal transmitted by an unknown Bluetooth module. \n | 9. The method of claim 5 wherein the plurality of vehicles is autonomous.\n | 10. The method of claim 5 wherein the plurality of vehicles is remotely controlled.\n | 11. A method for autonomously controlling a vehicle comprising: \nproviding an interceptor configured to intercept one or more messages communicated by one or more armrest controls of the vehicle to an engine control unit of the vehicle; \ninserting autonomous control instructions into the one or more intercepted messages to create a modified message; and \ncommunicating the modified message to the engine control unit of the vehicle. \n | 12. The method of claim 11 wherein the one or more messages communicated by one or more armrest controls of the vehicle to the engine control unit of the vehicle and the modified message are communicated on a CAN bus of the vehicle.", "output": "The system has multiple vehicles where each vehicle (10) is connected with a global navigation satellite system (GNSS) unit (40) and a modem (710), where the modem of each vehicle is configured to receive location corrections from an RTK network. A processor is connected to each vehicle, where each processor is configured to receive location information from the respective GNSS unit and location corrections from its respective modem. 
Each processor is configured to run mission plan software for controlling operation of its respective vehicle. Each processor is configured to detect a loss of connection to the RTK network, connect to a local wireless network, and query any other vehicle of the plurality of vehicles for location corrections. INDEPENDENT CLAIMS are included for: (1) a system for improving safety in a work area; (2) the method for improving safety in a work area; (3) a method for autonomously controlling a vehicle. System for performing work operation in work area performed using a manned vehicle or by an autonomous or remotely controlled vehicle such as agricultural vehicle, a mower. The operator of the vehicle makes judgment calls about selecting a safe evacuation location and steering the vehicle quickly toward the evacuation location while avoiding injury or damage to the vehicle or to objects or people in the vehicle path. The speed of vehicle is reduced to avoid damage to vehicle, when readings captured by GNSS unit indicate that the vehicle is approaching or within the slow zone. The drawing shows a schematic view of system for performing work operation in work area performed using a manned vehicle. 10Vehicle 20Control implement 30Computer 35Microprocessor 40GNSS unit 700Base station 710Modem ", "instruction": "Please summarize the input" }, { "input": "Mobile payment system for traffic prioritization in self-driving vehiclesA self-driving or autonomous vehicle transmits a vehicle-to-vehicle offer message from a user of a vehicle-connected mobile communication device riding in the self-driving vehicle to a second user of a second mobile communication device riding in a second vehicle to pay for a traffic prioritization relative to the second vehicle. 
The first mobile communication device receives a reply message and sends a payment to the second mobile communication device or an account associated with the second mobile communication device to obtain the traffic prioritization relative to the other vehicle. For example, the traffic prioritization may enable one vehicle to pass the other vehicle, to take precedence at an intersection or to be given priority to take a parking place or any other traffic-related advantage.The invention claimed is: \n | 1. A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to: \ncommunicatively connect to a mobile communication interface of a self-driving vehicle in which the mobile communication device is located; \nreceive user input defining an offer to pay for a traffic prioritization that prioritizes the self-driving vehicle relative to a second vehicle; \nautomatically generate an offer message in response to the user input; \nautomatically transmit the offer message to the mobile communication interface of the self-driving vehicle for communicating the offer message to a second mobile communication device in the second vehicle; \nautomatically receive a reply message from the second mobile communication device; \nautomatically determine if the reply message constitutes an acceptance or rejection of the offer; and \nin response to determining that the reply message indicates the acceptance of the offer, send a payment to the second mobile communication device or to an account associated with the second mobile communication device to pay for the traffic prioritization. \n | 2. The non-transitory computer-readable medium of claim 1 further comprising code that causes the mobile communication device to receive a confirmation message to confirm receipt of the payment.\n | 3. 
The non-transitory computer-readable medium of claim 2 further comprising code that causes the mobile communication device to receive an acknowledgement message that the second vehicle will maneuver as soon as traffic regulations and traffic conditions permit to grant priority to the self-driving vehicle.\n | 4. The non-transitory computer-readable medium of claim 1 further comprising code that causes the mobile communication device to output an alert that an estimated time of arrival at a destination will be later than originally predicted and presenting a user interface element to pay to prioritize the self-driving vehicle in traffic.\n | 5. The non-transitory computer-readable medium of claim 1 further comprising code that causes the mobile communication device to receive a third-party request to expedite travel, the third-party request including a third-party payment to prioritize the self-driving vehicle in traffic, wherein the code is configured to automatically generate and transmit a third-party offer message using the third-party payment to the second mobile communication device.\n | 6. 
A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to: \ndisplay on a user interface of the mobile communication device a fee-for-transport interface of a fee-for-transport application executing on the mobile communication device to enable a user to summon a self-driving vehicle to transport the user from a starting point to a destination for a fee; \nreceive user input from the user to define the destination, wherein the starting point is either a current location of the mobile communication device or a user-specified pickup location; \ndisplay pricing options based on a plurality of different levels of traffic prioritization for transport to the destination from the starting point to the destination; \nreceive a user-selected traffic prioritization; and \ncommunicate a pickup request to the self-driving vehicle, the pickup request including the user-selected traffic prioritization to enable the self-driving vehicle to automatically offer one or more payments to one or more other vehicles to obtain the user-selected traffic prioritization along the route to the destination. \n | 7. The non-transitory computer-readable medium of claim 6 wherein the code causes the mobile communication device to display a trip report upon arrival at the destination that indicates that the self-driving vehicle has determined that a portion of the fee allocated for prioritization payments has been unused, and the portion of the fee that was unused has been refunded to an account associated with a user of the mobile communication device.\n | 8. 
The non-transitory computer-readable medium of claim 6 comprising code that causes the mobile communication device to display on the user interface of the mobile communication device an amount payable to arrive at the destination at a user-specified time, to present a user interface element to pay the amount, and to communicate this amount and the user-specified time to the self-driving vehicle.\n | 9. The non-transitory computer-readable medium of claim 6 comprising code that causes the mobile communication device to receive real-time traffic data, to detect a traffic jam based on the real-time traffic data by determining that the self-driving vehicle is moving below a speed limit, and to send a plurality of offer messages to a plurality of vehicles to pay for prioritization.\n | 10. The non-transitory computer-readable medium of claim 9 wherein the offer messages are conditional offers that are conditional on acceptance by all of the plurality of vehicles.\n | 11. The non-transitory computer-readable medium of claim 6 wherein the code to display the pricing options includes code to display travel times for the pricing options.\n | 12. The non-transitory computer-readable medium of claim 6 comprising code to cause the mobile communication device to use an event stored in a calendar application on the mobile communication device to determine the travel time to the event, and then automatically recommend a prioritization level to arrive at the event on time.\n | 13. The non-transitory computer-readable medium of claim 6 wherein the pricing options are based on historical prioritization data that include the probabilities of offers being accepted at various price points.\n | 14. The non-transitory computer-readable medium of claim 6 comprising code that causes the mobile communication device to receive a third-party request to expedite travel, the third-party request including a third-party payment to prioritize the self-driving vehicle in traffic.\n | 15. 
A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to: \ngenerate an emergency request for an emergency, the emergency request requesting that a self-driving vehicle be prioritized in traffic due to the emergency; \ntransmit the emergency request to a governmental authority emergency server to request emergency prioritization; and \nreceive an emergency prioritization authorization from the governmental authority emergency server, the emergency prioritization authorization comprising a first cryptographic token to be broadcast by the self-driving vehicle to other vehicles to obtain priority in traffic and a second cryptographic token that is recognizable by law enforcement entities permitting the self-driving vehicle to exceed a speed limit due to the emergency. \n | 16. The non-transitory computer-readable medium of claim 15 wherein the emergency request is generated in response to detecting a 911 call being made by the mobile communication device.\n | 17. The non-transitory computer-readable medium of claim 15 wherein the emergency request is generated in response to a biometric sensor detecting the emergency, the biometric sensor being in the mobile communication device or in communication with the mobile communication device.\n | 18. The non-transitory computer-readable medium of claim 15 comprising code that causes the mobile communication device to: \ndetermine an emergency destination to replace a destination originally specified by the user; and \nre-route the self-driving vehicle to the emergency destination. \n | 19. The non-transitory computer-readable medium of claim 18 comprising code that causes the mobile communication device to: \nconstrain the cryptographic token to be valid only for a new route to the emergency destination. \n | 20. 
The non-transitory computer-readable medium of claim 16 comprising code that causes the mobile communication device to: \ndetermine an emergency destination to replace a destination originally specified by the user; and \nre-route the self-driving vehicle to the emergency destination.", "output": "The medium has set of instructions for communicatively connecting to a first mobile communication interface (1000) of a self-driving vehicle (10) in which the first mobile communication device is located. User input defining offer to pay for traffic prioritization that prioritizes the self-driving vehicle relative to a primary vehicle is received. Offer message is automatically generated in response to the user input. The offer message is automatically transmitted to the first mobile communication interface of the self-driving vehicle for communicating the offer message to a second mobile communication device (1100) in the primary vehicle. Reply message is automatically received from the second mobile communication device. Judgment is made to check whether the reply message constitutes acceptance or rejection of the offer. Payment is transmitted to the second mobile communication device or to an account associated with the second mobile communication device to pay for the traffic prioritization in response to determining that the reply message indicates the acceptance of the offer. Non-transitory computer readable storage medium for realizing traffic prioritization in a self-driving vehicle i.e. car (from drawings) by a mobile payment system through a mobile communication device e.g. smartphone, cell phone, tablet, smartwatch, wearable smart device and laptop. The medium enables mutually sensing self-driving vehicles in a preset area of a road by utilizing various sensors for collision avoidance and communication through vehicle-to-vehicle messaging protocols. 
The drawing shows a schematic diagram of a mobile payment system.10Self-driving vehicle 11Vehicle-to-vehicle messages 1000First mobile communication interface 1100Second mobile communication interface 1101First user 1101aSecond user 1105Processor 1105aCPU 1110Mobile device memory 1115Mobile device display screen 1120Mobile device global navigation satellite system chip 1130Cellular transceiver 1140Mobile device data interface 1150User interface element 1200First vehicle-to-vehicle data transceiver 1200aSecond vehicle-to-vehicle data transceiver ", "instruction": "Please summarize the input" }, { "input": "Vehicle-to-vehicle payment system for traffic prioritization in self-driving vehiclesA self-driving or autonomous vehicle has a traffic-prioritization processor to send or receive a payment to or from a central server to obtain a traffic prioritization for a route or to accept a traffic de-prioritization for the route. The central server receives and distributes payments to other vehicles traveling the route. The vehicle communicates with the central server to receive a plurality of levels of prioritization which range from a highest prioritization to a lowest prioritization, and the costs or payouts associated with each of the levels.The invention claimed is: \n | 1. 
A self-driving vehicle comprising: \na vehicle chassis; \na motor supported by the chassis for providing propulsive power for the vehicle; \na braking system; \na steering system; \na plurality of sensors; \na self-driving processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle; \na Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle; \na radiofrequency data transceiver; and \na traffic-prioritization processor configured to cooperate with the radiofrequency data transceiver to: \nreceive from a central server a price to obtain a traffic prioritization for a route or to accept a traffic de-prioritization for the route, wherein the central server determines the price based on offers and requests to be prioritized or deprioritized from other vehicles traveling the route and wherein the central server receives payments from prioritized vehicles traveling the route and distributes payments to de-prioritized vehicles traveling the route; and \nsend or receive a payment to or from the central server to obtain the traffic prioritization for the route or to accept the traffic de-prioritization for the route. \n | 2. The self-driving vehicle of claim 1 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to receive, from the central server a plurality of levels of prioritization which range from a highest prioritization to a lowest prioritization, and the costs or payouts associated with each of the levels.\n | 3. The self-driving vehicle of claim 2 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to receive, from the central server, travel times for the levels of prioritization.\n | 4. 
The self-driving vehicle of claim 3 comprising a user interface to display the costs or payouts for the levels of prioritization and the travel times for each of the levels of prioritization to enable a user to select the level of prioritization for the route.\n | 5. The self-driving vehicle of claim 1 wherein the user interface provides an alert indicating that an estimated time of arrival at a destination will be later than originally predicted and providing a user interface element to enable a user to pay to expedite travel to the destination.\n | 6. The self-driving vehicle of claim 4 wherein the user interface displays the cost to pay to obtain the traffic prioritization to the destination.\n | 7. A self-driving vehicle comprising: \na vehicle chassis; \na motor supported by the chassis for providing propulsive power for the vehicle; \na braking system; \na steering system; \na plurality of sensors; \na self-driving processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle; \na Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle; \na radiofrequency data transceiver; and \na traffic-prioritization processor that cooperates with the radiofrequency data transceiver to: \nreceive, from a central server, pricing for different levels of traffic prioritization for a route, the pricing including a cost to obtain a higher traffic prioritization for the route and a payout to accept a lower traffic prioritization for the route. \n | 8. The self-driving vehicle of claim 7 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to send to the central server a payment equal to the cost of obtaining the higher traffic prioritization for the route.\n | 9. 
The self-driving vehicle of claim 7 wherein the traffic-prioritization processor is configured to cooperate with the radiofrequency data transceiver to receive from the central server a payment equal to the payout for accepting the lower traffic prioritization for the route.\n | 10. The self-driving vehicle of claim 7 further comprising a user interface presenting costs and payouts for three or more different levels of traffic prioritization.\n | 11. The self-driving vehicle of claim 10 wherein the user interface also presents costs and payouts based on times of day.\n | 12. The self-driving vehicle of claim 10 wherein the user interface also presents costs and payouts based on segments of the route.\n | 13. The self-driving vehicle of claim 10 wherein the user interface also presents travel times for the different levels of traffic prioritization.\n | 14. An autonomous vehicle comprising: \na self-driving processor configured to receive signals from sensors to generate steering, acceleration and braking control signals for controlling a steering system, a motor and a braking system of the vehicle; \na Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the vehicle; \na radiofrequency data transceiver; and \na traffic-prioritization processor cooperating with the radiofrequency data transceiver to: \ncommunicate with a central server to receive pricing for a traffic prioritization or de-prioritization for a route; and \nsend or receive a payment to or from the central server for the traffic prioritization or de-prioritization for the route. \n | 15. The autonomous vehicle of claim 14 wherein the pricing includes costs and payouts for different segments of the route.\n | 16. The autonomous vehicle of claim 15 wherein the costs and payouts for the different segments depend on a time of day.\n | 17. 
The autonomous vehicle of claim 14 comprising a user interface to present the costs and payouts to enable selection of a level of prioritization.\n | 18. The autonomous vehicle of claim 17 wherein the user interface indicates whether the costs and payouts are above normal market prices for that particular time and place.\n | 19. The autonomous vehicle of claim 14 wherein the pricing includes a bid and an ask for each segment of the route and for each level of prioritization, the bid defining a price being offered for the prioritization and the ask defining a price that is being asked to accept the prioritization.\n | 20. The autonomous vehicle of claim 14 wherein the self-driving processor and the traffic-prioritization processor are integrated in a vehicle computing device.", "output": "The vehicle (10) has a vehicle chassis (12) for supporting a motor for providing propulsive power for the vehicle. A self-driving processor (100) receives signals from sensors and for generating steering, acceleration and braking control signals. A Global Navigation Satellite System (GNSS) receiver (260) receives satellite signals and determines a current location of the self-driving vehicle. A traffic-prioritization processor (200) cooperates with a radio frequency data transceiver (220) for sending or receiving a payment to or from a central server to obtain traffic prioritization for a route or to accept traffic de-prioritization for the route. Autonomous or self-driving vehicles such as car, van, minivan, sports utility vehicle (SUV), crossover-type vehicle, bus, minibus, truck, tractor-trailer, semi-trailer, construction vehicle, work vehicle, tracked vehicle, semi-tracked vehicle, offroad vehicle, electric cart and a dune buggy for utilizing sensors such as RADAR, LIDAR and/or cameras to provide signals to a processor or controller that generates and outputs steering, acceleration and braking signals to the vehicle. 
The vehicle allows self-driving vehicles in a given area of a road to mutually sense presence of each other using various sensors for collision avoidance through vehicle-to-vehicle messaging protocols. The vehicle can automatically perform an adjustment to its own routing to gain the benefit of the prioritization, e.g., to pass the second vehicle, upon transfer of the payment. The drawing shows a side view of an autonomous or self-driving vehicle.10Vehicle 12Vehicle chassis 100Self-driving processor 200Traffic-prioritization processor 220Radio frequency data transceiver 260GNSS receiver ", "instruction": "Please summarize the input" }, { "input": "Vehicle-to-vehicle payment system for traffic prioritization in self-driving vehiclesA self-driving or autonomous vehicle comprises a processor to transmit an offer message to another vehicle and to receive a reply message from the other vehicle, and to transfer a payment to the other vehicle to obtain a traffic prioritization relative to the other vehicle. For example, the traffic prioritization may enable one vehicle to pass the other vehicle, to take precedence at an intersection or to be given priority to take a parking place or any other traffic-related advantage.The invention claimed is: \n | 1. 
A self-driving vehicle comprising: \na vehicle chassis; \na motor supported by the chassis for providing propulsive power for the vehicle; \na braking system; \na steering system; \na plurality of sensors; \na processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle; \na Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle; \na radiofrequency data transceiver; and \nwherein the processor is configured to: \ntransmit an offer message to a second vehicle; \nreceive a reply message from the second vehicle; and \ntransfer a payment to the second vehicle to obtain a traffic prioritization relative to the second vehicle. \n | 2. The self-driving vehicle of claim 1 wherein the processor is configured to receive a counteroffer from the second vehicle and is configured to accept or reject the counteroffer.\n | 3. The self-driving vehicle of claim 1 wherein the processor cooperates with the radiofrequency data transceiver to communicate with a first payment server to transfer payment to a second payment server associated with the second vehicle.\n | 4. The self-driving vehicle of claim 3 wherein the second payment server requests that the first payment server verify that funds are available, wherein the first payment server confirms to the second payment server that the funds are available, and wherein the second payment server confirms to the second vehicle that the funds are available.\n | 5. The self-driving vehicle of claim 4 wherein the processor requests that the first payment server transfer the funds in response to receiving an acknowledgement from the second vehicle that the availability of the funds has been verified.\n | 6. 
The self-driving vehicle of claim 5 wherein the processor receives a confirmation from the second vehicle that the second vehicle has initiated a manoeuver to reprioritize the self-driving vehicle in traffic relative to the second vehicle.\n | 7. The self-driving vehicle of claim 1 wherein the processor cooperates with the radiofrequency data transceiver to communicate two parallel offer messages to the second vehicle and to a third vehicle.\n | 8. The self-driving vehicle of claim 7 wherein each of the offer messages contains bits in a data field indicating that the offer is conditional on which of the second and third vehicles is first to reply.\n | 9. The self-driving vehicle of claim 1 wherein the processor cooperates with the radiofrequency data transceiver to send two conditional offer messages to the second vehicle and to a third vehicle ahead of the second vehicle.\n | 10. The self-driving vehicle of claim 9 wherein the conditional offer messages each contains bits in a data field indicating that the offer is conditional on both the second and third vehicles accepting.\n | 11. The self-driving vehicle of claim 1 further comprising a user interface presenting pricing and timing data for two routes to enable a user of the self-driving vehicle to select one of the two routes based on both pricing and timing.\n | 12. The self-driving vehicle of claim 1 further comprising a user interface presenting costs and payouts for different traffic prioritizations.\n | 13. The self-driving vehicle of claim 1 further comprising a user interface presenting bid-ask pricing for different levels of traffic prioritization for different road segments, wherein bid prices are prices being offered by the self-driving vehicle to the second vehicle for the traffic prioritization and ask prices are prices the second vehicle is asking from the self-driving vehicle to grant the traffic prioritization.\n | 14. 
The self-driving vehicle of claim 1 wherein the processor is configured to receive user-configurable multipliers for setting prices for various types of traffic manoeuvers.\n | 15. The self-driving vehicle of claim 1 wherein the traffic prioritization is precedence for a parking space.\n | 16. The self-driving vehicle of claim 1 wherein the traffic prioritization is precedence at an intersection.\n | 17. The self-driving vehicle of claim 1 wherein the vehicle is a truck and wherein the traffic prioritization is precedence at a loading dock of a warehouse or store.\n | 18. The self-driving vehicle of claim 1 wherein the processor is configured to grant precedence to an emergency vehicle upon wirelessly receiving a special code.\n | 19. The self-driving vehicle of claim 1 wherein the processor automatically generates the offer message based on predetermined user settings representing priority levels set by a user wherein the priority levels are set based on time and location.\n | 20. The self-driving vehicle of claim 1 wherein the payment comprises a transfer to the second vehicle of redeemable points that are stored in a database and are redeemable for a subsequent traffic prioritization in favor of the second vehicle.", "output": "The self-driving vehicle has a vehicle chassis, a motor supported by the chassis for providing propulsive power for the vehicle, a braking system, a steering system and several sensors. A processor is configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle. A global navigation satellite system (GNSS) receiver configured for receiving satellite signals and for determining a current location of the self-driving vehicle and a radiofrequency data transceiver. The processor is configured to transmit an offer message to a second vehicle. A reply message is received from the second vehicle. 
A payment is transferred to the second vehicle to obtain a traffic prioritization relative to the second vehicle. The self-driving vehicles use sensors such as radio detection and ranging, light detection and ranging or cameras to provide signals to the processor or controller that generates and outputs steering, acceleration and braking signals to the vehicle. Uses include but are not limited to any vehicle such as a car, van, minivan, sports utility vehicle, crossover-type vehicle, bus, minibus, truck, tractor-trailer, semi-trailer, construction vehicle, work vehicle, tracked vehicle, semi-tracked vehicle, offroad vehicle, electric cart, dune buggy. The receiving vehicle has a rule defining a monetary threshold to automatically accept an offer from a requesting vehicle. In a first paradigm, the emergency vehicle makes the request without offering any payment because the vehicle is an emergency vehicle. The drawing shows a schematic view of the system for V2V payments for traffic reprioritization.10Autonomous vehicle 222Base station transceiver 250Internet 300First payment server 302Payment processing server ", "instruction": "Please summarize the input" }, { "input": "Mobile payment system for traffic prioritization in self-driving vehiclesA self-driving or autonomous vehicle transmits a vehicle-to-vehicle offer message from a user of a vehicle-connected mobile communication device riding in the self-driving vehicle to a second user of a second mobile communication device riding in a second vehicle to pay for a traffic prioritization relative to the second vehicle. The first mobile communication device receives a reply message and sends a payment to the second mobile communication device or an account associated with the second mobile communication device to obtain the traffic prioritization relative to the other vehicle. 
For example, the traffic prioritization may enable one vehicle to pass the other vehicle, to take precedence at an intersection or to be given priority to take a parking place or any other traffic-related advantage.The invention claimed is: \n | 1. A self-driving vehicle comprising: \na vehicle chassis; \na motor supported by the chassis for providing propulsive power for the vehicle; \na braking system; \na steering system; \na plurality of sensors; \na processor configured to receive signals from the sensors and to generate steering, acceleration and braking control signals for controlling the steering system, the motor and the braking system of the vehicle; \na Global Navigation Satellite System (GNSS) receiver for receiving satellite signals and for determining a current location of the self-driving vehicle; \na mobile communication interface communicatively connected to a first mobile communication device of a first user riding in the self-driving vehicle, the mobile communication interface receiving from the first mobile communication device an offer message from the first user to pay for a traffic prioritization relative to a second self-driving vehicle; \na vehicle-to-vehicle data transceiver communicatively connected to the mobile communication interface to transmit the offer message to the second self-driving vehicle to be relayed via a second mobile communication interface to a second mobile communication device of a second user riding in the second self-driving vehicle; and \nwherein the mobile communication interface, via the vehicle-to-vehicle data transceiver, receives a reply message from the second mobile communication device and transmits the reply message to the first mobile communication device to cause the first mobile communication device to make a payment from a first account of the first user to a second account of the second user to obtain the traffic prioritization relative to the second self-driving vehicle; and \nwherein the mobile 
communication interface receives a payment message from the first mobile communication device and transmits, via the vehicle-to-vehicle data transceiver, the payment message to the second mobile communication device to confirm that the payment has been made. \n | 2. The self-driving vehicle of claim 1 wherein the mobile communication interface receives a counteroffer from the second mobile communication device and relays the counteroffer to the first mobile communication device to accept or reject the counteroffer, wherein the first mobile communication device is configured to either present the counteroffer and receive user input to accept or reject the counteroffer or automatically accept or reject the counteroffer based on a user setting.\n | 3. The self-driving vehicle of claim 1 comprising a fee-for-transport processor that computes a fee to transport the first user from a starting point along a route to a destination, wherein the fee is determined based on distance or travel time and is further based on a user-specified traffic prioritization received from the first mobile communication device.\n | 4. The self-driving vehicle of claim 3 wherein the fee-for-transport processor communicates to the first mobile communication device a plurality of pricing options for the route based on different levels of traffic prioritization.\n | 5. The self-driving vehicle of claim 4 wherein the fee-for-transport processor computes the travel times for the route for each of the different levels of traffic prioritization, wherein the travel times are computed using real-time traffic data for the route and historical prioritization data for the route for the time of day, the historical prioritization data indicative of probabilities of traffic prioritization requests being accepted for the route at the time of day.\n | 6. 
The self-driving vehicle of claim 5 wherein the fee-for-transport processor receives a user selection of one of the different levels of traffic prioritization from the first mobile communication device, the fee-for-transport processor then automatically offering payments to other vehicles along the route to obtain traffic prioritizations and, when offers are accepted, automatically disbursing payments to the other vehicles.\n | 7. The self-driving vehicle of claim 1 wherein the mobile communication interface is a Bluetooth® interface and the vehicle-to-vehicle data transceiver is a dedicated vehicle-to-vehicle short-range communications (DSRC) transceiver operating in a 5.7-5.9 GHz band.\n | 8. A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to: \ncommunicatively connect to a mobile communication interface of a self-driving vehicle in which the mobile communication device is located; \nreceive user input defining an offer to pay for a traffic prioritization that prioritizes the self-driving vehicle relative to a second vehicle; \nautomatically generate an offer message in response to the user input, the offer message being a datagram in a predetermined data format; \nautomatically transmit the offer message to the mobile communication interface of the self-driving vehicle for communicating the offer message to a second mobile communication device in the second vehicle, the second mobile communication device being configured to automatically read the datagram; \nautomatically receive a reply message from the second mobile communication device; \nautomatically determine if the reply message constitutes an acceptance or rejection of the offer; and \nin response to determining that the reply message indicates the acceptance of the offer, send a payment to the second mobile communication device or to an account associated 
with the second mobile communication device to pay for the traffic prioritization. \n | 9. The non-transitory computer-readable medium of claim 8 further comprising code that causes the mobile communication device to receive a confirmation message to confirm receipt of the payment.\n | 10. The non-transitory computer-readable medium of claim 9 further comprising code that causes the mobile communication device to receive an acknowledgement message that the second vehicle will maneuver as soon as traffic regulations and traffic conditions permit to grant priority to the self-driving vehicle.\n | 11. The non-transitory computer-readable medium of claim 8 further comprising code that causes the mobile communication device to output an alert that an estimated time of arrival at a destination will be later than originally predicted and presenting a user interface element to pay to prioritize the self-driving vehicle in traffic.\n | 12. The non-transitory computer-readable medium of claim 8 further comprising code that causes the mobile communication device to receive a third-party request to expedite travel, the third-party request including a third-party payment to prioritize the self-driving vehicle in traffic, wherein the code is configured to automatically generate and transmit a third-party offer message using the third-party payment to the second mobile communication device.\n | 13. 
A non-transitory computer-readable medium storing computer-readable instructions in code which when executed by a processor of a mobile communication device cause the mobile communication device to: \ndisplay on a user interface of the mobile communication device a fee-for-transport interface of a fee-for-transport application executing on the mobile communication device to enable a user to summon a self-driving vehicle also executing the fee-for-transport application to transport the user from a starting point to a destination for a fee; \nreceive user input from the user to define the destination, wherein the starting point is either a current location of the mobile communication device or a user-specified pickup location; \ndisplay pricing options based on a plurality of different levels of traffic prioritization for transport to the destination, wherein the pricing options are also based on either distance or travel time from the starting point to the destination; \nreceive a user-selected traffic prioritization; and \ncommunicate a pickup request to the self-driving vehicle, the pickup request including the user-selected traffic prioritization to enable the self-driving vehicle to automatically offer one or more payments to one or more other vehicles to obtain the user-selected traffic prioritization along the route to the destination. \n | 14. The non-transitory computer-readable medium of claim 13 wherein the code causes the mobile communication device to display a trip report upon arrival at the destination that indicates that the self-driving vehicle has determined that a portion of the fee allocated for prioritization payments has been unused, and the portion of the fee that was unused has been refunded to an account associated with a user of the first mobile communication device.\n | 15. 
The non-transitory computer-readable medium of claim 13 comprising code that causes the mobile communication device to display on the user interface of the mobile communication device an amount payable to arrive at the destination at a user-specified time, to present a user interface element to pay the amount, and to communicate this amount and the user-specified time to the self-driving vehicle.\n | 16. The non-transitory computer-readable medium of claim 12 comprising code that causes the mobile communication device to receive real-time traffic data, to detect a traffic jam based on the real-time traffic data by determining that the self-driving vehicle is moving at an average speed less than 20% of a speed limit, and to send a plurality of offer messages to a plurality of vehicles to pay for prioritization.\n | 17. The non-transitory computer-readable medium of claim 16 wherein the offer messages are conditional offers that are conditional on acceptance by all of the plurality of vehicles.\n | 18. The non-transitory computer-readable medium of claim 13 comprising code that causes the mobile communication device to: \ngenerate an emergency request in response to detecting a 911 call being made by the mobile communication device to signify an emergency, the emergency request requesting that the self-driving vehicle be prioritized in traffic due to the emergency; \ntransmit the emergency request to a governmental authority emergency server to request emergency prioritization; and \nreceive an emergency prioritization authorization from the governmental authority emergency server, the emergency prioritization authorization comprising a first cryptographic token to be broadcast by the self-driving vehicle to other vehicles to obtain priority in traffic and a second cryptographic token that is recognizable by law enforcement entities permitting the self-driving vehicle to exceed a speed limit due to the emergency. \n | 19. 
The non-transitory computer-readable medium of claim 18 comprising code that causes the mobile communication device to: \ndetermine an emergency destination to replace the destination originally specified by the user; \nre-route the self-driving vehicle to the emergency destination; and \nconstrain the cryptographic token to be valid only for a new route to the emergency destination. \n | 20. The non-transitory computer-readable medium of claim 18 comprising code that causes the mobile communication device to: \ndetect an emergency using a sensor in, or communicatively connected to, the mobile communication device; \ngenerate an emergency request requesting that the self-driving vehicle be prioritized in traffic in response to detecting the emergency; \ntransmit the emergency request to a governmental authority emergency server to request emergency prioritization; and \nreceive an emergency prioritization authorization from the governmental authority emergency server, the emergency prioritization authorization comprising one or both of: a first cryptographic token to be broadcast by the self-driving vehicle to other vehicles to obtain priority in traffic and a second cryptographic token recognizable by law enforcement entities permitting the self-driving vehicle to exceed a speed limit due to the emergency.", "output": "The vehicle (10) has a mobile communication interface for receiving a reply message from a second mobile communication device through a vehicle-to-vehicle data transceiver, and transmitting the reply message to a first mobile communication device to cause the first mobile communication device to make a payment from a first account of a first user to a second account of a second user to obtain traffic prioritization relative to a second self-driving vehicle. 
The mobile communication interface receives a payment message from the first mobile communication device and transmits the payment message to the second mobile communication device to confirm that the payment is made through the vehicle-to-vehicle data transceiver. An INDEPENDENT CLAIM is included for a non-transitory computer-readable medium storing computer-readable instructions for operating a self-driving vehicle. Self-driving vehicle i.e. car. The vehicles in a given area of a road mutually sense each other's presence using various sensors for collision avoidance and can communicate through vehicle-to-vehicle messaging protocols with each other to avoid collisions. The drawing shows a schematic diagram of a system for V2V payments for traffic reprioritization.10, 10aSelf-driving vehicles 11Exchange v2v messages 222Base station transceiver 250Internet 300, 302Payment processing servers ", "instruction": "Please summarize the input" }, { "input": "SYSTEMS AND METHODS FOR AN AUTONOMOUS CONVOY WITH LEADER VEHICLEA module for a leader vehicle of a convoy can have a suite of sensors, a communication system, and a controller. The sensor suite can have at least one feature sensor that detects features and/or terrain in an environment and at least one location sensor that determines a location of the leader vehicle. Via the sensor suite, the controller can detect features as the leader vehicle travels along a route through the environment as well as the route of the leader vehicle. The controller can build a map for at least part of the environment with the detected route therethrough. Data indicative of the map and the detected route can then be transmitted to one or more follower vehicles. In some embodiments, the leader vehicle is manually driven while the follower vehicles operate autonomously.", "output": "The system has a convoy leader module (200) that is used for a leader vehicle of a convoy, and comprises a first suite of sensors (202). 
The first suite comprises at least one feature sensor operable to detect features or terrain in an environment to be traversed by the leader vehicle and at least one location sensor operable to determine a location of the leader vehicle. A first communication system (204) is operable to transmit one or more signals between the leader vehicle and one or more follower vehicles in the convoy. The route of the leader vehicle is detected through the environment via the at least one location sensor. A map for at least a portion of the environment with the detected route is built based at least in part on the detected one or more features and the detected route. The first data indicative of the map and the detected route are transmitted to the one or more follower vehicles in the convoy via the first communication system. An INDEPENDENT CLAIM is included for a convoy. Convoy system for autonomous vehicles with leader vehicle e.g. manned leader vehicle. The method allows the leader vehicle and the autonomous follower vehicles in the convoy to share a common map, thus improving the efficiency of the convoy. The method enables the convoy leader module to be mounted on and/or integrated with a leader vehicle, so that the leader module can use the detected features and route to construct a map, which can be shared with the follower vehicles. The follower vehicles can have their own sensors that detect the features within the environment and can use the detected features to improve route following. The shared map can include information regarding an environmental aspect such as a slip condition, roadway features, area susceptible to dust generation, and the follower vehicle can implement remedial measures at or in advance of a location of that environmental aspect. The drawing shows a simplified schematic diagram of the manned vehicle with convoy leader module. 
200 Convoy leader module; 202 Sensor suite; 204 Communication system; 206 Control system; 208 Data storage system", "instruction": "Please summarize the input" }, { "input": "Vehicle-to-vehicle sensor verification using sensor fusion network. A vehicle-to-vehicle sensor verification using a sensor fusion network is described. The invention claims a system and method for sensor verification using a sensor fusion network. The sensor fusion network may include a plurality of sensors associated with one or more vehicles having autonomous or partially autonomous driving functions.|1. A vehicle sensor verification system, comprising: a coordination processor, the coordination processor operable for performing data communication with a plurality of vehicles; and a sensor fusion network, the sensor fusion network comprising a plurality of sensors, each sensor in data communication with the coordination processor, the sensor fusion network comprising at least a first sensor operable to generate first data and a second sensor operable to generate second data; wherein the first sensor is associated with a first vehicle of the plurality of vehicles; the first data indicates a first detection state of an object; and the second data indicates a second detection state of the object.\n | 2. The system according to claim 1, wherein the second sensor is associated with a second vehicle of the plurality of vehicles.\n | 3. The system according to claim 1, wherein the plurality of sensors further comprises a third sensor operable to generate third data indicative of a detected state of the object, and wherein the coordinating processor is operable to generate coordination data based on the first data, the second data, and the third data, the coordination data comprising a weighted detection state of the object.\n | 4. The system according to claim 3, wherein the coordinated data is generated based on the first data, the second data, and the third data using a majority voting algorithm.\n | 5.
The system according to claim 3, wherein the second sensor is associated with the first vehicle and the third sensor is associated with the second vehicle.\n | 6. The system according to claim 1, wherein the coordinating processor comprises a neural network operable to identify whether a potential trajectory of the first vehicle is free of obstacles.\n | 7. The system according to claim 1, wherein the coordination processor is operable to dynamically define the sensor fusion network as a subset of the plurality of sensors according to the proximity of each of the plurality of sensors to the first vehicle.\n | 8. The system according to claim 1, wherein the coordination processor is operable to detect a potential fault condition in the first sensor.\n | 9. The system according to claim 1, wherein each sensor of the plurality of sensors in the sensor fusion network has a specified accuracy, and the coordinating processor is operable to dynamically define the sensor fusion network as a subset of sensors based on the specified accuracy of each sensor.\n | 10. The system according to claim 9, wherein the coordination processor is operable to select a sensor included in the sensor fusion network based on a minimum specified accuracy.\n | 11. The system according to claim 1, wherein the first sensor comprises a sensor type selected from a group of sensor types, the group of sensor types comprising a radar sensor, a lidar sensor, a proximity sensor, a camera sensor, an infrared sensor, an ultraviolet sensor, an ultrasonic sensor, and a sound wave sensor.\n | 12. The system according to claim 11, wherein the second sensor comprises a sensor type different from the first sensor.\n | 13. The system according to claim 1, wherein the first data further comprises a first confidence value, and the second data further comprises a second confidence value.\n | 14.
A method for object verification by using a sensor fusion network, wherein the sensor fusion network comprises a plurality of sensors and at least a first sensor of the plurality of sensors is associated with a first vehicle, the method comprising: generating first object data based on a gap measurement of the first sensor, the first object data comprising a first object state and a first confidence value of the first object state; generating second object data based on a gap measurement of a second sensor of the plurality of sensors, the second object data comprising a second object state and a second confidence value of the second object state; and generating coordination verification data, the coordination verification data indicating a coordinated object state generated by using the first object data and the second object data.\n | 15. The method according to claim 14, wherein the second sensor is associated with a second vehicle.\n | 16. The method according to claim 15, further comprising generating third object data based on a gap measurement of a third sensor to a potential trajectory, the third object data including a third object state and a third confidence value of the third object state; and using the first object data, the second object data, and the third object data to generate the coordination verification data, wherein the third sensor is associated with a third vehicle.\n | 17. The method according to claim 14, wherein the coordination verification data is generated in response to a majority voting algorithm, the majority voting algorithm utilizing at least the first object data, the second object data, and the third object data as input.\n | 18. The method according to claim 17, wherein the majority voting algorithm uses weighting factors to generate trajectory feasibility data, the weighting factors being based on the first confidence value, the second confidence value, and the third confidence value.\n | 19.
The method according to claim 14, wherein the coordination verification data further indicates a coordinated confidence value associated with the coordinated object state.", "output": "The system has a coordination processor for performing data communication with multiple vehicles. A sensor fusion network is provided with multiple sensors. Each sensor is in data communication with the coordination processor. The sensor fusion network is provided with a first sensor and a second sensor operable to generate first data and second data, respectively. A neural network is operable to identify whether a potential trajectory of the vehicles is free of obstacles. An INDEPENDENT CLAIM is included for a method for performing object verification by using a sensor fusion network. Vehicle-to-vehicle sensor verification system. The system realizes autonomous or partially autonomous driving functions of the vehicles. The drawing shows a top view of a vehicle-to-vehicle sensor verification system.", "instruction": "Please summarize the input" }, { "input": "Method, device, and computer program for controlling stop of autonomous vehicle using speed profile. Provided are a method, a device, and a computer program for controlling stop of an autonomous vehicle using a speed profile. The method of controlling, by a computing device, stop of an autonomous vehicle using a speed profile includes obtaining surrounding information of an autonomous vehicle, determining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information, calculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile, and finalizing a driving plan for the autonomous vehicle on the basis of the calculated scores. What is claimed is: \n | 1.
A method of controlling, by a computing device, stop of an autonomous vehicle using a speed profile, the method comprising: \nobtaining surrounding information of an autonomous vehicle; \ndetermining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information; \ncalculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile; and \nfinalizing a driving plan for the autonomous vehicle on the basis of the calculated scores. \n | 2. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a first speed profile by applying the first speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0, a current acceleration is a_0, and a distance to the determined candidate stop location is s_target, the first speed profile increases or reduces a speed of the autonomous vehicle from v_0 to a preset target speed of v_target using the current acceleration of a_0 and a preset sectional acceleration profile, maintains the speed of the autonomous vehicle at v_target for a certain period from a time point at which the speed of the autonomous vehicle becomes v_target, and reduces the speed of the autonomous vehicle from v_target to zero using the preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location after the certain period, and\nwherein the certain period is set such that a distance traveled by the autonomous vehicle according to the first speed profile becomes s_target.\n | 3.
The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a second speed profile by applying the second speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0, a current acceleration is a_0, and a distance to the determined candidate stop location is s_target, the second speed profile increases or reduces a speed of the autonomous vehicle from v_0 to a preset target speed of v_target using the current acceleration of a_0 and a preset sectional acceleration profile, maintains the speed of the autonomous vehicle at v_target for a first period from a time point at which the speed of the autonomous vehicle becomes v_target, reduces the speed of the autonomous vehicle from v_target to v_tail using the preset sectional acceleration profile after the first period, maintains the speed of the autonomous vehicle at v_tail for a second period from a time at which the speed of the autonomous vehicle becomes v_tail, and reduces the speed of the autonomous vehicle from v_tail to zero using the preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location after the second period,\nwherein the first period is set such that a distance traveled by the autonomous vehicle according to the second speed profile becomes a difference between s_target and s_tail, and\nthe second period is set such that a distance traveled by the autonomous vehicle according to the second speed profile becomes s_tail.\n | 4.
The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a third speed profile by applying the third speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0, a current acceleration is a_0, and a distance to the determined candidate stop location is s_target, the third speed profile reduces a speed of the autonomous vehicle from v_0 to zero using the current acceleration of a_0, a target acceleration of a_decel of the autonomous vehicle, and a preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location, and\nwherein a_decel is set to a value such that a distance traveled by the autonomous vehicle according to the third speed profile becomes s_target.\n | 5. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and then stop at the determined candidate stop location according to a fourth speed profile by applying the fourth speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0, a current acceleration is a_0, and a distance to the determined candidate stop location is s_target, the fourth speed profile reduces a speed of the autonomous vehicle from v_0 to v_tail using the current acceleration of a_0, a target acceleration of a_decel of the autonomous vehicle, and a preset sectional acceleration profile, maintains the speed of the autonomous vehicle at v_tail for a certain period from a time point at which the speed of the autonomous vehicle becomes v_tail, and reduces the speed of the autonomous vehicle from v_tail to zero using the preset sectional acceleration profile and stops the autonomous vehicle at the determined candidate stop location after the certain period, and\nwherein the certain period is set such that a distance traveled by the autonomous vehicle from the time point at which the speed of the autonomous vehicle becomes v_tail becomes a difference value s_tail between s_target and a distance s_travel,ramp traveled by the autonomous vehicle until the speed of the autonomous vehicle reaches v_tail.\n | 6. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route according to a fifth speed profile by applying the fifth speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0 and a current acceleration is a_0, the fifth speed profile increases or reduces a speed of the autonomous vehicle from v_0 to v_target using the current acceleration of a_0 and a preset sectional acceleration profile and causes the autonomous vehicle to travel while maintaining the speed of the autonomous vehicle at v_target.\n | 7.
The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route according to a sixth speed profile by applying the sixth speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0, a current acceleration is a_0, and a distance to a location at which a preset target speed of v_target of the autonomous vehicle will be achieved is s_target, the sixth speed profile increases or reduces a speed of the autonomous vehicle from v_0 to v_target using the current acceleration of a_0, a target acceleration of a_adjust of the autonomous vehicle, and a preset sectional acceleration profile and causes the autonomous vehicle to travel while maintaining the speed of the autonomous vehicle at v_target, and\na_adjust is set such that a distance traveled by the autonomous vehicle until the speed of the autonomous vehicle reaches v_target becomes s_target.\n | 8. The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and stop according to a seventh speed profile by applying the seventh speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0 and a current acceleration is a_0, the seventh speed profile reduces a speed of the autonomous vehicle from v_0 to zero and stops the autonomous vehicle using the current acceleration of a_0, a target acceleration of a_target of the autonomous vehicle, and a preset sectional acceleration profile.\n | 9.
The method of claim 1, wherein the calculating of the scores comprises calculating the score for the candidate driving plan for the autonomous vehicle to travel the determined candidate route and stop according to an eighth speed profile by applying the eighth speed profile to the autonomous vehicle, \nwherein, when a current speed of the autonomous vehicle is v_0, the eighth speed profile reduces a speed of the autonomous vehicle from v_0 to zero and stops the autonomous vehicle using a preset acceleration of a_emergency, and\na_emergency is a value preset without considering a current acceleration of a_0 of the autonomous vehicle and a preset sectional acceleration profile.\n | 10. The method of claim 1, wherein the calculating of the scores comprises determining whether the determined candidate stop locations correspond to a preset no-stopping zone and correcting the scores calculated for the determined candidate stop locations according to a result of determining whether the determined candidate stop locations correspond to the preset no-stopping zone.\n | 11.
The method of claim 1, wherein the calculating of the scores comprises calculating the scores for the candidate driving plans for traveling the determined candidate routes using a processor included in the computing device, \nwherein, when there are a plurality of candidate driving plans for which scores will be calculated because there are a plurality of determined candidate routes or a plurality of candidate stop locations are determined on the determined candidate routes, the calculating of the scores comprises calculating the scores for the plurality of candidate driving plans for traveling the determined candidate routes using a plurality of different processors included in the computing device, the plurality of candidate driving plans including continuously traveling the plurality of candidate routes without stopping or traveling the plurality of candidate routes and then stopping at any one of the plurality of candidate stop locations determined on the plurality of candidate routes, and \nthe finalizing of the driving plan comprises collecting the scores calculated by the plurality of different processors and finalizing the candidate driving plan having the highest score as the driving plan for the autonomous vehicle. \n | 12. The method of claim 1, further comprising: \nreceiving a target stop location for the autonomous vehicle from a user; \ntransmitting information on the received target stop location to a server and receiving a control command, which is determined according to scores calculated for the target stop location and a driving plan including a driving method to the target stop location on the basis of the preset speed profile, from the server; and \ncontrolling the autonomous vehicle to stop at the target stop location according to the control command. \n | 13.
The method of claim 1, further comprising providing guide information of the finalized stop location, \nwherein the providing of the guide information comprises providing information on the finalized route, the finalized stop location, and the finalized driving plan through a display provided in the autonomous vehicle, providing the information on the finalized route, the finalized stop location, and the finalized driving plan to another vehicle adjacent to the autonomous vehicle through vehicle-to-vehicle communication, or displaying the finalized stop location on a road on which the autonomous vehicle is traveling through a location display module provided in the autonomous vehicle. \n | 14. A device for controlling stop of an autonomous vehicle using a speed profile, the device comprising: \na processor; \na network interface; \na memory; and \na computer program which is loaded into the memory and executed by the processor, \nwherein the computer program comprises: \nan instruction of obtaining surrounding information of an autonomous vehicle; \nan instruction of determining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information; \nan instruction of calculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile; \nan instruction of finalizing a driving plan for the autonomous vehicle on the basis of the calculated scores; and \nan instruction of determining candidate stop locations on the determined candidate routes, \nwherein the determining of the candidate stop locations comprises determining, as a candidate stop location, at least one of a location which is spaced a certain distance from a stop line on the determined candidate route, a location which is spaced a certain distance from a location at which an object present on the determined candidate route has stopped or is predicted to stop, and a location input by a
driver or a passenger of the autonomous vehicle, \nthe calculating of the scores comprises calculating scores for the determined candidate stop locations and the candidate driving plans including driving methods to the determined candidate stop locations, and \nthe finalizing of the driving plan comprises finalizing a route, a stop location, and a driving plan including a driving method to the stop location for the autonomous vehicle on the basis of the calculated score. \n | 15. A non-transitory computer-readable recording medium storing a computer program configured to be coupled to computer hardware, the program including instructions to execute operations of: \nobtaining surrounding information of an autonomous vehicle; \ndetermining candidate routes for controlling stop of the autonomous vehicle on the basis of the surrounding information; \ncalculating scores for candidate driving plans for the autonomous vehicle to travel the determined candidate routes according to a preset speed profile; \nfinalizing a driving plan for the autonomous vehicle on the basis of the calculated scores; and \ndetermining candidate stop locations on the determined candidate routes, \nwherein the determining of the candidate stop locations comprises determining, as a candidate stop location, at least one of a location which is spaced a certain distance from a stop line on the determined candidate route, a location which is spaced a certain distance from a location at which an object present on the determined candidate route has stopped or is predicted to stop, and a location input by a driver or a passenger of the autonomous vehicle, \nthe calculating of the scores comprises calculating scores for the determined candidate stop locations and the candidate driving plans including driving methods to the determined candidate stop locations, and \nthe finalizing of the driving plan comprises finalizing a route, a stop location, and a driving plan including a driving method to the stop location for the autonomous vehicle on the basis of the calculated score.", "output": "The method involves obtaining surrounding information of an autonomous vehicle (10). Candidate routes (31) are determined for controlling stop of the autonomous vehicle on the basis of the surrounding information. Scores are calculated for candidate driving plans for the vehicle (22) to travel the determined candidate routes according to a preset speed profile. A driving plan is finalized for the vehicle based on the calculated scores. Candidate stop locations (41, 42) are determined on the candidate routes. A route, a stop location, and a driving plan including a driving method to the stop location are finalized based on a calculated score by a computing device e.g. personal computer. INDEPENDENT CLAIMS are included for the following: a device for controlling stop of an autonomous vehicle using a speed profile; and a computer program. Method for controlling stop of autonomous vehicle using speed profile. The method enables preventing the autonomous vehicle from stopping at an inappropriate location e.g. on a crosswalk, in a no-stopping or parking zone, at a crossroad, or close to a fire hydrant. The drawing shows the diagram exemplifying candidate routes and candidate stop locations. 10 Autonomous vehicle; 22 Vehicle; 31 First candidate route; 41 First candidate stop location; 42 Second candidate stop location", "instruction": "Please summarize the input" }, { "input": "METHOD, APPARATUS AND COMPUTER PROGRAM FOR GENERATING SURROUNDING ENVIRONMENT INFORMATION FOR AUTOMATIC DRIVING CONTROL OF VEHICLE. Provided are a method, device, and computer program for generating surrounding environment information for autonomous driving control of a vehicle.
According to various embodiments of the present disclosure, a method for generating surrounding environment information for autonomous driving control of a vehicle is a method performed by a computing device, comprising the steps of: collecting first sensor data about the surrounding environment of a first vehicle; generating surrounding environment information about the first vehicle by using the collected first sensor data; and correcting the generated surrounding environment information by using second sensor data collected from a second vehicle located adjacent to the first vehicle.|1. A method performed by a computing device, comprising: collecting first sensor data relating to a surrounding environment of a first vehicle;\ngenerating surrounding environment information about the first vehicle by using the collected first sensor data; and correcting the generated surrounding environment information using second sensor data related to a surrounding environment of a second vehicle collected from the second vehicle located adjacent to the first vehicle, wherein the correcting of the generated surrounding environment information includes: setting a reference object using the collected first sensor data and the collected second sensor data; \ncalculating an error for the set reference object by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data; and correcting the collected second sensor data using the calculated error, and correcting the generated surrounding environment information using the corrected second sensor data. A method for generating surrounding environment information. \n | 2. The method of claim 1, wherein the correcting of the generated surrounding environment information using the corrected second sensor data comprises: dividing, using the collected first sensor data, the generated surrounding environment information into a shaded area including a location where the first sensor data is not collected and a non-shaded area including a location where the first sensor data is collected; and converting the shaded area into the non-shaded area by correcting the shaded area using the corrected second sensor data.\n | 3. The method of claim 2, wherein the converting of the shaded area into the non-shaded area comprises: correcting surrounding environment information of the second vehicle generated according to the collected second sensor data using the calculated error; and correcting the shaded area using the corrected surrounding environment information of the second vehicle.\n | 4. delete\n | 5. The method of claim 1, wherein the calculating of the error with respect to the set reference object comprises: synchronizing, using a time protocol, time information included in the collected first sensor data and time information included in the collected second sensor data; and comparing the information on the set reference object included in the first sensor data with which the time information is synchronized with the information on the set reference object included in the second sensor data with which the time information is synchronized to calculate an error for the set reference object. A method for generating surrounding environment information for autonomous driving control of a vehicle.\n | 6. The method of claim 1, wherein the calculating of the error with respect to the set reference object comprises, when there is a history of occurrence of an event for the second vehicle, based on the history of occurrence of the event:
collecting, from the second vehicle, information about a first time point when the event occurred;\ncalculating a time error between the collected first sensor data and the collected second sensor data by comparing the first time point with a second time point when the first vehicle detects an event generated from the second vehicle; and correcting time information included in the collected first sensor data and time information included in the collected second sensor data by using the calculated time error, and comparing information on the set reference object included in the corrected first sensor data with information on the set reference object included in the corrected second sensor data to calculate an error for the set reference object. A method for generating surrounding environment information. \n | 7. The method of claim 1, wherein the calculating of the error for the set reference object comprises, when two or more reference objects are set: calculating two or more position errors for each of the two or more set reference objects by comparing information on the set two or more reference objects included in the collected first sensor data with information on the set two or more reference objects included in the collected second sensor data; and determining a position error between the collected first sensor data and the collected second sensor data by optimizing the sum of the calculated two or more position errors to have a minimum value. A method for generating surrounding environment information for autonomous driving control of a vehicle.\n | 8.
The method of claim 2, wherein the converting of the shaded area to the non-shaded area comprises: correcting, when the surrounding environment information of the second vehicle includes dynamic object information, time information, location information, and direction information of a dynamic object included in the dynamic object information by using the calculated error, the calculated error including a time error, a position error, and a direction error for the object; and correcting the shaded area using the corrected dynamic object information. A method of generating surrounding environment information for autonomous driving control of a vehicle.\n | 9. The method of claim 1, wherein the setting of the reference object comprises: comparing position information about the set reference object included in the collected first sensor data with position information about the set reference object included in the collected second sensor data to calculate a position difference value; and determining whether a reference object set using the collected first sensor data and a reference object set using the second sensor data are the same object according to whether the calculated position difference value is within a predetermined value. A method for generating surrounding environment information for autonomous driving control of a vehicle.\n | 10. The method of claim 2, wherein the converting of the shaded area into the non-shaded area comprises, when two or more second sensor data are collected from two or more second vehicles adjacent to the first vehicle:
correcting the shaded area using each of the collected two or more second sensor data, but, when the two or more second sensor data collected at the location corresponding to the shaded area differ from one another, calculating an importance of the two or more second vehicles and correcting the shaded area using only the second sensor data collected from the second vehicle having the highest calculated importance.\n | 11. The method of claim 1, further comprising: collecting second sensor data about the surrounding environment of the second vehicle from the second vehicle by being directly connected to the second vehicle through V2V communication (Vehicle-to-Vehicle Communication), or connecting to a control server that collects sensor data on the surrounding environments of a plurality of vehicles and receiving second sensor data about the surrounding environment of the second vehicle from the control server. A method for generating surrounding environment information.\n | 12. The method of claim 1, wherein the generating of the surrounding environment information comprises: generating a grid map for a predetermined range based on the first vehicle, wherein the grid map includes a plurality of grids; and generating surrounding environment information including a non-shaded area including a grid on which the collected first sensor data is recorded and a shaded area including a grid on which the collected first sensor data is not recorded, by recording the collected first sensor data on a grid corresponding to a location where the collected first sensor data was collected, and wherein the correcting of the generated surrounding environment information using the corrected second sensor data comprises correcting second sensor data collected at a location corresponding to a grid included in the shaded area and recording the corrected second sensor data in the grid included in the shaded area. A method of generating surrounding environment information for controlling autonomous driving of a vehicle.\n | 13.
Processor;\nNetwork interface; \nMemory; and a computer program loaded into the memory and executed by the processor, wherein the computer program includes instructions for collecting first sensor data related to a surrounding environment of the first vehicle; \ninstructions for generating surrounding environment information about the first vehicle by using the collected first sensor data; and instructions for correcting the generated surrounding environment information using second sensor data related to the surrounding environment of the second vehicle collected from a second vehicle located adjacent to the first vehicle, wherein the generated surrounding environment information includes: The instructions for correcting information may include instructions for setting a reference object using the collected first sensor data and the collected second sensor data; \ninstructions for calculating an error for the set reference object by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data; and an instruction for correcting the collected second sensor data using the calculated error and correcting the generated surrounding environment information using the corrected second sensor data. A computing device that performs a method for generating surrounding environment information. \n | 14. 
coupled with the computing device, collecting first sensor data relating to the surrounding environment of the first vehicle;\ngenerating surrounding environment information about the first vehicle by using the collected first sensor data; and correcting the generated surrounding environment information using second sensor data related to a surrounding environment of the second vehicle collected from a second vehicle located adjacent to the first vehicle, wherein the generated surrounding environment information includes: Correcting the information may include setting a reference object using the collected first sensor data and the collected second sensor data; \ncalculating an error for the set reference object by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data; and correcting the collected second sensor data using the calculated error, and correcting the generated surrounding environment information using the corrected second sensor data. A computer program stored in a recording medium readable by a computing device in order to execute an environmental information generating method.", "output": "The method involves setting a reference object using the collected first sensor data and the collected second sensor data. An error for the set reference object is calculated by comparing information on the set reference object included in the collected first sensor data with information on the set reference object included in the collected second sensor data. The collected second sensor data is corrected using the calculated error, and the generated surrounding environment information is corrected using the corrected second sensor data. 
INDEPENDENT CLAIMS are included for the following: a computing device for generating surrounding environment information for autonomous driving control of a vehicle; and a computer program for generating surrounding environment information for autonomous driving control of a vehicle. Method for generating surrounding environment information for autonomous driving control of a vehicle. Safer self-driving control of the host vehicle is enabled because the correction removes areas that the host vehicle cannot perceive on its own. The drawing shows a flowchart illustrating the process for generating surrounding environment information for autonomous driving control of a vehicle. (Drawing includes non-English language text) S110Step for collecting first sensor data about the surrounding environment of the first vehicleS120Step for generating surrounding environment information about the first vehicleS130Step for using second vehicle data collected from the second vehicle adjacent to the first vehicle
vehicle group according to the target group strategy; so as to realize the remote distance (namely exceeds the vehicle communication distance) automatic driving vehicle information interaction through the RSU side and the way group, improves the high efficiency and safety of the automatic driving vehicle in the group.|1. A method for automatically driving vehicle in group queue, wherein it is applied to the RSU side; the RSU side is in communication connection with the vehicle side; wherein the vehicle side comprises: a plurality of automatic driving vehicle in the same driving direction, and the distance between any two of the automatic driving vehicle is not less than the vehicle communication distance, the method comprises: obtaining the information sent by the target automatic driving vehicle; Wherein, the information includes: target vehicle attribute information and team request information; periodically broadcasting the group request information to confirm at least one group automatic driving vehicle of the application group in the rest of the automatic driving vehicle; obtaining the team vehicle attribute information of the team automatic driving vehicle, and according to the target vehicle attribute information and the team vehicle attribute information for team planning, obtaining the target team strategy; sending the target group strategy to respectively target automatic driving vehicle and the group automatic driving vehicle, so that the target automatic driving vehicle and the group automatic driving vehicle group according to the target group strategy.\n | 2. 
The method according to claim 1, wherein the vehicle attribute information of each said automatic driving vehicle comprises: static information and dynamic information; Wherein, the static information includes: vehicle body parameter and vehicle engine power; the dynamic information comprises: real time position and real time speed; according to the target vehicle attribute information and the team vehicle attribute information of the team planning step, comprising: according to the target dynamic information of the target automatic driving vehicle and the team dynamic information of the team automatic driving vehicle, performing team planning according to the oil consumption as the target, obtaining the target team strategy; Wherein, the target group policy comprises: in the formation process, the target automatic driving vehicle corresponding to the first group speed, the group automatic driving vehicle corresponding to the second group speed, and the group finishing time.\n | 3. The method according to claim 2, wherein the step of sending the target team strategy to respectively target automatic driving vehicle and the team automatic driving vehicle comprises the following steps: sending the first group team speed and the group team finishing time to the target automatic driving vehicle, and sending the second group team speed and the group team finishing time to the group team automatic driving vehicle, so that the target automatic driving vehicle according to the first group team speed; the group automatic driving vehicle group according to the second group speed.\n | 4. 
The method according to claim 1, wherein the method further comprises: obtaining the team information sent by the target automatic driving vehicle; Wherein, the team information comprises at least one of the following: group ID information, the ID information of the pilot vehicle, the driving direction of the team, the current position of the team, each vehicle information in the team, the cruising speed of the team, the train space of the team, the member number of the team, the ID list of the team member, the length of the team and the driving route of the team; sending the team information to the team automatic driving vehicle.\n | 5. The method according to claim 1, wherein each of the automatic driving vehicle is further configured with a free cruise mode and a group cruise mode; the method further comprises: when monitoring the group is finished, generating mode switching instruction; the mode switching instruction respectively sent to the target automatic driving vehicle and the group automatic driving vehicle, so that the target automatic driving vehicle and the group automatic driving vehicle are switched from the free cruise mode to the group cruise mode according to the mode switching instruction.\n | 6. The method according to claim 1, wherein the group request information further carries with priority information; the method further comprises: obtaining the group request information set; wherein the group request information set comprises a plurality of the group request information of the same time, each of the group request information corresponding to different of the automatic driving vehicle; based on the priority information carried by the group request information, determining the target group request information.\n | 7. 
The method according to claim 1, wherein the step of confirming at least one group automatic driving vehicle of the application group in the rest of the automatic driving vehicles comprises: when monitoring the confirmation application team information, the other said automatic driving vehicle, the confirmation application group information corresponding to the automatic driving vehicle is confirmed as the team automatic driving vehicle.\n | 8. An automatic driving vehicle in the group device, wherein it is applied to the RSU side; the RSU side is connected with the vehicle side communication; wherein the vehicle side comprises: a plurality of automatic driving vehicle in the same driving direction, and the distance between any two of the automatic driving vehicle is not less than the vehicle communication distance, the device comprises: an obtaining module for obtaining the information sent by the target automatic driving vehicle; Wherein, the information includes: target vehicle attribute information and team request information; a broadcast module, for periodically broadcasting the team request information to confirm at least one group automatic driving vehicle of the application group in the rest of the automatic driving vehicle; planning module, for obtaining the team vehicle attribute information of the team automatic driving vehicle, and according to the target vehicle attribute information and the team vehicle attribute information for team planning, obtaining the target team strategy; a sending module, used for sending the target team strategy to respectively target automatic driving vehicle and the team automatic driving vehicle, so that the target automatic driving vehicle and the team automatic driving vehicle group according to the target team strategy.\n | 9. 
An electronic device, comprising a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the steps of the method according to any one of claims 1-7 when the computer program is executed.\n | 10. A computer readable storage medium, wherein the computer readable storage medium is stored with a computer program; when the computer program is run by the processor, executing the steps of the method according to any one of the preceding claims 1-7", "output": "The method involves obtaining information sent by a target automatic driving vehicle, where the information includes target vehicle attribute information and team request information. The group request information is periodically broadcasted. The team vehicle attribute information of the team automatic driving vehicles is obtained. A target team strategy is obtained by performing team planning according to the target vehicle attribute information and the team vehicle attribute information. The target team strategy is sent to the target automatic driving vehicle and the team automatic driving vehicles, so that they form a group according to the target team strategy. INDEPENDENT CLAIMS are included for: (1) a device for automatically driving vehicle in group queue in RSU side;(2) an electronic device comprising a processor and a memory to execute a set of instructions for performing a method for automatically driving vehicle in group queue in RSU side;(3) a computer readable storage medium for storing a set of instructions for performing a method for automatically driving vehicle in group queue in RSU side. Method for automatically driving vehicle in group queue on the roadside unit (RSU) side.
The method enables realizing long-distance (i.e., beyond the vehicle communication distance) automatic driving vehicle information interaction through the RSU side and en-route grouping, so as to improve the efficiency and safety of the automatic driving vehicles in the group. The method allows the target vehicle to perform team planning according to the target dynamic information and the team dynamic information, so that the target driving vehicle and the team driving vehicles form a team according to a target team strategy, improving the efficiency of the vehicles in the group. The drawing shows a flow diagram of a method for automatically driving vehicle in group queue in RSU side. (Drawing includes non-English language text).
A user equipment (UE) of a first autonomous vehicle (AV), comprising: \na network interface configured to perform vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications with an autonomous vehicle distributed network (AVDN); and \nat least one processor configured to: \nidentify a service condition based on a state of a service and connectivity to an instance of the service in an infrastructure network; \nestablish a connection in the AVDN to a second UE at a second AV in response to the service condition, using the V2V or V2X network communications, the AVDN further to provide connectivity between the UE and the second UE for use of the service; and \nperform a service operation with the service via the AVDN, using the connection to the second UE. cm 2. The UE of claim 1, wherein the AVDN is formed among a plurality of AVs, the AVDN connecting at least the UE of the first AV and the second UE of the second AV.\n | 3. The UE of claim 1, wherein the service operation performed by the UE includes providing a service request to the AVDN, wherein the first AV operates as a service requestor, and wherein the second AV operates as a service provider.\n | 4. The UE of claim 1, wherein the service operation performed by the UE includes fulfillment of a service request from the AVDN, wherein the first AV operates as a service provider, and wherein the second AV operates as a service requestor.\n | 5. The UE of claim 1, wherein the service condition is identified in response to a change of the state of the service, and wherein the change of the state of service is associated with one or more of: \navailability of data from the service; \navailability of a resource used by the service; \nunavailability of the instance of the service in the infrastructure network; or \na safety-related scenario involving the first AV, the second AV, or the service. \n | 6. 
The UE of claim 1, wherein the service condition is identified in response to the UE being located outside a coverage area of the infrastructure network, and wherein the infrastructure network is a wireless network operated from one or more fixed locations and operated in accordance with a standard from a 3rd Generation Partnership Project (3GPP) 5G, Intelligent Transport Systems (ITS)-G5, or Dedicated Short Range Communications (DSRC) family of standards.\n | 7. The UE of claim 1, wherein the service operation relates to: data sharing, decision sharing, or task computation sharing; and \nwherein the service operation provides fulfillment of an application operating at the first AV or the second AV. \n | 8. The UE of claim 1, wherein the service is provided by a Multi-Access Edge Computing (MEC) host, \nwherein the MEC host operates according to a standard from an European Telecommunications Standards Institute (ETSI) MEC standards family, and \nwherein (i) the UE operates as a MEC client and the second UE operates as the MEC host, or (ii) the UE operates as the MEC host and the second UE operates as a MEC client. \n | 9. The UE of claim 1, wherein the service operation is established using an Application Programming Interface (API) for the AVDN, the API for the AVDN providing a standardized interface to invoke the service operation between the UE and the second UE.\n | 10. The UE of claim 1, the at least one processor further configured to: \nperform authentication of the UE with an authentication server of the AVDN, wherein the connection with the AVDN is established in response to successful authentication. \n | 11. 
The UE of claim 1, wherein the UE is configured by the AVDN to operate as an anchor service provider, wherein the at least one processor is further configured to: \nperform a service request with a third UE of a third AV; \nobtain service response data, in response to the service request with the third UE; and \nprovide the service response data to the second UE. \n | 12. At least one non-transitory machine readable medium including instructions for coordinating service operations from a first user equipment (UE) of an autonomous vehicle (AV) with an autonomous vehicle distributed network (AVDN), wherein the instructions, when executed by processing circuitry, cause the processing circuitry to perform operations comprising: \nidentify a service condition, based on a state of a service and connectivity to an instance of the service in an infrastructure network; \nestablish a connection in the AVDN to a second UE at a second AV in response to the service condition, using vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications to the AVDN, the AVDN further to provide connectivity between the UE and the second UE for use of the service; and \nperform a service operation with the service via the AVDN, using the connection to the second UE. \n | 13. The non-transitory machine readable medium of claim 12, wherein the AVDN is formed among a plurality of AVs, the AVDN connecting at least the UE of the first AV and the second UE of the second AV.\n | 14. The non-transitory machine readable medium of claim 12, wherein the service operation performed by the UE includes providing a service request to the AVDN, wherein the first AV operates as a service requestor, and wherein the second AV operates as a service provider.\n | 15. 
The non-transitory machine readable medium of claim 12, wherein the service operation performed by the UE includes fulfillment of a service request from the AVDN, wherein the first AV operates as a service provider, and wherein the second AV operates as a service requestor.\n | 16. The non-transitory machine readable medium of claim 12, wherein the service condition is identified in response to a change of the state of the service, and wherein the change of the state of service is associated with one or more of: \navailability of data from the service; \navailability of a resource used by the service; \nunavailability of the instance of the service in the infrastructure network; or \na safety-related scenario involving the first AV, the second AV, or the service. \n | 17. The non-transitory machine readable medium of claim 12, wherein the service condition is identified in response to the UE being located outside a coverage area of the infrastructure network, and wherein the infrastructure network is a wireless network operated from one or more fixed locations and operated in accordance with a standard from a 3rd Generation Partnership Project (3GPP) 5G, Intelligent Transport Systems (ITS)-G5, or Dedicated Short Range Communications (DSRC) family of standards.\n | 18. The non-transitory machine readable medium of claim 12, wherein the service operation relates to: data sharing, decision sharing, or task computation sharing; and \nwherein the service operation provides fulfillment of an application operating at the first AV or the second AV. \n | 19. 
The non-transitory machine readable medium of claim 12, wherein the service is provided by a Multi-Access Edge Computing (MEC) host, \nwherein the MEC host operates according to a standard from an European Telecommunications Standards Institute (ETSI) MEC standards family, and \nwherein (i) the UE operates as a MEC client and the second UE operates as the MEC host, or (ii) the UE operates as the MEC host and the second UE operates as a MEC client. \n | 20. The non-transitory machine readable medium of claim 12, wherein the service operation is established using an Application Programming Interface (API) for the AVDN, the API for the AVDN providing a defined interface to invoke the service operation between the UE and the second UE.\n | 21. The non-transitory machine readable medium of claim 12, the instructions further to perform operations comprising: \nperforming authentication of the UE with an authentication server of the AVDN, wherein the connection with the AVDN is established in response to successful authentication. \n | 22. The non-transitory machine readable medium of claim 12, wherein the UE is configured by the AVDN to operate as an anchor service provider, the instructions further to perform operations comprising: \nperforming a service request with a third UE of a third AV; \nobtaining service response data, in response to the service request with the third UE; and \nproviding the service response data to the second UE. \n | 23. 
A system, comprising: \nat least one network communication device adapted to perform vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications; and \nat least one processing device that, when in operation, is configured by instructions to: \noperate the at least one network communication device to establish an autonomous vehicle distributed network (AVDN), using the V2V or V2X network communications; \nreceive a service request via the AVDN, the AVDN further to provide connectivity between the at least one processing device and at least one other device to operate at least one service; \nidentify a service condition based on the service request; and \nperform a service operation with the at least one service, via the AVDN, based on the identified service condition. \n | 24. The system of claim 23, wherein the at least one network communication device and the at least one processing device is included in a first autonomous vehicle (AV), wherein the service operation includes providing a service request to at least a second AV accessible via the AVDN, wherein the first AV operates as a service requestor, and wherein the second AV operates as a service provider.\n | 25. The system of claim 23, wherein the at least one network communication device and the at least one processing device is included in a first autonomous vehicle (AV), wherein the service operation includes fulfillment of a service request from at least a second AV accessible via the AVDN, wherein the first AV operates as a service provider, and wherein the second AV operates as a service requestor.", "output": "The user equipment has a network interface configured to perform vehicle-to-vehicle (V2V) or Vehicle-to-Everything (V2X) network communications with an autonomous vehicle distributed network (AVDN). 
A processor (1704) identifies a service condition based on a state of a service and connectivity to an instance of the service in an infrastructure network, and establishes a connection in the AVDN to a second UE at a second AV in response to the service condition using the V2V or V2X network communications. The AVDN provides connectivity between the first UE and the second UE for use of the service. The processor performs a service operation with the service through the AVDN using the connection to the second UE. INDEPENDENT CLAIMS are included for:(1) a non-transitory machine-readable medium including instructions for coordinating service operations from first UE of AV with AVDN;(2) a system for coordinating service operations from first UE of AV with AVDN. User equipment (UE) for a first autonomous vehicle (AV) e.g. car for coordinating service operations from first AV with autonomous vehicle distributed network (AVDN), in network settings such as multi-access edge computing (MEC) infrastructures and in multi-mobile network operator (MNO) scenarios. The method provides reduced latency, increased responsiveness, and more available computing power than offered in traditional cloud network services and wide area network connections. The method allows a cloud consumer to unilaterally provision computing capabilities such as server time and network storage, as needed automatically without requiring human interaction with a service's provider, so that the capabilities can be rapidly and elastically provisioned to quickly scale out and rapidly released with minimal management effort or interaction with the service provider.
The drawing shows a block diagram of a compute node system.1700Edge compute node 1702Compute circuitry 1704Processor 1706Memory 1710Data storage ", "instruction": "Please summarize the input" }, { "input": "CLIMATE BASED SELF- SPEED CONTROL SYSTEM IN CAR USING ARTIFICIAL INTELLIGENCEThis system introduces a paradigm shift in vehicular autonomy, integrating adaptive artificial intelligence to not only enable autonomous driving but also dynamically adjust vehicle speed based on real-time climate and environmental conditions. These sensors meticulously capture and feed real-time data on the vehicle's surroundings into a robust artificial intelligence framework. Unlike conventional systems, our solution employs cutting-edge machine learning algorithms to process and fuse this data, facilitating precise decision-making. The self-speed control system transcends the traditional boundaries of autonomous driving by actively responding to an array of environmental factors. It continuously monitors and adapts to factors such as traffic conditions, weather dynamics, temperature fluctuations, wind patterns. Air Quality Index (AQl), road infrastructure, and unforeseen obstacles. Crucially, the system operates with a paramount focus on safety, restraining speed until optimal environmental conditions are assured. This dynamic approach ensures not only safe but also smooth and efficient navigation, even in the most challenging and unpredictable environments. Through this innovative self-speed control system, we pave the way for a safer, more efficient transportation landscape.|1. The Climate based self-speed control system in car using Artificial lnteligence comprises: LiDAR sensor(1), Radar sensors(2), ultrasonic sensor(3), Steering system(4). Throttle and brakes(5), Battery(6), Cellular network(7),V2x Communication(8), GPS and inertial navigation systems(10).\n | 2. 
LiDAR (Light Detection and Ranging)(1): This sensor emits laser pulses and measures the reflected light to create a highly accurate 3D map of the surroundings. It's like having superpowered vision, able to see obstacles in darkness, fog, and even behind corners.\n | 3. Radar (Radio Detection and Ranging)(2): Similar to LiDAR(1), radar uses radio waves to detect objects and measure their distance and speed. It's a good backup for LiDAR(1), especially in bad weather conditions.\n | 4. Ultrasonic sensors(3): These sensors emit high-frequency sound waves to detect nearby objects, providing short-range obstacle detection, especially useful for parking and manoeuvring in tight spaces.\n | 5. Steering system(4): The car's steering wheel is controlled by electric motors or hydraulic actuators that turn the wheels based on the decisions made by the computer.\n | 6. Throttle and brakes(5): The car's speed is controlled by electronically controlled motors that adjust the throttle and apply the brakes as needed.\n | 7. Battery(6): Self-Speed control cars typically use large batteries to power all the on-board electronics and sensors. Some may have hybrid systems with an additional engine for range extension.\n | 8. Cellular network(7): Self-Speed control cars can connect to the cellular network to download maps, traffic updates, and communicate with other vehicles or infrastructure.\n | 9. V2X (Vehicle-to-Everything) communication(8): This technology allows cars to communicate directly with each other and with roadside infrastructure, further enhancing safety and traffic flow.\n | 10. GPS and inertial navigation systems(9): These provide precise location and direction information, even in areas with limited cellular coverage.", "output": "The system has a light detection and ranging (LiDAR) sensor for emitting laser pulses and measuring reflected light to create a highly accurate three-dimensional (3D) map of surroundings.
Ultrasonic sensors emit high-frequency sound waves to detect nearby objects for providing short-range obstacle detection. Electric motors or hydraulic actuators control the car's steering wheel and turn the wheels based on the decisions made by a computer. Electronically controlled motors control the car's speed and adjust a throttle. A battery powers all on-board electronics and sensors. INDEPENDENT CLAIMS are included for: (1) LiDAR (Light Detection and Ranging): This sensor emits laser pulses and measures the reflected light to create a highly accurate 3D map of the surroundings. It's like having superpowered vision; (2) radar (Radio Detection and Ranging): Similar to LiDAR; (3) ultrasonic sensors: These sensors emit high-frequency sound waves to detect nearby objects; (4) steering system: The car's steering wheel is controlled by electric motors or hydraulic actuators that turn the wheels based on the decisions made by the computer; (5) throttle and brakes: The car's speed is controlled by electronically controlled motors that adjust the throttle and apply the brakes as needed; (6) battery: Self-Speed control cars typically use large batteries to power all the on-board electronics and sensors. Some may have hybrid systems with an additional engine for range extension; (7) cellular network: Self-Speed control cars can connect to the cellular network to download maps; (8) V2X (Vehicle-to-Everything) communication: This technology allows cars to communicate directly with each other and with roadside infrastructure; (9) GPS and inertial navigation systems: These provide precise location and direction information. Climate based self-speed control system for autonomous vehicles i.e. self-driving cars. The system operates in a manner that restricts the driver from increasing the speed until the environment reaches a climate-neutral state, thus ensuring a safer and more efficient mode of transportation.
By harnessing cutting-edge technologies and their seamless integration, the system aims to redefine the capabilities of autonomous vehicles, specifically targeting their adaptability to varying climate conditions, ultimately enhancing road safety and optimizing transportation efficiency in diverse environmental settings. The system is capable of empowering vehicles to autonomously regulate their speed, navigate diverse and challenging environments, and dynamically adapt to various driving scenarios.", "instruction": "Please summarize the input" }, { "input": "TRANSMISSION CONTROL IN APPLICATION LAYER BASED ON RADIO BEARER QUALITY METRICS IN VEHICULAR COMMUNICATIONMethods, apparatuses, and computer-readable mediums for wireless communication are disclosed by the present disclosure. In an aspect, an application layer in a user equipment (UE) receives, from an access layer in the UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs. The application layer performs a transmission control over the vehicular communication based on the QoS indication.What is claimed is: \n | 1. A method of wireless communication, comprising: \nreceiving, by an application layer in a user equipment (UE), from an access layer in the UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and \nperforming, at the application layer, a transmission control over the vehicular communication based on the QoS indication. \n | 2. The method of claim 1, wherein the metric is indicative of a message reception performance as affected by a presence or an absence of message interference or collision in the one or more radio bearers.\n | 3. 
The method of claim 1, wherein the performing comprises adjusting a transmission rate of a unicast communication of the UE, according to the QoS indication.\n | 4. The method of claim 1, wherein the performing comprises adjusting a transmission range of a groupcast communication of the UE, according to the QoS indication.\n | 5. The method of claim 1, wherein the performing comprises adjusting a maneuver of the UE, according to the QoS indication.\n | 6. The method of claim 1, wherein the performing comprises adjusting an autonomous driving status of the UE, according to the QoS indication.\n | 7. The method of claim 1, further comprising sharing sensor data of the UE with a remote UE via a unicast communication at a first transmission rate.\n | 8. The method of claim 7, wherein the performing comprises: \ndetermining, by the application layer, based on the QoS indication, a second transmission rate supportable by the unicast communication; and \nadjusting the unicast communication according to the second transmission rate. \n | 9. The method of claim 8, wherein the adjusting comprises performing inter-transmission time (ITT) control at the UE.\n | 10. The method of claim 8, \nwherein the sharing comprises sharing video sensor data of the UE with the remote UE over the unicast communication; and \nwherein the adjusting comprises adjusting a video resolution of a video codec of the UE according to the second transmission rate supportable by the unicast communication. \n | 11. The method of claim 7, wherein the receiving comprises receiving a packet error rate (PER) related to the unicast communication with the remote UE.\n | 12. The method of claim 7, wherein the receiving comprises receiving a negative acknowledgement (NACK) statistic related to the unicast communication with the remote UE.\n | 13. 
The method of claim 1, further comprising: \ncommunicating, by the UE, with a plurality of other UEs via a groupcast communication; and \nwherein the receiving comprises receiving at least one of a packet error rate (PER) or a negative acknowledgement (NACK) statistic related to the groupcast communication with the plurality of other UEs. \n | 14. The method of claim 13, wherein the performing comprises: \ndetermining, based on the at least one of the PER or the NACK statistic, that a reachable range of the UE fails to comply with a minimum range requirement of a vehicular application configured for controlling a maneuver of the UE. \n | 15. The method of claim 14, wherein the performing further comprises cancelling the maneuver of the UE.\n | 16. The method of claim 14, wherein the performing further comprises postponing the maneuver of the UE.\n | 17. The method of claim 14, wherein the performing further comprises regenerating a driving strategy of the UE to match the reachable range.\n | 18. The method of claim 14, wherein the performing further comprises: \nmodifying a range of the UE according to the reachable range; and \nadjusting the maneuver of the UE based on the range. \n | 19. The method of claim 18, wherein the modifying comprises adjusting a radiated power of the UE.\n | 20. The method of claim 18, wherein adjusting the maneuver comprises slowing down the UE.\n | 21. The method of claim 18, wherein adjusting the maneuver comprises following a stop and go operation at the UE.\n | 22. The method of claim 18, wherein adjusting the maneuver comprises exiting an autonomous driving mode at the UE.\n | 23. The method of claim 14, wherein the maneuver comprises a coordinated intersection crossing.\n | 24. 
The method of claim 1, wherein the QoS indication comprises one or more of a Packet Error Rate (PER), a Packet Received Rate (PRR), an average number of retransmissions, an average PER, an average PRR, or an acknowledgement (ACK)/negative acknowledgement (NACK) statistic.\n | 25. The method of claim 1, wherein the QoS indication comprises a range statistic of a groupcast group.\n | 26. The method of claim 1, wherein the QoS indication comprises a supported bit rate for a radio bearer.\n | 27. The method of claim 1, wherein the vehicular communication comprises a new radio (NR) vehicle-to-everything (V2X) communication.\n | 28. A user equipment (UE) for wireless communication, comprising: \na memory storing instructions; and \na processor in communication with the memory, wherein the processor is configured to execute the instructions to: \nreceive, by an application layer in the UE, from an access layer in the UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and \nperform, at the application layer, a transmission control over the vehicular communication based on the QoS indication. \n | 29. The UE of claim 28, wherein the processor is further configured to execute the instructions to adjust a transmission rate of a unicast communication of the UE, according to the QoS indication.\n | 30. The UE of claim 28, wherein the processor is further configured to execute the instructions to adjust a transmission range of a groupcast communication of the UE, according to the QoS indication.", "output": "The method involves using the application layer (142) in a user equipment (UE) (148) to receive a QoS indication (144) from the access layer (146) in the UE. The QoS indication includes a metric that represents a quality of the radio bearers used for vehicular communication with other UEs (104,149). 
The application layer then performs transmission control over the vehicular communication based on the QoS indication. An INDEPENDENT CLAIM is also included for a UE used for wireless communication. Wireless communication method for use in vehicular communication systems. Can be used in transmission control in application layer based on radio bearer quality metrics in vehicular communication, including vehicle-to-vehicle (V2V) communication, vehicle-to-pedestrian (V2P) communication, vehicle-to-everything (V2X) communication, enhanced vehicle-to-everything (eV2X) communication, and cellular vehicle-to-everything (C-V2X) communication. Provides a wireless communication method that ensures improved autonomous driving, e.g., in self-driving vehicles operating with reduced or zero human input, and improved driving experience, e.g., improved non-autonomous human driving. The drawing shows a schematic diagram illustrating a wireless communication system and an access network. 104, 149 Other UEs; 142 Application layer; 144 QoS indication; 146 Access layer; 148 UE", "instruction": "Please summarize the input" }, { "input": "CONGESTION CONTROL FOR NR V2XIn one aspect, a method includes determining, by a user equipment (UE), a channel busy ratio (CBR) window for a CBR measurement for one or more resources; determining, by the UE, a CBR measurement value for the CBR window and for the one or more resources; determining, by the UE, a channel occupancy ratio (CR) window based on a first number of subframes used for a history of past transmissions and based on a second number of subframes used for future planned transmissions and corresponding retransmissions; and determining, by the UE, a CR value for the CR window based on subchannels used for the one or more resources for the first number of subframes and based on subchannels estimated for the one or more resources for the second number of subframes. 
In another aspect, a method includes determining a CR window based on a CBR measurement value.", "output": "The method involves determining (700) a channel busy ratio (CBR) window for a CBR measurement for resources by a user equipment (UE). A CBR measurement value for the CBR window and for the resources is determined (701) by the UE. A channel occupancy ratio (CR) window is determined based on a first number of sub-frames used for a history of past transmissions and based on a second number of sub-frames used for future planned transmissions and corresponding retransmissions by the UE. A CR value for the CR window is determined based on sub-channels used for the resources for the first number of sub-frames and based on sub-channels estimated for the resources for the second number of sub-frames by the UE. INDEPENDENT CLAIMS are included for the following:(1) an apparatus configured for wireless communication for supporting enhanced congestion control for vehicle-to-everything (V2X) in new radio (NR); and(2) a non-transitory computer-readable medium storing program for supporting enhanced congestion control for vehicle-to-everything (V2X) in new radio (NR). Method for supporting enhanced congestion control for vehicle-to-everything (V2X) in new radio (NR) of wireless communication. The method increases reliability and throughput, reduces latency, and enables operation in ultra-reliable low latency communications (URLLC) modes. The base stations take advantage of the higher dimension multiple input, multiple output (MIMO) capabilities to exploit three-dimensional beam-forming in both elevation and azimuth beam-forming to increase coverage and capacity. The enhanced congestion control operations enables more aperiodic communications to be transmitted, and thus increases throughput and reduce latency. The drawing shows a block diagram of the blocks executed by a UE configured. 
700 Step for determining a CBR window for a CBR measurement for resources; 701 Step for determining a CBR measurement value for the CBR window and for the resources; 702 Step for determining a CR window based on the CBR measurement value", "instruction": "Please summarize the input" }, { "input": "Automated control of headlight illumination by onboard vehicle-to-everything (V2X) deviceIn an aspect, a method of wireless communication performed by a vehicle-to-everything (V2X) device onboard a vehicle includes receiving one or more V2X safety messages indicating a potential safety condition related to illumination of an object; determining, in response to the one or more V2X safety messages, that the object is within or approaching a target area in which illumination of the object by headlights of the vehicle can be adjusted; and controlling an illumination intensity and/or an illumination pattern of the headlights of the vehicle in response to determining that the object is within the target area.What is claimed is: \n | 1. 
A method of wireless communication performed by a vehicle-to-everything (V2X) device onboard a vehicle, comprising: \nreceiving one or more V2X safety messages indicating a potential safety condition related to illumination of an object, wherein the one or more V2X safety messages indicate information relating to a location of the object; \ndetermining, in response to the information relating to the location of the object indicated by the one or more V2X safety messages, that the object is within or approaching a target area in which illumination of the object by headlights of the vehicle can be adjusted; \ndetermining one or more occluded regions of the target area based on topographical features identified by the V2X device; \ndetermining whether the object is within or outside of the one or more occluded regions based on the information relating to the location of the object indicated by the one or more V2X safety messages; and \ncontrolling an illumination intensity and/or an illumination pattern of the headlights of the vehicle in response to determining that the object is within the target area outside of the one or more occluded regions. \n | 2. The method of claim 1, further comprising: \ndetermining the target area based on a current illumination intensity and/or a current illumination pattern of the headlights of the vehicle. \n | 3. The method of claim 1, further comprising: \ndetermining the target area based on an illumination capability of the headlights of the vehicle. \n | 4. The method of claim 1, further comprising: \ndetermining the target area based on a current position of the vehicle and/or a projected future position of the vehicle using one or more vehicular sensors. \n | 5. The method of claim 1, wherein: \nthe one or more occluded regions of the target area are determined based on topographical features identified by the V2X using sensor data received from one or more vehicular sensors. \n | 6. 
The method of claim 5, wherein: \nthe one or more vehicular sensors include one or more light detection and ranging (LIDAR) sensors. \n | 7. The method of claim 5, wherein: \nthe one or more vehicular sensors include one or more radio detection and ranging (RADAR) sensors. \n | 8. The method of claim 5, wherein: \nthe one or more vehicular sensors include one or more image sensors. \n | 9. The method of claim 1, further comprising: \ndetecting ambient lighting conditions exterior to the vehicle; and \ndetermining the target area based on the ambient lighting conditions. \n | 10. The method of claim 1, wherein controlling the illumination intensity and/or the illumination pattern of the headlights comprises: controlling illumination intensities of one or more of a plurality of light-emitting elements of the headlights.\n | 11. The method of claim 10, wherein: \nthe illumination intensities of the one or more of the plurality of light-emitting elements of the headlights are adjusted to provide an illumination pattern that emits light at an increased intensity toward the object. \n | 12. The method of claim 11, wherein: \nthe illumination pattern is asymmetric between a left-side headlight illumination intensity and a right-side headlight illumination intensity. \n | 13. The method of claim 1, wherein: the potential safety condition is a condition in which the object should be illuminated to make the object visible to a driver of the vehicle; and the illumination intensity and/or the illumination pattern of the headlights are controlled to increase intensity of light beams emitted toward the object.\n | 14. The method of claim 1, wherein: the potential safety condition is a condition in which illumination of the object should be limited to reduce a likelihood of blinding an individual at the object; and the illumination intensity and/or the illumination pattern of the headlights are controlled to decrease intensity of light beams emitted toward the object.\n | 15. 
The method of claim 1, wherein: the potential safety condition is a condition in which the object should be illuminated to make the object visible to one or more image sensors used in an autonomous driving system of the vehicle; and the illumination intensity and/or the illumination pattern of the headlights are controlled to increase intensity of light beams emitted toward the object.\n | 16. The method of claim 1, wherein: the potential safety condition is a condition in which illumination of the object should be limited to reduce a likelihood of overexposing one or more image sensors at the object; and the illumination intensity and/or the illumination pattern of the headlights are controlled to decrease intensity of light beams emitted toward the object.\n | 17. The method of claim 1, wherein: the one or more V2X safety messages indicating the potential safety condition related to illumination of the object are received from a roadside unit (RSU).\n | 18. The method of claim 1, wherein: determining that the object is within the target area comprises determining a current position of the object and/or a projected future position of the object based on the information relating to the location of the object indicated by the one or more V2X safety messages and/or further V2X communications.\n | 19. The method of claim 18, wherein: \nthe current position of the object is based on the information relating to the location of the object indicated by the one or more V2X safety messages. \n | 20. The method of claim 19, wherein the projected future position of the object is based on the current position of the object and: \na speed of the object indicated in V2X communications relating to the object, \na heading of the object indicated in V2X communications relating to the object, \na projected path of the object indicated in V2X communications relating to the object, or \nany combination thereof. \n | 21. 
The method of claim 1, wherein: \nthe one or more occluded regions of the target area are determined based on the topographical features identified by the V2X device from map data. \n | 22. The method of claim 21, wherein: \nthe one or more occluded regions of the target area are determined based on road lanes used by the object as determined from the map data. \n | 23. The method of claim 21, wherein: \nthe map data includes V2X map data. \n | 24. The method of claim 21, wherein: \nthe map data includes local map data. \n | 25. A vehicle-to-everything (V2X) device onboard a vehicle, comprising: \na memory; \nat least one transceiver; and \nat least one processor communicatively coupled to the memory and the at least one transceiver, the at least one processor configured to: \nreceive, via the at least one transceiver, one or more V2X safety messages indicating a potential safety condition related to illumination of an object, wherein the one or more V2X safety messages indicate information relating to a location of the object; \ndetermine, in response to the information relating to the location of the object indicated by the one or more V2X safety messages, that the object is within or approaching a target area in which illumination of the object by headlights of the vehicle can be adjusted; \ndetermine one or more occluded regions of the target area based on topographical features identified by the V2X device; \ndetermining whether the object is within or outside of the one or more occluded regions based on the V2X safety information indicating the location of the object and control an illumination intensity and/or an illumination pattern of the headlights of the vehicle in response to determining that the object is within the target area. \n | 26. 
The V2X device of claim 25, wherein the at least one processor is further configured to: \ndetermine the target area based on a current illumination intensity and/or a current illumination pattern of the headlights of the vehicle. \n | 27. The V2X device of claim 25, wherein the at least one processor is further configured to: \ndetermine the target area based on an illumination capability of the headlights of the vehicle. \n | 28. The V2X device of claim 25, wherein the at least one processor is further configured to: \ndetermine the target area based on a current position of the vehicle and/or a projected future position of the vehicle using one or more vehicular sensors. \n | 29. The V2X device of claim 25, wherein: \nthe one or more occluded regions of the target area are determined based on topographical features identified by the V2X using sensor data received from one or more vehicular sensors. \n | 30. The V2X device of claim 29, wherein: \nthe one or more vehicular sensors include one or more light detection and ranging (LIDAR) sensors. \n | 31. The V2X device of claim 29, wherein: \nthe one or more vehicular sensors include one or more radio detection and ranging (RADAR) sensors. \n | 32. The V2X device of claim 29, wherein: \nthe one or more vehicular sensors include one or more image sensors. \n | 33. The V2X device of claim 25, wherein the at least one processor configured to control the illumination intensity and/or the illumination pattern of the headlights comprises the at least one processor configured to: \ncontrol illumination intensities of one or more of a plurality of light-emitting elements of the headlights. \n | 34. The V2X device of claim 33, wherein: \nthe illumination intensities of the one or more of the plurality of light-emitting elements of the headlights are controlled to provide an illumination pattern that emits light at an increased intensity toward the object. \n | 35. 
The V2X device of claim 33, wherein: \nthe illumination pattern is asymmetric between a left-side headlight illumination intensity and a right-side headlight illumination intensity. \n | 36. The V2X device of claim 25, wherein: \ndetermining that the object is within the target area comprises determining a current position of the object and/or a projected future position of the object based on the information relating to the location of the object indicated by the one or more V2X safety messages and/or further V2X communications. \n | 37. The V2X device of claim 36, wherein: \nthe current position of the object is based on the location information relating to the location of the object indicated by the one or more V2X safety messages. \n | 38. The V2X device of claim 25, wherein: \nthe one or more occluded regions of the target area are determined based on the topographical features identified by the V2X device from map data.", "output": "The method involves receiving vehicle-to-everything (V2X) safety messages indicating a potential safety condition related to illumination of an object (702). Occluded regions of a target area are determined based on topographical features identified by a V2X device. Determination is made (704) to check whether the object is within or outside of the regions based on information relating to a location of the object indicated by the safety messages. An illumination intensity and/or an illumination pattern of headlights of the vehicle are controlled (706) in response to determining that the objects are within the target area. An INDEPENDENT CLAIM is included for a V2X device for onboarding a vehicle. Method for performing wireless communication by a V2X device for onboarding of a vehicle. 
The method enables utilizing a V2X safety module to determine that the object is within or approaching the target area in which illumination of the object by headlights of the vehicle can be adjusted in response to the V2X safety messages, and to control the illumination intensity and/or the illumination pattern of the headlights of the vehicle based on the determined object in an efficient manner. The drawing shows a flow chart illustrating a method for performing wireless communication by a V2X device for onboarding of a vehicle. 702 Receiving V2X safety messages indicating potential safety condition related to illumination of object; 704 Determining whether object is within or outside of regions based on information relating to location of object indicated by safety messages; 706 Controlling illumination intensity and/or illumination pattern of headlights of vehicle in response to determining that objects are within target area", "instruction": "Please summarize the input" }, { "input": "LOW LATENCY ENHANCEMENTS TO CV2X AUTONOMOUS RESOURCE SELECTION AND RE-SELECTION PROCEDURE FOR VEHICLE-TO-VEHICLE COMMUNICATIONSLow latency enhancements for communication systems, including autonomous driving and/or selection scenarios, are provided. A method for communication includes monitoring communication resources in a communication system, determining a set of candidate resources to use for subsequent transmission of information within a time window such that the time window is minimized based on a desired communication latency parameter that considers at least one or more of communication channel congestion and a priority of transmission, determining a set of lowest energy resources from the set of candidate resources, selecting a low energy resource from the set of lowest energy resources, and transmitting data on the selected low energy resource. 
Other aspects, embodiments, and features are also claimed and described.", "output": "The method involves monitoring communication resources (922-946) in a communication system. A set of candidate resources (910) to use for subsequent transmission of information within a time window such that the time window is minimized based on a desired communication latency parameter that considers communication channel congestion and a priority of the intended transmission. A set of lowest energy resources is determined from the set of candidate resources. A low energy resource is selected from the set of lowest energy resources. Data on the selected low energy resource is transmitted. INDEPENDENT CLAIMS are also included for the following:an apparatus for facilitating communication between user equipment'sa communication devicea non-transitory computer-readable medium comprising a set of instructions for facilitating communication between vehicles. Method for facilitating communication i.e. vehicle-to-vehicle communication, between user equipment's e.g. vehicles such as automobiles, for a wireless communications system. Can also be used for cellular phones, smart phones, session initiation protocol (SIP) phones, laptops, personal digital assistants (PDAs), satellite radios, global positioning systems, multimedia devices, video devices, digital audio players i.e. MPEG-1 audio layer 3 (MP3) players, cameras, game consoles, tablets, smart devices and wearable devices for Code Division Multiple Access (CDMA) , Time Division Multiple Access (TDMA) , Frequency-Division Multiple Access (FDMA) , OFDMA , Single-Carrier Frequency-Division Multiple Access (SC-FDMA) systems. The method enables transmitting data streams to the single user equipment so as to increase data rate, and transmitting data streams to the user equipment and other user equipment's to increase system capacity. The drawing shows a schematic view of a communication frame structure. 
900 Data structure; 901 Trigger time for resource selection or reselection; 910 Candidate resources; 922-946 Communication resources", "instruction": "Please summarize the input" }, { "input": "Methods and apparatus for parking lot exit management using V2XAspects of the present disclosure include methods, apparatuses, and computer readable media for receiving a plurality of requests, from a plurality of user equipments (UEs), to exit a parking area comprising a plurality of vehicles, wherein each of the plurality of UEs is associated with a corresponding vehicle of the plurality of vehicles, determining an exit order for the plurality of vehicles to exit the parking area, and transmitting, to the plurality of UEs, a plurality of exit commands, based on the exit order, for the plurality of vehicles to exit the parking area.What is claimed is: \n | 1. A method of wireless communication by a road side unit in a network, comprising: \nreceiving a plurality of requests, from a plurality of user equipments (UEs), to exit a parking area comprising a plurality of vehicles, wherein each of the plurality of UEs is associated with a corresponding vehicle of the plurality of vehicles; \ndetermining an estimated exit duration to exit the parking area from a current location for an individual vehicle of the plurality of vehicles; \ndetermining, based on the estimated exit duration, an exit order for the plurality of vehicles to exit the parking area; \ntransmitting, to the plurality of UEs, a plurality of exit commands, based on the exit order, for the plurality of vehicles to exit the parking area; \ncollecting, via one or more sensors, sensor information corresponding to activity within the parking area, the collecting comprising: \nmonitoring the plurality of vehicles exiting the parking area; and \ndetecting at least one of an out-of-order exit, a collision, a pedestrian, or other road user; \ngenerating, based on the sensor information, one or more updated exit commands to supersede
the plurality of exit commands; and \ntransmitting the one or more updated exit commands to at least a subset of the plurality of UEs in response to detecting the at least one of the out-of-order exit, the collision, the pedestrian, or the other road user. \n | 2. The method of claim 1, wherein receiving the plurality of requests comprises: \nreceiving an emergency exit request from a first responder vehicle of the plurality of vehicles; and \nwherein determining the exit order comprises prioritizing the first responder vehicle in the exit order for the plurality of vehicles. \n | 3. The method of claim 1, wherein determining the exit order comprises: \ndetermining the exit order based on one or more of a reception order associated with receiving the plurality of requests, proximities of the plurality of vehicles to one or more exits of the parking area, sizes of the plurality of vehicles, maneuverabilities of the plurality of vehicles, estimated durations for the plurality of vehicles to exit the parking area, estimated fuel consumptions of the plurality of vehicles, or priorities associated with the plurality of requests. \n | 4. The method of claim 1, wherein transmitting the plurality of exit commands comprises: \nsequentially transmitting each of the plurality of exit commands based on a corresponding scheduled exit time of a plurality of scheduled exit times in accordance with the exit order. \n | 5. The method of claim 1, wherein transmitting the plurality of exit commands comprises: \ntransmitting a first exit command of the plurality of exit commands to a first vehicle of the plurality of vehicles scheduled to exit the parking area before remaining vehicles of the plurality of vehicles; and \ntransmitting, to the remaining vehicles, remaining exit commands of the plurality of exit commands each comprising identification information associated with a vehicle scheduled to exit the parking area immediately before each of the remaining vehicles. \n | 6. 
The method of claim 5, wherein: \nthe identification information includes at least one of a make, a model, a color, a license plate, a build, a location, or a vehicle type. \n | 7. The method of claim 1, wherein transmitting the plurality of exit commands comprises: \ntransmitting a plurality of scheduled exit times. \n | 8. The method of claim 1, wherein: \nthe plurality of exit commands comprises location information associated with one or more exits of the parking area. \n | 9. A road side unit, comprising: \na memory comprising instructions; \na transceiver; and \none or more processors operatively coupled with the memory and the transceiver, the one or more processors configured to execute the instructions in the memory to: \nreceive a plurality of requests, from a plurality of user equipments (UEs), to exit a parking area comprising a plurality of vehicles, wherein each of the plurality of UEs is associated with a corresponding vehicle of the plurality of vehicles; \ndetermine an estimated exit duration to exit the parking area from a current location for an individual vehicle of the plurality of vehicles; \ndetermine, based on the estimated exit duration, an exit order for the plurality of vehicles to exit the parking area; \ntransmit, to the plurality of UEs, a plurality of exit commands, based on the exit order, for the plurality of vehicles to exit the parking area; \ncollect, via one or more sensors, sensor information corresponding to activity within the parking area, wherein to collect, the one or more processors are configured: \nmonitor the plurality of vehicles exiting the parking area; and \ndetect at least one of an out-of-order exit, a collision, a pedestrian, or other road user; \ngenerate, based on the sensor information, one or more updated exit commands to supersede the plurality of exit commands; and \ntransmit the one or more updated exit commands to at least a subset of the plurality of UEs in response to detecting the at least one of the 
out-of-order exit, the collision, the pedestrian, or the other road user. \n | 10. The road side unit of claim 9, wherein receiving the plurality of requests comprises: \nreceive an emergency exit request from a first responder vehicle of the plurality of vehicles; and \nwherein determining the exit order comprises prioritizing the first responder vehicle in the exit order for the plurality of vehicles. \n | 11. The road side unit of claim 10, wherein determining the exit order comprises: \ndetermine the exit order based on one or more of a reception order associated with receiving the plurality of requests, proximities of the plurality of vehicles to one or more exits of the parking area, sizes of the plurality of vehicles, maneuverabilities of the plurality of vehicles, estimated durations for the plurality of vehicles to exit the parking area, estimated fuel consumptions of the plurality of vehicles, or priorities associated with the plurality of requests. \n | 12. The road side unit of claim 10, wherein transmitting the plurality of exit commands comprises: \nsequentially transmit each of the plurality of exit commands based on a corresponding scheduled exit time of a plurality of scheduled exit times in accordance with the exit order. \n | 13. The road side unit of claim 10, wherein transmitting the plurality of exit commands comprises: \ntransmit a first exit command of the plurality of exit commands to a first vehicle of the plurality of vehicles scheduled to exit the parking area before remaining vehicles of the plurality of vehicles; and \ntransmit to the remaining vehicles, remaining exit commands of the plurality of exit commands each comprising identification information associated with a vehicle scheduled to exit the parking area immediately before each of the remaining vehicles. \n | 14. 
The road side unit of claim 13, wherein: \nthe identification information includes at least one of a make, a model, a color, a license plate, a build, a location, or a vehicle type. \n | 15. The road side unit of claim 10, wherein transmitting the plurality of exit commands comprises: \ntransmit a plurality of scheduled exit times. \n | 16. The road side unit of claim 10, wherein: \nthe plurality of exit commands comprises location information associated with one or more exits of the parking area. \n | 17. A method of wireless communication by a user equipment (UE) associated with a vehicle in a network, comprising: \ntransmitting, to a road side unit (RSU), an exit request; and \nreceiving, from the RSU based on an estimated exit duration to exit a parking area from a current location, one or more exit commands comprising identification information associated with an other vehicle scheduled to exit the parking area immediately before the vehicle, and one or both of an indication for the vehicle to begin exiting the parking area or a scheduled exit time for the vehicle. \n | 18. The method of claim 17, wherein transmitting the exit request comprises: \ntransmitting an emergency exit request from a first responder vehicle; and \nwherein receiving the one or more exit commands comprises receiving a priority exit command to exit the parking area ahead of a plurality of vehicles. \n | 19. The method of claim 17, wherein: \nthe identification information includes at least one of a make, a model, a color, a license plate, a build, an identifying mark, or an accessory associated with the other vehicle. \n | 20. The method of claim 17, further comprising: \ndisplaying, via a graphical user interface, exit information based on one or more exit commands. \n | 21. The method of claim 17, further comprising: \ntransmitting, to an autonomous drive system, exit information based on the one or more exit commands. \n | 22.
A user equipment (UE) associated with a vehicle, comprising: \na memory comprising instructions; \na transceiver; and \none or more processors operatively coupled with the memory and the transceiver, the one or more processors configured to execute instructions in the memory to: \ntransmit, to a road side unit (RSU), an exit request; and \nreceive, from the RSU based on an estimated exit duration to exit a parking area from a current location, one or more exit commands comprising identification information associated with an other vehicle scheduled to exit the parking area immediately before the vehicle, and one or both of an indication for the vehicle to begin exiting the parking area or a scheduled exit time for the vehicle. \n | 23. The UE of claim 22, wherein transmitting the exit request comprises: \ntransmitting an emergency exit request from a first responder vehicle; and \nwherein receiving the one or more exit commands comprises receiving a priority exit command to exit the parking area ahead of a plurality of vehicles. \n | 24. The UE of claim 22, wherein: \nthe identification information includes at least one of a make, a model, a color, a license plate, a build, an identifying mark, or an accessory associated with the other vehicle. \n | 25. The UE of claim 22, wherein the one or more processors are further configured to: \ndisplay, via a graphical user interface, exit information based on the one or more exit commands. \n | 26. The UE of claim 22, wherein the one or more processors are further configured to: \ntransmit, to an autonomous drive system, exit information based on the one or more exit commands.", "output": "The method involves receiving multiple requests from multiple user equipments (UEs) to exit a parking area (S605), where the parking area comprises multiple vehicles, where each UE is associated with a corresponding vehicle. 
An exit order is determined (S610) for multiple vehicles to exit the parking area, where the exit order comprises prioritizing the responder vehicle in the exit order for the multiple vehicles. Multiple exit commands are transmitted (S615) to multiple vehicles to exit the parking area based on the exit order and estimated exit duration. An emergency exit request is received from a responder vehicle. Identification information of the vehicle is obtained, where the identification information includes a make, a model, a color, a number plate, a build, a location or a vehicle type. Multiple vehicles exiting the parking area are monitored. Processors are operatively coupled with a memory and a transceiver. INDEPENDENT CLAIMS are included for: (a) a roadside unit; (b) a method for establishing wireless communication by using a user equipment; and (c) a user equipment associated with a vehicle. Method for facilitating parking lot exit management by using a vehicle-to-everything (V2X) network. The method enables establishing the vehicle-to-everything (V2X) network to manage traffic and reduce congestion. The method enables utilizing a Roadside Unit (RSU) to transmit the exit commands to the User Equipments (UEs) to exit the parking area based on the exit order for the vehicles. The method enables performing clear channel assessment (CCA) to determine whether the channel is available. 
The drawing shows a flow diagram illustrating a method for performing parking lot exit management by using a vehicle-to-everything network. S605 Step for receiving multiple requests from multiple user equipments to exit a parking area S610 Step for determining exit order from multiple vehicles to exit the parking area S615 Step for transmitting multiple exit commands to multiple vehicles to exit the parking area based on the exit order and estimated exit duration ", "instruction": "Please summarize the input" }, { "input": "ENHANCING NAVIGATION EXPERIENCE USING V2X SUPPLEMENTAL INFORMATION\nEmbodiments of the disclosure are directed to the use of supplemental information received from Vehicle-to-Everything (V2X) capable entities in order to enhance navigation and route selection based on available advanced driver assistance systems (ADAS) functionality. A number of potential routes are evaluated by retrieving the V2X capabilities and locations from V2X capable entities along those routes. That information is used to assess traffic density and availability of supplemental information used by ADAS along each route, allowing for an evaluation of each route on travel time and ADAS support. The driver can then select the best route that supports their needs.|1. 
A method comprising: \n * obtaining (222) a destination address and a source address;\n * determining (224) a plurality of routes from the source address to the destination address;\n * for each route in the plurality of routes: \n * determining (226) an availability of Vehicle to Everything, V2X,-capable entities, capable of providing V2X information, along one or more portions of each respective route;\n * calculating a travel time estimate for each route in the plurality of routes based on the availability of V2X-capable entities along the one or more portions of each respective route;\n * generating (228) a navigation map for display in an in-vehicle display, the navigation map comprising each route of the plurality of routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route; and\n * causing (230) the navigation map to be displayed in the in-vehicle display;\n * wherein the method further comprises: \n * receiving a first user selection of a navigation route from the plurality of routes;\n * determining, based on a user event having a duration, that the navigation route will not provide a sufficient level/density of V2X-capable entities for the duration of the user event;\n * determining an alternative route, wherein the alternative route has a sufficient level/density of V2X-capable entities allowing for autonomous driving over the duration of the user event;\n * causing the navigation map in the in-vehicle display to show an indication of the alternative route;\n * receiving a second user selection of the alternative route;\n * calculating a travel time estimate for the alternative route based on a greater availability of V2X-capable entities along one or more portions of the alternative route;\n * determining an estimated change in travel time associated with the second user selection of the alternative route, the estimated change in travel time based at least on the calculated travel time 
estimate for the alternative route; and\n * causing the in-vehicle display to show the estimated change in travel time.\n | 2. The method of claim 1, further comprising: \n * selecting a navigation route from the plurality of routes based on the availability of V2X-capable entities along the one or more portions of the navigation route; or\n * receiving a third user selection of a navigation route from the plurality of routes; and\n * updating the navigation map in the in-vehicle display to show only the user selected navigation route.\n | 3. The method of claim 1, wherein calculating the travel time estimate for each route in the plurality of routes further comprises: \n * determining V2X-capabilities of the V2X-capable entities along each respective route; and preferably\n * the method further comprises: \n * updating the travel time estimate for each route in the plurality of routes based on the V2X-capabilities of V2X-capable entities along each respective route; and further preferably\n * the method further comprises: \n * ordering each route in the plurality of routes in an ordered list based on the travel time estimate for each respective route; and\n * causing the ordered list of the plurality of routes to be displayed in the in-vehicle display.\n | 4. 
The method of claim 1, further comprising: \n * for each route in the plurality of routes:\n receiving a V2X-capability and a location for each of the plurality of V2X-capable entities along the one or more portions of each respective route; and preferably \n * the method further comprises: \n * for each route in the plurality of routes:\n determining, from the V2X-capability and the location for each of the plurality of V2X-capable entities along the one or more portions of each respective route, an availability of assisted driving features along the one or more portions of each respective route; and further preferably \n * the method further comprises:\n updating the navigation map in the in-vehicle display to show availability of assisted driving features along the one or more portions of each route in the plurality of routes. \n | 5. The method of claim 1, wherein the plurality of V2X-capable entities includes Vehicle-to-Vehicle, V2V, capable vehicles or Vehicle-to-Infrastructure, V2I, capable infrastructure.\n | 6. 
A system comprising: \n * a vehicle (100) having an in-vehicle display (756) and an on-board navigation computer (716), the on-board navigation computer capable of receiving communication over Vehicle-to-Everything, V2X, communication; and\n * a navigation application executable by the on-board navigation computer to cause the on-board navigation computer to: \n * obtain a destination address and a source address;\n * determine a plurality of routes from the source address to the destination address;\n * for each route in the plurality of routes: \n * determine an availability of Vehicle to Everything, V2X,-capable entities, capable of providing V2X information, along one or more portions of each respective route;\n * calculate a travel time estimate for each route in the plurality of routes based on the availability of V2X-capable entities along the one or more portions of each respective route;\n * generate a navigation map for display in the in-vehicle display, the navigation map comprising each route of the plurality of routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route; and\n * cause the navigation map to be displayed in the in-vehicle display;\n * further causing the on-board navigation computer to: \n * receive a first user selection of a navigation route from the plurality of routes;\n * determine, based on a user event having a duration, that the navigation route will not provide a sufficient level/density of V2X-capable entities for the duration of the user event;\n * determine an alternative route, wherein the alternative route has a sufficient level/density of V2X-capable entities allowing for autonomous driving over the duration of the user event;\n * cause the navigation map in the in-vehicle display to show an indication of the alternative route;\n * receive a second user selection of the alternative route;\n * calculate a travel time estimate for the alternative route based on 
a greater availability of V2X-capable entities along one or more portions of the alternative route;\n * determine an estimated change in travel time associated with the second user selection of the alternative route, the estimated change in travel time based at least on the calculated travel time estimate for the alternative route; and\n * cause the in-vehicle display to show the estimated change in travel time.\n | 7. The system of claim 6, wherein the navigation application is executable by the on-board navigation computer to further cause the on-board navigation computer to: \n * select a navigation route from the plurality of routes based on the availability of V2X-capable entities along the one or more portions of the navigation route; or\n * receive a third user selection of a navigation route from the plurality of routes; and\n * update the navigation map in the in-vehicle display to show only the user selected navigation route.\n | 8. The system of claim 6, wherein the navigation application is executable by the on-board navigation computer to further cause the on-board navigation computer to: \n * determine V2X-capabilities of the V2X-capable entities along each respective route; and preferably\n * the on-board navigation computer is further caused to: \n * update the travel time estimate for each route in the plurality of routes based on V2X-capabilities of the V2X-capable entities along each respective route; and further preferably\n * the on-board navigation computer is further caused to: \n * order each route in the plurality of routes in an ordered list based on the travel time estimate for each respective route; and\n * cause the ordered list of the plurality of routes to be displayed in the in-vehicle display.\n | 9. 
The system of claim 6, wherein the navigation application is executable by the on-board navigation computer to further cause the on-board navigation computer to:\n for each route in the plurality of routes: \n receive a V2X-capability and a location for each of the plurality of V2X-capable entities along the one or more portions of each respective route; and preferably the on-board navigation computer is further caused to: \n for each route in the plurality of routes: \n determine, from the V2X-capability and the location for each of the plurality of V2X-capable entities along the one or more portions of each respective route, an availability of assisted driving features along the one or more portions of each respective route; and further preferably the on-board navigation computer is further caused to: \n update the navigation map in the in-vehicle display to show availability of assisted driving features along the one or more portions of each route in the plurality of routes. \n | 10. The system of claim 6, wherein the plurality of V2X-capable entities includes Vehicle-to-Vehicle, V2V, capable vehicles or Vehicle-to-Infrastructure, V2I, capable infrastructure.\n | 11. 
A non-transitory computer readable memory containing instructions executable by a processor to cause the processor to: \n * obtain a destination address and a source address;\n * determine a plurality of routes from the source address to the destination address;\n * for each route in the plurality of routes: \n * determine an availability of Vehicle to Everything, V2X,-capable entities, capable of providing V2X information, along one or more portions of each respective route;\n * calculate a travel time estimate for each route in the plurality of routes based on the availability of V2X-capable entities along the one or more portions of each respective route;\n * generate a navigation map for display in the in-vehicle display, the navigation map comprising each route in the plurality of routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route; and\n * cause the navigation map to be displayed in the in-vehicle display;\n * and further causing the processor to: \n * receive a first user selection of a navigation route from the plurality of routes;\n * determine, based on a user event having a duration, that the navigation route will not provide a sufficient level/density of V2X-capable entities for the duration of the user event;\n * determine an alternative route, wherein the alternative route has a sufficient level/density of V2X-capable entities allowing for autonomous driving over the duration of the user event;\n * cause the navigation map in the in-vehicle display to show an indication of the alternative route;\n * receive a second user selection of the alternative route;\n * calculate a travel time estimate for the alternative route based on a greater availability of V2X-capable entities along one or more portions of the alternative route;\n * determine an estimated change in travel time associated with the second user selection of the alternative route, the estimated change in 
travel time based at least on the calculated travel time estimate for the alternative route; and\n * cause the in-vehicle display to show the estimated change in travel time.", "output": "The method involves obtaining a destination address and a source address. Multiple routes are determined from the source address to the destination address. An availability of vehicle (100, 104)-to-everything (V2X)-capable entities, capable of providing V2X information, is determined along one or more portions of each respective route. A navigation map is generated for display in an in-vehicle display, where the navigation map includes each route of the multiple routes and an indication of the availability of V2X-capable entities along the one or more portions of each respective route. The navigation map is caused to be displayed in the in-vehicle display. INDEPENDENT CLAIMS are included for the following: a system for autonomous driving and advanced driver assistance systems; an apparatus for autonomous driving and advanced driver assistance systems; and a non-transitory computer readable memory for autonomous driving and advanced driver assistance systems. Method for autonomous driving and advanced driver assistance systems (ADAS). The supplemental information can also be used to generate suggestions to the driver and enable the driver to make better decisions. The drawing shows a schematic view of V2X-capable entities. 100,104Vehicle102Infrastructure106Power grid110Pedestrian", "instruction": "Please summarize the input" }, { "input": "WIRELESS COMMUNICATION APPARATUS AND METHOD IN WIRELESS DEVICES\nThe present invention relates to a method and apparatus for wireless communication at wireless devices, in particular a method and apparatus for collaborative early detection and threat mitigation in C-V2X. 
In one aspect, the apparatus detects a threat entity in a threat zone based on data signals received from the threat entity, wherein the threat entity interferes with wireless resources or spectrum used in autonomous driving and cooperative decisions. The transmitter transmits, to at least one second wireless device, a message indicating the threat entity in the threat zone.", "output": "The apparatus has a memory (360), a transceiver and a processor (359) communicatively connected to the memory and the transceiver. The processor is configured to detect a threat entity within a threat zone based on data signals received from the threat entity, where the threat entity obstructs wireless spectrum or resources utilized in cooperative or automated driving decisions, and to transmit, to a second wireless device, a message indicating the threat entity within the threat zone. The data signals received from the threat entity comprise data that is inconsistent with projected data for wireless devices. The data signals comprise data of a misbehaving wireless device. The data of the misbehaving wireless device comprises implausible data related to a characteristic of the misbehaving wireless device. INDEPENDENT CLAIMS are included for the following: a method for wireless communication of a first wireless device; an apparatus for wireless communication at a second wireless device; and a method for wireless communication of a second wireless device. Method for cooperative early threat detection and avoidance in cellular vehicle-to-everything (C-V2X). The method enables facilitating cooperative early threat detection and avoidance in cellular vehicle-to-everything (C-V2X) and/or D2D technology in an effective manner. The drawing shows a schematic view of a first device and a second device. 
310Wireless communication device359Processor360Memory370Receive processor374Channel estimator", "instruction": "Please summarize the input" }, { "input": "Methods and systems for managing interactions between vehicles with varying levels of autonomy\nMethods, devices and systems enable controlling an autonomous vehicle by identifying vehicles that are within a threshold distance of the autonomous vehicle, determining an autonomous capability metric of each of the identified vehicles, and adjusting a driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles. Adjusting a driving parameter may include adjusting one or more of a minimum separation distance, a minimum following distance, a speed parameter, or an acceleration rate parameter.\nWhat is claimed is: \n | 1. A method of controlling an autonomous vehicle, comprising: \ndetermining dynamically, via a processor of the autonomous vehicle, a threshold distance appropriate for current conditions; \nidentifying, via the processor of the autonomous vehicle, vehicles that are within the dynamically determined threshold distance of the autonomous vehicle; \ndetermining an autonomous capability metric of each of the identified vehicles; and \nadjusting a driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles. \n | 2. The method of claim 1, wherein determining the autonomous capability metric of each of the identified vehicles comprises determining a level of autonomy of each identified vehicle.\n | 3. The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each identified vehicle comprises: \nadjusting a minimum separation distance to be maintained between the autonomous vehicle and at least one vehicle of the identified vehicles. \n | 4. 
The method of claim 3, wherein adjusting the minimum separation distance to be maintained between the autonomous vehicle and the at least one vehicle of the identified vehicles comprises adjusting the minimum separation distance based on the autonomous capability metric of the at least one vehicle and a behavior model of the at least one vehicle.\n | 5. The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each identified vehicle comprises: \nadjusting a minimum following distance to be maintained between the autonomous vehicle and at least one vehicle of the identified vehicles. \n | 6. The method of claim 5, wherein adjusting the minimum following distance to be maintained between the autonomous vehicle and the at least one vehicle of the identified vehicles comprises adjusting the minimum following distance based on the autonomous capability metric of the at least one vehicle and a behavior model of the at least one vehicle.\n | 7. The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles comprises one or more of: \nadjusting a speed of the autonomous vehicle; or \nadjusting an acceleration rate at which the autonomous vehicle will change speed. \n | 8. The method of claim 7, wherein adjusting the speed of the autonomous vehicle or the acceleration rate at which the autonomous vehicle will change speed comprises adjusting the speed or the acceleration rate based on the autonomous capability metric of at least one vehicle of the identified vehicles and a behavior model of the at least one vehicle.\n | 9. The method of claim 1, wherein determining the autonomous capability metric of each of the identified vehicles comprises receiving the autonomous capability metric from at least one vehicle of the identified vehicles.\n | 10. 
The method of claim 1, wherein determining the autonomous capability metric of each of the identified vehicles comprises determining values that collectively identify or predict a level of autonomy or a performance capability of a nearby vehicle.\n | 11. The method of claim 10, wherein determining the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle comprises determining the values by one or more of: \nobserving driving behavior of the nearby vehicle; \ndetermining computing or sensor capability of the nearby vehicle; or \nreceiving information regarding the nearby vehicle's rating or certifications via C-V2X communications. \n | 12. The method of claim 11, further comprising determining at least one of the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the observed driving behavior, the determined at least one value representing one or more of: \na consistency, regularity or uniformity of vehicle operations; \na level of predictability for future vehicle operations; \na level of driver aggression; \na degree to which the nearby vehicle tracks a center of a driving lane; \nnumber of driving errors per unit time; \ncompliance with local road rules; \ncompliance with safety rules; \nreaction time of the autonomous vehicle; or \nresponsiveness of the autonomous vehicle to observable events. \n | 13. The method of claim 10, further comprising determining at least one of the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the determined sensor capability, the determined at least one value representing one of: \na sensor type; \na sensor make or model; \na sensor manufacturer; \nnumber of autonomous driving sensors operating in the nearby vehicle; \nsensor accuracy; or \nprecision of one or more sensors. \n | 14. 
The method of claim 10, further comprising determining one or more of the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on information received via C-V2X communications, the one or more values representing one or more of: \na key performance indicator (KPI); \na surface performance rating; \na weather performance rating; \na vehicle capability; \na vehicle feature; \na supported algorithm; or \na prediction and control strategy. \n | 15. A processor for a vehicle, wherein the processor is configured with processor executable instructions to: \ndetermine dynamically a threshold distance appropriate for current conditions; \nidentify vehicles that are within the dynamically determined threshold distance of the vehicle; \ndetermine an autonomous capability metric of each of the identified vehicles; and \nadjust a driving parameter based on the determined autonomous capability metric of each of the identified vehicles. \n | 16. The processor of claim 15, wherein the processor is further configured with processor executable instructions to determine the autonomous capability metric of each of the identified vehicles by determining a level of autonomy of each identified vehicle.\n | 17. The processor of claim 15, wherein the processor is further configured with processor executable instructions to adjust the driving parameter of the vehicle based on the determined autonomous capability metric of each identified vehicle by adjusting at least one of: \na minimum separation distance to be maintained between the vehicle and at least one vehicle of the identified vehicles; \na minimum following distance to be maintained between the vehicle and the at least one vehicle of the identified vehicles; \na speed of the vehicle; or \nan acceleration rate at which the vehicle will change speed. \n | 18. 
The processor of claim 17, wherein the processor is further configured with processor executable instructions to: \nadjust the minimum separation distance based on the autonomous capability metric of the at least one vehicle and a behavior model of the at least one vehicle; \nadjust the minimum following distance based on the autonomous capability metric of the at least one vehicle and the behavior model of the at least one vehicle; \nadjust the speed based on the autonomous capability metric of the at least one vehicle of the identified vehicles and the behavior model of the at least one vehicle; or \nadjust the acceleration rate based on the autonomous capability metric of the at least one vehicle of the identified vehicles and the behavior model of the at least one vehicle. \n | 19. The processor of claim 15, wherein the processor is further configured with processor executable instructions to determine the autonomous capability metric of each of the identified vehicles by receiving the autonomous capability metric from at least one vehicle of the identified vehicles.\n | 20. The processor of claim 15, wherein the processor is further configured with processor executable instructions to determine the autonomous capability metric of each of the identified vehicles by determining values that collectively identify or predict a level of autonomy or a performance capability of a nearby vehicle.\n | 21. The processor of claim 20, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle by determining the values by one or more of: \nobserving driving behavior of the nearby vehicle; \ndetermining computing or sensor capability of the nearby vehicle; or \nreceiving information regarding the nearby vehicle's rating or certifications via C-V2X communications. \n | 22. 
The processor of claim 21, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the observed driving behavior by determining a value representing one or more of: \na consistency, regularity or uniformity of vehicle operations; \na level of predictability for future vehicle operations; \na level of driver aggression; \na degree to which the nearby vehicle tracks a center of a driving lane; \nnumber of driving errors per unit time; \ncompliance with local road rules; \ncompliance with safety rules; \nreaction time of the vehicle; or \nresponsiveness of the vehicle to observable events. \n | 23. The processor of claim 21, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on the determined sensor capability by determining a value representing one or more of: \na sensor type; \na sensor make or model; \na sensor manufacturer; \nnumber of autonomous driving sensors operating in the nearby vehicle; \nsensor accuracy; or \nprecision of one or more sensors. \n | 24. The processor of claim 21, wherein the processor is further configured with processor executable instructions to determine the values that collectively identify or predict the level of autonomy or the performance capability of the nearby vehicle based on information received via C-V2X communications by determining a value representing one or more of: \na key performance indicator (KPI); \na surface performance rating; \na weather performance rating; \na vehicle capability; \na vehicle feature; \na supported algorithm; or \na prediction and control strategy. \n | 25. 
A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of an autonomous vehicle to perform operations comprising: \ndetermining dynamically a threshold distance appropriate for current conditions; \nidentifying vehicles that are within the dynamically determined threshold distance of the autonomous vehicle; \ndetermining an autonomous capability metric of each of the identified vehicles; and \nadjusting a driving parameter of the autonomous vehicle based on the determined autonomous capability metric of each of the identified vehicles. \n | 26. A vehicle, comprising: \nmeans for determining dynamically a threshold distance appropriate for current conditions; \nmeans for identifying vehicles that are within the dynamically determined threshold distance of the vehicle; \nmeans for determining an autonomous capability metric of each of the identified vehicles; and \nmeans for adjusting a driving parameter of the vehicle based on the determined autonomous capability metric of each of the identified vehicles. \n | 27. The vehicle of claim 26, wherein means for determining an autonomous capability metric of each of the identified vehicles comprises means for determining values that collectively identify or predict a level of autonomy or performance capability of a nearby vehicle based on one or more of: \nobserving driving behavior of the nearby vehicle; \ndetermining computing or sensor capability of the nearby vehicle; or \nreceiving information regarding the nearby vehicle's rating or certifications via C-V2X communications. \n | 28. 
The vehicle of claim 26, wherein means for determining an autonomous capability metric of each of the identified vehicles comprises means for determining one or more values that collectively identify or predict a level of autonomy or a performance capability of a nearby vehicle based on information received via C-V2X communications, the one or more values representing one or more of: \na key performance indicator (KPI); \na surface performance rating; \na weather performance rating; \na vehicle capability; \na vehicle feature; \na supported algorithm; or \na prediction and control strategy.", "output": "The method involves identifying (902) the vehicles that are within a threshold distance of an autonomous vehicle through a processor of the autonomous vehicle. An autonomous capability metric of each of the identified vehicles is determined (1104), in which the determining comprises determining a level of autonomy of each identified vehicle. A driving parameter of the autonomous vehicle is adjusted based on the determined autonomous capability metric of the identified vehicles. INDEPENDENT CLAIMS are included for the following: a processor; a non-transitory processor-readable storage medium storing a program for controlling an autonomous vehicle; and a vehicle. Method for controlling autonomous vehicle (claimed). The sensors enable the autonomous vehicle to operate safely with improved performance. The drawing shows a flow diagram illustrating a method for adjusting behavior and operations of an autonomous vehicle based on the determined capabilities of the other surrounding vehicles. 
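The adjustment loop claimed above (dynamically determine a threshold distance, identify vehicles within it, score each one's autonomous capability, adjust a driving parameter) can be sketched in Python. This is a minimal illustration, not the patent's implementation: the class names, the SAE-style 0-5 autonomy levels, and the linear trust margin policy are all invented assumptions.

```python
from dataclasses import dataclass

@dataclass
class NearbyVehicle:
    vehicle_id: str
    distance_m: float   # distance from the ego vehicle, meters
    autonomy_level: int  # assumed SAE-style level 0-5, reported or inferred

def adjust_following_distance(vehicles, threshold_m, base_gap_m=30.0):
    """Identify vehicles within the threshold distance, take the weakest
    autonomy level among them, and widen the following gap when a
    low-capability (less predictable) vehicle is nearby."""
    identified = [v for v in vehicles if v.distance_m <= threshold_m]
    if not identified:
        return base_gap_m
    # Lower autonomy -> less trust -> larger safety margin (illustrative policy).
    worst_level = min(v.autonomy_level for v in identified)
    margin_m = (5 - worst_level) * 5.0
    return base_gap_m + margin_m
```

A fully autonomous neighbor (level 5) adds no margin, while a level-0 vehicle inside the threshold widens the gap by 25 m under this made-up policy.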
902Step for identifying the vehicles that are within a threshold distance of an autonomous vehicle1012Step for controlling behavior of operation of vehicle1104Step for determining autonomous capability metric of each of the identified vehicles1108Step for adjusting driving parameter of the autonomous vehicle to be more trusting1114Step for adjusting driving parameter of the autonomous vehicle to be less trusting", "instruction": "Please summarize the input" }, { "input": "TRANSMISSION CONTROL IN APPLICATION LAYER BASED ON RADIO BEARER QUALITY METRICS IN VEHICULAR COMMUNICATIONMethods, apparatuses, and computer-readable mediums for wireless communication are disclosed by the present disclosure. In an aspect, an application layer in a host user equipment (UE) receives, from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs. The application layer performs a transmission control over the vehicular communication based on the QoS indication.|1. A method of wireless communication, comprising:\n receiving, by an application layer in a host user equipment (UE), from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and \n performing, at the application layer, a transmission control over the vehicular communication based on the QoS indication. \n | 2. The method of claim 1, wherein the metric is indicative of a message reception performance as affected by a presence or an absence of message interference or collision in the one or more radio bearers.\n | 3. 
The method of claim 1, wherein the performing comprises one or more of adjusting a transmission rate of a unicast communication of the host UE, a transmission range of a groupcast communication of the host UE, a maneuver of the host UE, or an autonomous driving status of the host UE, according to the QoS indication.\n | 4. The method of claim 1, further comprising sharing sensor data of the host UE with a remote UE via a unicast communication at a first transmission rate.\n | 5. The method of claim 4, wherein the performing comprises:\n determining, by the application layer, based on the QoS indication, a second transmission rate supportable by the unicast communication; and \n adjusting the unicast communication according to the second transmission rate. \n | 6. The method of claim 5, wherein the adjusting comprises performing inter-transmission time (ITT) control at the host UE.\n | 7. The method of claim 5,\n wherein the sharing comprises sharing video sensor data of the host UE with the remote UE over the unicast communication; and \n wherein the adjusting comprises adjusting a video resolution of a video codec of the host UE according to the second transmission rate supportable by the unicast communication. \n | 8. The method of claim 4, wherein the receiving comprises receiving at least one of a packet error rate (PER) or a negative acknowledgement (NACK) statistic related to the unicast communication with the remote UE.\n | 9. The method of claim 1, further comprising:\n communicating, by the host UE, with a plurality of other UEs via a groupcast communication; and \n wherein the receiving comprises receiving at least one of a packet error rate (PER) or a negative acknowledgement (NACK) statistic related to the groupcast communication with the plurality of other UEs. \n | 10. 
The method of claim 9, wherein the performing comprises:\n determining, based on the at least one of the PER or the NACK statistic, that a reachable range of the host UE fails to comply with a minimum range requirement of a vehicular application configured for controlling a maneuver of the host UE. \n | 11. The method of claim 10, wherein the performing further comprises cancelling or postponing the maneuver of the host UE.\n | 12. The method of claim 10, wherein the performing further comprises regenerating a driving strategy of the host UE to match the reachable range.\n | 13. The method of claim 10, wherein the performing further comprises:\n modifying a range of the host UE according to the reachable range; and \n adjusting the maneuver of the host UE based on the range. \n | 14. The method of claim 13, wherein the modifying comprises adjusting a radiated power of the host UE.\n | 15. The method of claim 13, wherein adjusting the maneuver comprises slowing down the host UE, following a stop and go operation at the host UE, or exiting an autonomous driving mode at the host UE.\n | 16. The method of claim 10, wherein the maneuver comprises a coordinated intersection crossing.\n | 17. The method of claim 1, wherein the QoS indication comprises one or more of a Packet Error Rate (PER), a Packet Received Rate (PRR), an average number of retransmissions, an average PER, an average PRR, an acknowledgement (ACK)/negative acknowledgement (NACK) statistic, a range statistic of a groupcast group, or a supported bit rate for a radio bearer.\n | 18. The method of claim 1, wherein the vehicular communication comprises a new radio (NR) vehicle-to-everything (V2X) communication.\n | 19. 
A non-transitory computer-readable medium storing instructions that when executed by a processor cause the processor to:\n receive, by an application layer in a host user equipment (UE), from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and \n perform, at the application layer, a transmission control over the vehicular communication based on the QoS indication. \n | 20. A host user equipment (UE) for wireless communication, comprising:\n a memory storing instructions; and \n a processor in communication with the memory, wherein the processor is configured to execute the instructions to: \n receive, by an application layer in the host UE, from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and \n perform, at the application layer, a transmission control over the vehicular communication based on the QoS indication. \n | 21. A host user equipment (UE) for wireless communication, comprising:\n means for receiving, by an application layer in the host UE, from an access layer in the host UE, a quality of service (QoS) indication comprising a metric that represents a quality of one or more radio bearers used for a vehicular communication with one or more other UEs; and \n means for performing, at the application layer, a transmission control over the vehicular communication based on the QoS indication.", "output": "The method involves receiving, by an application layer (142) in a host user equipment (UE), from an access layer (146) in the host UE, a quality of service (QoS) indication including a metric that represents a quality of multiple radio bearers used for a vehicular communication with the other UEs. 
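Claims 4-8 above describe the application layer receiving a PER statistic for a unicast bearer, determining a second transmission rate the bearer can support, and adjusting the sensor-sharing transmission accordingly. A minimal sketch of one such policy follows; the AIMD-style back-off, the 5% PER target, and the 10 Mbps cap are assumptions for illustration, not values from the disclosure.

```python
def adapt_unicast_rate(rate_mbps, per, per_target=0.05, cap_mbps=10.0):
    """Application-layer transmission control from a QoS indication: when the
    reported packet error rate (PER) exceeds the target, the unicast bearer
    cannot sustain the current sensor-sharing rate, so back off
    multiplicatively; otherwise probe upward additively, up to a codec cap."""
    if per > per_target:
        # Second, supportable transmission rate (never below a 1 Mbps floor).
        return max(rate_mbps / 2.0, 1.0)
    return min(rate_mbps + 1.0, cap_mbps)
```

The returned rate could then drive inter-transmission time (ITT) control or a video codec resolution change, as in claims 6-7.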
The transmission control over the vehicular communication is performed at the application layer based on the QoS indication. The inter-transmission time control is performed at the host UE. The range of the host UE is modified according to the reachable range. INDEPENDENT CLAIMS are included for the following: a non-transitory computer-readable medium storing instructions that when executed by a processor cause the processor to perform an operation for transmission control in application layer based on radio bearer quality metrics in vehicular communication, such as new radio vehicle-to-everything communication with vehicular communication system; and a host user equipment for wireless communication. Method for performing transmission control in application layer based on radio bearer quality metrics in vehicular communication, such as new radio vehicle-to-everything communication with vehicular communication system. Can also be used to provide various telecommunication services, such as telephony, video, data, messaging, and broadcasts. The quality of service indications may be used by the application layer to adapt the range for groupcast, thus allowing the application layer to adjust autonomous driving behavior. The drawing shows a schematic view of a wireless communications system and an access network. 100Wireless communications system110Coverage area132,134Backhaul links142Application layer146Access layer", "instruction": "Please summarize the input" }, { "input": "APPLICATION LAYER MESSAGES FOR LANE DESCRIPTION IN VEHICULAR COMMUNICATIONMethods, apparatuses, and computer-readable mediums for wireless communication are disclosed by the present disclosure. In an aspect, an application layer of a protocol layer stack of vehicular user equipment (UE) receives a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road. 
The vehicular UE may then implement autonomous driving functionality based on the application layer data element. In another aspect, an application layer of a protocol layer stack of a device generates a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road. The device may then transmit the vehicular communication message to a vehicular UE configured to implement autonomous driving functionality.|1. A method of wireless communication at a vehicular user equipment (UE), comprising:\n receiving, by an application layer of a protocol layer stack of the vehicular UE, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n implementing autonomous driving functionality based on the application layer data element. \n | 2. The method of claim 1, wherein the receiving comprises receiving the vehicular communication message from another vehicular UE, a network, an infrastructure, a road side unit (RSU), or a relay.\n | 3. The method of claim 1, wherein the vehicular communication message further includes one or more application layer data elements that indicate a lane width of the lane.\n | 4. The method of claim 1, wherein the vehicular communication message further includes one or more application layer data elements that indicate a longitude value, a latitude value, and an elevation value for each point in a list of spaced points positioning a center line of the lane.\n | 5. The method of claim 4, wherein a spacing between two consecutive points in the list of spaced points is a function of the curvature of the road.\n | 6. 
The method of claim 4, wherein at least one of the one or more application layer data elements indicates a differential value of a position, curvature, or slope of a point in the list of spaced points as compared to a neighboring point in the list of spaced points.\n | 7. The method of claim 4, wherein at least one of the one or more application layer data elements indicates a differential value of a position, curvature, or slope of a point in the list of spaced points as compared to a corresponding previous value of the position, curvature, or slope of the point in the list of spaced points.\n | 8. The method of claim 4, wherein the one or more application layer data elements indicate a plurality of curvatures or slopes, each associated with at least one point in the list of spaced points.\n | 9. The method of claim 1, wherein implementing the autonomous driving functionality comprises controlling a motion of the vehicular UE on the road.\n | 10. The method of claim 1, wherein implementing the autonomous driving functionality comprises implementing according to an advanced driver-assistance system (ADAS).\n | 11. The method of claim 1, wherein implementing the autonomous driving functionality comprises controlling a speed or an acceleration of the vehicular UE.\n | 12. The method of claim 1, wherein implementing the autonomous driving functionality comprises adjusting a detection range of a sensor used in an advanced driver-assistance system (ADAS).\n | 13. The method of claim 12, wherein the sensor comprises a camera, a radar, or a light detection and ranging (LIDAR) sensor.\n | 14. The method of claim 12, wherein adjusting the detection range of the sensor comprises adjusting a position or an angle of the sensor based on the curvature of the lane in the road.\n | 15. The method of claim 12, wherein adjusting the detection range of the sensor comprises adjusting a yaw angle of the sensor toward the curvature of the lane in the road.\n | 16. 
The method of claim 12, wherein the slope comprises a longitudinal slope, wherein adjusting the detection range of the sensor comprises adjusting a pitch angle of the sensor toward the longitudinal slope of the lane in the road.\n | 17. The method of claim 1, wherein implementing the autonomous driving functionality comprises determining a speed or acceleration limitation based on the curvature or the slope of the lane in the road.\n | 18. The method of claim 17, wherein implementing the autonomous driving functionality further comprises managing a safe turning of the vehicular UE by decelerating to an allowed maximum speed.\n | 19. The method of claim 17, wherein implementing the autonomous driving functionality comprises determining the speed or acceleration limitation based on a sharpness level of the curvature of the lane in the road.\n | 20. The method of claim 1, wherein the slope comprises a longitudinal slope, wherein implementing the autonomous driving functionality comprises determining an efficient acceleration value based on the longitudinal slope to manage an uphill motion of the vehicular UE.\n | 21. The method of claim 1, wherein the slope comprises a longitudinal slope or a transverse slope or both.\n | 22. The method of claim 1, wherein the vehicular communication message comprises a vehicle-to-everything (V2X) message.\n | 23. A non-transitory computer-readable medium storing instructions that when executed by a processor, cause the processor to:\n receive, by an application layer of a protocol layer stack of a vehicular user equipment (UE), a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n implement autonomous driving functionality based on the application layer data element. \n | 24. The non-transitory computer-readable medium of claim 23, wherein the processor is further configured to perform any of methods 2-22.\n | 25. 
A vehicular user equipment (UE) for wireless communication, comprising:\n a memory storing instructions; and \n a processor in communication with the memory, wherein the processor is configured to execute the instructions to: \n receive, by an application layer of a protocol layer stack of the vehicular UE, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n implement autonomous driving functionality based on the application layer data element. \n | 26. The vehicular UE of claim 25, wherein the processor is further configured to perform any of methods 2-22.\n | 27. A vehicular user equipment (UE) for wireless communication, comprising:\n means for receiving, by an application layer of a protocol layer stack of the vehicular UE, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n means for implementing autonomous driving functionality based on the application layer data element. \n | 28. The vehicular UE of claim 27, further comprising means for performing any of methods 2-22.\n | 29. A method of wireless communication, comprising:\n generating, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n transmitting the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality. \n | 30. 
A non-transitory computer-readable medium storing instructions that when executed by a processor, cause the processor to:\n generate, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n transmit the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality. \n | 31. A device, comprising:\n a memory storing instructions; and \n a processor in communication with the memory, wherein the processor is configured to execute the instructions to: \n generate, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n transmit the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality. \n | 32. A device, comprising:\n means for generating, by an application layer of a protocol layer stack of a device, a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road; and \n means for transmitting the vehicular communication message to a vehicular user equipment (UE) configured to implement autonomous driving functionality.", "output": "The method (900) involves receiving (902) a vehicular communication message including an application layer data element that directly indicates a curvature or a slope of a lane in a road by an application layer of a protocol layer stack of the vehicular UE. The autonomous driving functionality is implemented (904) based on the application layer data element. The vehicular communication message is received from another vehicular UE, a network, an infrastructure, a road side unit (RSU), or a relay. 
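Claims 14-16 above adjust a sensor's yaw toward the lane curvature and its pitch toward the longitudinal slope. One plausible geometric reading can be sketched as follows; the function name, the chord-angle model, and the 50 m lookahead are assumptions for illustration, not the disclosed design.

```python
import math

def aim_adas_sensor(curvature_per_m, slope_grade, lookahead_m=50.0):
    """Adjust a sensor's detection range from application layer lane data:
    yaw toward the lane curvature (chord angle to a point lookahead_m along
    an arc of the given signed curvature, in 1/m) and pitch toward the
    longitudinal slope (grade, e.g. 0.06 for a 6% uphill)."""
    half_chord = curvature_per_m * lookahead_m / 2.0
    yaw_rad = math.asin(max(-1.0, min(1.0, half_chord)))  # clamp for safety
    pitch_rad = math.atan(slope_grade)
    return math.degrees(yaw_rad), math.degrees(pitch_rad)
```

On a straight, flat lane both angles are zero; a right-hand curve of radius 100 m (curvature 0.01/m) yields a yaw offset of roughly 14.5 degrees at the assumed lookahead.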
The vehicular communication message is provided with several application layer data elements that indicate a lane width of the lane. The vehicular communication message is provided with application layer data elements that indicate a longitude value, a latitude value, and an elevation value for each point in a list of spaced points positioning a center line of the lane. INDEPENDENT CLAIMS are included for the following: a non-transitory computer-readable medium storing a program for wireless communication; a vehicular user equipment for wireless communication; and a device for wireless communication. Method for wireless communication at vehicular UE referred to as internet of things (IoT) devices such as parking meter, gas pump, toaster, vehicles, and heart monitor. The capacity of the access network is improved. The accuracy of estimations, which depends on the density of points, is improved. The system allows for improved driving assistance such as speed and acceleration control. The drawing shows a flowchart illustrating the method for wireless communication at vehicular UE. 900Method for wireless communication at vehicular UE902Step for receiving vehicular communication message including application layer data element904Step for implementing autonomous driving functionality based on application layer data", "instruction": "Please summarize the input" }, { "input": "ENFORCING RANGE RELIABILITY FOR INFORMATION SHARED VIA WIRELESS TRANSMISSIONSAn ego vehicle determines an intended maneuver and identifies a first set of agents for coordinating the intended maneuver. The ego vehicle also determines a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold. The ego vehicle further applies the determined spatial distance to a sensor-sharing message. The ego vehicle also transmits the sensor-sharing message to a second set of agents within the determined range. 
The ego vehicle performs the intended maneuver. What is claimed is: \n | 1. A method performed by an ego vehicle, comprising: \ndetermining an intended maneuver of the ego vehicle; \nidentifying a first set of agents for coordinating the intended maneuver; \ndetermining a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold; \napplying the determined spatial distance to a sensor-sharing message; \ntransmitting the sensor-sharing message to a second set of agents within the determined spatial distance; and \nperforming the intended maneuver. \n | 2. The method of claim 1, further comprising transmitting the sensor-sharing message via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.\n | 3. The method of claim 1, in which agents in the first set of agents and the second set of agents comprise at least one of a vehicle, an infrastructure component, a road side unit, a non-vehicular road user, or a combination thereof.\n | 4. The method of claim 3, further comprising receiving communications from an embedded vehicle-to-everything (V2X) device of the non-vehicular road user or a hand-held V2X device of the non-vehicular road user.\n | 5. The method of claim 1, further comprising determining the range based on at least one of the intended maneuver, a speed of the ego vehicle, a number of agents detected within a distance of the ego vehicle, a speed of at least one other agent, a direction of travel of at least one other agent, a road condition, a visibility level, a type of road, a quality of service (QoS), an automation level of the ego vehicle, a direction of travel of the ego vehicle, or a combination thereof.\n | 6. 
The method of claim 5, in which the distance is based on at least the intended maneuver, the road condition, the type of road, the speed of the ego vehicle or a combination thereof.\n | 7. The method of claim 5, in which: \nthe ego vehicle is capable of performing a plurality of maneuvers, and \neach maneuver corresponds to a different range. \n | 8. The method of claim 1, further comprising coordinating the intended maneuver with each agent of the first set of agents within the determined spatial distance.\n | 9. The method of claim 8, further comprising coordinating the intended maneuver via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I), or a combination thereof.\n | 10. The method of claim 1, in which the sensor sharing message identifies objects detected within a distance of the ego vehicle via at least one sensor integrated with the ego vehicle.\n | 11. The method of claim 10, in which the objects comprise at least one of non-V2X capable vehicles, non-vehicular road users, infrastructure, road obstacles, road impairments, or a combination thereof.\n | 12. The method of claim 1, further comprising: \ndetermining the range at an application-layer; and \nenforcing the range at a physical-layer. \n | 13. The method of claim 1, in which the ego vehicle comprises an autonomous vehicle or a semi-autonomous vehicle.\n | 14. 
An apparatus of an ego vehicle, comprising: \nmeans for determining an intended maneuver of the ego vehicle; \nmeans for identifying a first set of agents for coordinating the intended maneuver; \nmeans for determining a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold; \nmeans for applying the determined spatial distance to a sensor-sharing message; \nmeans for transmitting the sensor-sharing message to a second set of agents within the determined spatial distance; and \nmeans for performing the intended maneuver. \n | 15. The apparatus of claim 14, further comprising means for transmitting the sensor-sharing message via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.\n | 16. The apparatus of claim 14, in which agents in the first set of agents and the second set of agents comprise at least one of a vehicle, an infrastructure component, a road side unit, a non-vehicular road user, or a combination thereof.\n | 17. The apparatus of claim 16, further comprising means for receiving communications from an embedded vehicle-to-everything (V2X) device of the non-vehicular road user or a hand-held V2X device of the non-vehicular road user.\n | 18. The apparatus of claim 14, further comprising means for determining the range based on at least one of the intended maneuver, a speed of the ego vehicle, a number of agents detected within a distance of the ego vehicle, a speed of at least one other agent, a direction of travel of at least one other agent, a road condition, a visibility level, a type of road, a quality of service (QoS), an automation level of the ego vehicle, a direction of travel of the ego vehicle, or a combination thereof.\n | 19. 
The apparatus of claim 18, in which the distance is based on at least the intended maneuver, the road condition, the type of road, the speed of the ego vehicle or a combination thereof.\n | 20. The apparatus of claim 18, in which: \nthe ego vehicle is capable of performing a plurality of maneuvers, and \neach maneuver corresponds to a different range. \n | 21. The apparatus of claim 14, further comprising means for coordinating the intended maneuver with each agent of the first set of agents within the determined spatial distance.\n | 22. The apparatus of claim 21, further comprising means for coordinating the intended maneuver via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I), or a combination thereof.\n | 23. The apparatus of claim 14, in which the sensor sharing message identifies objects detected within a distance of the ego vehicle via at least one sensor integrated with the ego vehicle.\n | 24. The apparatus of claim 23, in which the objects comprise at least one of non-V2X capable vehicles, non-vehicular road users, infrastructure, road obstacles, road impairments, or a combination thereof.\n | 25. The apparatus of claim 14, further comprising: \nmeans for determining the range at an application-layer; and \nmeans for enforcing the range at a physical-layer. \n | 26. The apparatus of claim 14, in which the ego vehicle comprises an autonomous vehicle or a semi-autonomous vehicle.\n | 27. 
An ego vehicle, comprising: \na processor; \na memory coupled with the processor; and \ninstructions stored in the memory and operable, when executed by the processor, to cause the ego vehicle: \nto determine an intended maneuver; \nto identify a first set of agents for coordinating the intended maneuver; \nto determine a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold; \nto apply the determined spatial distance to a sensor-sharing message; \nto transmit the sensor-sharing message to a second set of agents within the determined spatial distance; and \nto perform the intended maneuver. \n | 28. The ego vehicle of claim 27, in which the instructions further cause the ego vehicle to transmit the sensor-sharing message via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.\n | 29. The ego vehicle of claim 27, in which agents in the first set of agents and the second set of agents comprise at least one of a vehicle, an infrastructure component, a road side unit, a non-vehicular road user, or a combination thereof.\n | 30. The ego vehicle of claim 29, in which the instructions further cause the ego vehicle to receive communications from an embedded vehicle-to-everything (V2X) device of the non-vehicular road user or a hand-held V2X device of the non-vehicular road user.\n | 31. 
The ego vehicle of claim 27, in which the instructions further cause the ego vehicle to determine the range based on at least one of the intended maneuver, a speed of the ego vehicle, a number of agents detected within a distance of the ego vehicle, a speed of at least one other agent, a direction of travel of at least one other agent, a road condition, a visibility level, a type of road, a quality of service (QoS), an automation level of the ego vehicle, a direction of travel of the ego vehicle, or a combination thereof.\n | 32. The ego vehicle of claim 31, in which the distance is based on at least the intended maneuver, the road condition, the type of road, the speed of the ego vehicle, or a combination thereof.\n | 33. The ego vehicle of claim 31, in which: \nthe ego vehicle is capable of performing a plurality of maneuvers, and \neach maneuver corresponds to a different range. \n | 34. The ego vehicle of claim 27, in which the instructions further cause the ego vehicle to coordinate the intended maneuver with each agent of the first set of agents within the determined spatial distance.\n | 35. The ego vehicle of claim 34, in which the instructions further cause the ego vehicle to coordinate the intended maneuver via at least one of a vehicle-to-everything (V2X) transmission, a vehicle-to-vehicle (V2V) transmission, a vehicle-to-infrastructure (V2I) transmission, or a combination thereof.\n | 36. The ego vehicle of claim 27, in which the sensor sharing message identifies objects detected within a distance of the ego vehicle via at least one sensor integrated with the ego vehicle.\n | 37. The ego vehicle of claim 36, in which the objects comprise at least one of non-V2X capable vehicles, non-vehicular road users, infrastructure, road obstacles, road impairments, or a combination thereof.\n | 38. The ego vehicle of claim 27, in which the instructions further cause the ego vehicle: \nto determine the range at an application-layer; and \nto enforce the range at a physical-layer. 
\n | 39. The ego vehicle of claim 27, in which the ego vehicle comprises an autonomous vehicle or a semi-autonomous vehicle.\n | 40. A non-transitory computer-readable medium having program code recorded thereon, the program code executed by a processor and comprising: \nprogram code to determine an intended maneuver of an ego vehicle; \nprogram code to identify a first set of agents for coordinating the intended maneuver; \nprogram code to determine a spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold; \nprogram code to apply the determined spatial distance to a sensor-sharing message; \nprogram code to transmit the sensor-sharing message to a second set of agents within the determined spatial distance; and \nprogram code to perform the intended maneuver.", "output": "The method involves determining (12) an intended maneuver of an ego vehicle, and identifying a set of agents for coordinating the intended maneuver. A spatial distance is determined (13) for obtaining a level of communication reliability with the agents that is greater than a communication reliability threshold. The determined spatial distance is applied to a sensor-sharing message, and the message is transmitted to another set of the agents within the determined distance. The intended maneuver is performed (17), and a range is determined at an application-layer. The range is enforced at a physical-layer, where the vehicle is an autonomous vehicle or a semi-autonomous vehicle. INDEPENDENT CLAIMS are included for the following: an apparatus of an ego vehicle; an ego vehicle; and a non-transitory computer-readable medium storing a program for applying a spatial distance to sensor-sharing messages. Method for applying a spatial distance to sensor-sharing messages, used for enforcing range reliability for information shared through wireless transmissions by an ego vehicle. Uses include but are not limited to telephony, video, data, messaging, and broadcasts. 
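The summarized method picks a spatial distance per intended maneuver and driving context, then addresses the sensor-sharing message only to agents within that distance. A minimal sketch of that selection, assuming illustrative maneuver names, base ranges, and scaling factors that are not taken from the patent:

```python
import math

# Hypothetical base ranges (meters) per maneuver; the patent only states that
# each maneuver can correspond to a different range.
BASE_RANGE_M = {"lane_change": 150.0, "merge": 250.0, "left_turn": 100.0}

def sensor_sharing_range(maneuver: str, speed_mps: float, road_condition: str = "dry") -> float:
    """Return a spatial distance (m) to apply to a sensor-sharing message."""
    rng = BASE_RANGE_M.get(maneuver, 100.0)
    # Faster travel needs earlier coordination, so scale with speed (assumed factor).
    rng += 3.0 * speed_mps
    # Degraded road conditions extend the coordination range (assumed factor).
    if road_condition in ("wet", "ice"):
        rng *= 1.5
    return rng

def recipients(agents: dict, ego_pos: tuple, rng: float) -> list:
    """Only agents within the determined spatial distance receive the message."""
    return [a for a, (x, y) in agents.items()
            if math.hypot(x - ego_pos[0], y - ego_pos[1]) <= rng]
```

Per the claims, a range like this would be determined at the application layer and enforced at the physical layer; here the enforcement is only mimicked by filtering recipients.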
The method enables determining the spatial distance for obtaining a level of communication reliability with the set of agents that is greater than a communication reliability threshold in an efficient manner. The method allows the ego vehicle to transmit the sensor-sharing message to the agents within the determined spatial distance, so that the vehicle can perform the intended maneuver, thus increasing safety and preventing collisions of the vehicles. The drawing shows a flow diagram of a method for applying a spatial distance to sensor-sharing messages. 12 Step for determining an intended maneuver of an ego vehicle; 13 Step for determining spatial distance; 14 Step for identifying objects; 15 Step for sharing sensor information; 16 Step for coordinating the intended maneuver", "instruction": "Please summarize the input" }, { "input": "ARCHITECTURE AND PROTOCOL LAYERING FOR SIDELINK POSITIONINGIn some implementations, a user equipment (UE) may implement a ranging support protocol layer comprising one or more ranging support elements. The UE may communicate, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communicating is conducted via at least one lower protocol layer implemented at the UE. The UE may provide, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service based at least in part on the communicating.|1. 
A method for supporting sidelink (SL) positioning, the method being performed by a user equipment (UE) and comprising: \nimplementing, at the UE, a ranging support protocol layer including one or more ranging support elements; \ncommunicating, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communicating is conducted via at least one lower protocol layer implemented at the UE; and \nproviding, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communicating. \n | 2. The method of claim 1, wherein the one or more ranging support elements include a discovery function, and the positioning service includes: \ninformation including a unique identifier of another UE, among the one or more other UEs, that is capable of participating in sidelink positioning and ranging services, \ninformation including an indication of a service supported by another of the one or more other UEs, \na sidelink communication channel with another of the one or more other UEs, \na sidelink communication period with another of the one or more other UEs, or \nany combination thereof. \n | 3. The method of claim 2, further comprising: using the discovery function to receive information from the upper protocol layer, wherein the information includes: \na trigger for discovering UEs participating in sidelink positioning and ranging services, \nattributes of the UEs to be discovered, \na permission for discovery by another of the one or more other UEs and corresponding attributes of the other UE, \na request or permission for sidelink positioning and ranging services, or \nany combination thereof. \n | 4. 
The method of claim 1, wherein: \nthe one or more ranging support elements include a group support function; and \nproviding the positioning service to the upper protocol layer includes using the group support function to: \nwhen the upper protocol layer specifies a sidelink positioning and ranging service group, establish the sidelink positioning and ranging service group with two or more of the one or more other UEs, \nprovide a group ID and group-local member IDs to the upper protocol layer, \nmanage the addition or removal of group members, \nsplit or merge groups, \nmonitor group membership status, or \nany combination thereof. \n | 5. The method of claim 4, further comprising: using the group support function to receive information from the upper protocol layer, wherein the information includes: \na request to establish the sidelink positioning and ranging service group, \na request to add or remove a specific group member UE, \na management request for the sidelink positioning and ranging service group that includes merging or splitting groups, or \nany combination thereof. \n | 6. The method of claim 1, wherein the one or more ranging support elements include a sidelink positioning and ranging protocol function, and providing the positioning service to the upper protocol layer includes providing: \non-demand sidelink positioning and ranging for determining a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; \nperiodic sidelink positioning and ranging for periodically determining a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; \ntriggered sidelink positioning and ranging for triggering determination of a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; or \nany combination thereof. \n | 7. 
The method of claim 6, further comprising: using the sidelink positioning and ranging protocol function to receive information from the upper protocol layer, wherein the information includes: \na request for a current range, direction, relative position or relative velocity of another UE or of a group of UEs, \na request for a periodic range, direction, relative position or relative velocity of another UE or of a group of UEs, \na request for a triggered range, direction, relative position or relative velocity of another UE or of a group of UEs, or \nany combination thereof. \n | 8. The method of claim 6, further comprising using the sidelink positioning and ranging protocol function to communicate with a network server that supports sidelink positioning and ranging.\n | 9. The method of claim 8, wherein the sidelink positioning and ranging protocol function communicates with the network server supporting sidelink positioning and ranging using non-access stratum (NAS) signaling.\n | 10. The method of claim 1, wherein the upper protocol layer is an application layer, and the at least one lower protocol layer includes a ProSe layer, a V2X layer or an access stratum (AS) layer.\n | 11. The method of claim 10, wherein the application layer supports vehicle-to-everything (V2X), autonomous driving, movement of objects in a factory or warehouse, UE-to-UE ranging, or a combination thereof.\n | 12. The method of claim 10, wherein communicating using the one or more ranging support elements of the ranging support protocol layer includes using a PC5 communication service provided by the ProSe layer, the V2X layer or the AS layer.\n | 13. 
A user equipment (UE) comprising: \na transceiver; \na memory; and \none or more processors communicatively coupled to the transceiver and the memory, wherein the one or more processors are configured to: \nimplement a ranging support protocol layer including one or more ranging support elements; \ncommunicate via the transceiver, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communication is conducted via at least one lower protocol layer implemented at the UE; and \nprovide, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communication. \n | 14. The UE of claim 13, wherein: \nto communicate using the one or more ranging support elements, the one or more processors are configured to implement a discovery function; and \nto provide the positioning service, the one or more processors are configured to provide: \ninformation including a unique identifier of another UE, among the one or more other UEs, that is capable of participating in sidelink positioning and ranging services, \ninformation including an indication of a service supported by another of the one or more other UEs, \na sidelink communication channel with another of the one or more other UEs, \na sidelink communication period with another of the one or more other UEs, or \nany combination thereof. \n | 15. 
The UE of claim 14, wherein the one or more processors are further configured to use the discovery function to receive information from the upper protocol layer, wherein the information includes: \na trigger for discovering UEs participating in sidelink positioning and ranging services, \nattributes of the UEs to be discovered, \na permission for discovery by another of the one or more other UEs and corresponding attributes of the other UE, \na request or permission for sidelink positioning and ranging services, or \nany combination thereof. \n | 16. The UE of claim 13, wherein: \nto communicate using the one or more ranging support elements, the one or more processors are configured to implement a group support function; and \nto provide the positioning service to the upper protocol layer, the one or more processors are configured to use the group support function to: \nwhen the upper protocol layer specifies a sidelink positioning and ranging service group, establish the sidelink positioning and ranging service group with two or more of the one or more other UEs, \nprovide a group ID and group-local member IDs to the upper protocol layer, \nmanage the addition or removal of group members, \nsplit or merge groups, \nmonitor group membership status, or \nany combination thereof. \n | 17. The UE of claim 16, wherein the one or more processors are further configured to use the group support function to receive information from the upper protocol layer, wherein the information includes: \na request to establish the sidelink positioning and ranging service group, \na request to add or remove a specific group member UE, \na management request for the sidelink positioning and ranging service group that includes merging or splitting groups, or \nany combination thereof. \n | 18. 
The UE of claim 13, wherein: \nto communicate using the one or more ranging support elements, the one or more processors are configured to implement a sidelink positioning and ranging protocol function; and \nto provide the positioning service to the upper protocol layer, the one or more processors are configured to provide: \non-demand sidelink positioning and ranging for determining a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; \nperiodic sidelink positioning and ranging for periodically determining a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; \ntriggered sidelink positioning and ranging for triggering determination of a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; or \nany combination thereof. \n | 19. The UE of claim 18, wherein the one or more processors are further configured to use the sidelink positioning and ranging protocol function to receive information from the upper protocol layer, wherein the information includes: \na request for a current range, direction, relative position or relative velocity of another UE or of a group of UEs, \na request for a periodic range, direction, relative position or relative velocity of another UE or of a group of UEs, \na request for a triggered range, direction, relative position or relative velocity of another UE or of a group of UEs, or \nany combination thereof. \n | 20. The UE of claim 18, wherein the one or more processors are further configured to use the sidelink positioning and ranging protocol function to communicate, via the transceiver, with a network server supporting sidelink positioning and ranging.\n | 21. 
The UE of claim 20, wherein the one or more processors are configured to use the sidelink positioning and ranging protocol function to communicate with the network server supporting sidelink positioning and ranging using non-access stratum (NAS) signaling.\n | 22. The UE of claim 13, wherein: \nto provide the positioning service to the upper protocol layer, the one or more processors are configured to provide the positioning service to an application layer; and \nto communicate via the at least one lower protocol layer, the one or more processors are configured to communicate via a ProSe layer, a V2X layer or an access stratum (AS) layer. \n | 23. The UE of claim 22, wherein, to communicate using the one or more ranging support elements of the ranging support protocol layer, the one or more processors are configured to use a PC5 communication service provided by the ProSe layer, the V2X layer or the AS layer.\n | 24. An apparatus for supporting sidelink (SL) positioning, the apparatus comprising: \nmeans for implementing a ranging support protocol layer including one or more ranging support elements; \nmeans for communicating, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communication is conducted via at least one lower protocol layer implemented at the UE; and \nmeans for providing, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communication. \n | 25. 
The apparatus of claim 24, wherein: \nthe means for communicating using the one or more ranging support elements include means for implementing a discovery function; and \nthe means for providing the positioning service include means for providing: \ninformation including a unique identifier of another UE, among the one or more other UEs, that is capable of participating in sidelink positioning and ranging services, \ninformation including an indication of a service supported by another of the one or more other UEs, \na sidelink communication channel with another of the one or more other UEs, \na sidelink communication period with another of the one or more other UEs, or \nany combination thereof. \n | 26. The apparatus of claim 25, further comprising means for using the discovery function to receive information from the upper protocol layer, wherein the information includes: \na trigger for discovering UEs participating in sidelink positioning and ranging services, \nattributes of the UEs to be discovered, \na permission for discovery by another of the one or more other UEs and corresponding attributes of the other UE, \na request or permission for sidelink positioning and ranging services, or \nany combination thereof. \n | 27. 
The apparatus of claim 24, wherein: \nthe means for communicating using the one or more ranging support elements include means for implementing a group support function; and \nthe means for providing the positioning service include means for using the group support function to: \nwhen the upper protocol layer specifies a sidelink positioning and ranging service group, establish the sidelink positioning and ranging service group with two or more of the one or more other UEs, \nprovide a group ID and group-local member IDs to the upper protocol layer, \nmanage the addition or removal of group members, \nsplit or merge groups, \nmonitor group membership status, or \nany combination thereof. \n | 28. The apparatus of claim 27, further comprising means for using the group support function to receive information from the upper protocol layer, wherein the information includes: \na request to establish the sidelink positioning and ranging service group, \na request to add or remove a specific group member UE, \na management request for the sidelink positioning and ranging service group that includes merging or splitting groups, or \nany combination thereof. \n | 29. 
The apparatus of claim 24, wherein: \nthe means for communicating using the one or more ranging support elements include means for implementing a sidelink positioning and ranging protocol function; and \nthe means for providing the positioning service include means for providing: \non-demand sidelink positioning and ranging for determining a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; \nperiodic sidelink positioning and ranging for periodically determining a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; \ntriggered sidelink positioning and ranging for triggering determination of a range, direction, relative position or relative velocity of another UE or of each UE in a group of other UEs; or \nany combination thereof. \n | 30. A non-transitory computer-readable medium storing instructions for supporting sidelink (SL) positioning, the instructions including code for: \nimplementing a ranging support protocol layer including one or more ranging support elements; \ncommunicating, using the one or more ranging support elements of the ranging support protocol layer, with a corresponding ranging support protocol layer in one or more other UEs, wherein the communication is conducted via at least one lower protocol layer implemented at the UE; and \nproviding, at the ranging support protocol layer, a positioning service to an upper protocol layer implemented at the UE, the positioning service being based at least in part on the communication.", "output": "The method (1400) involves implementing a ranging support protocol layer comprising ranging support elements at a user equipment (UE), e.g. a mobile phone. The communication is made with a corresponding ranging support layer in other UEs through a lower protocol layer implemented at the former UE. 
The positioning service is provided (1402) to an upper protocol layer at the latter UE, where the positioning service comprises information including a unique identifier of another UE of the latter UEs that participates in a sidelink positioning and ranging service. The information is received from the upper layer with a discovery function. INDEPENDENT CLAIMS are included for: (1) a user equipment comprising a transceiver; (2) an apparatus for supporting sidelink positioning of user equipment; and (3) a non-transitory computer-readable medium for storing instructions. Method for supporting sidelink positioning of user equipment, such as a cellular phone, personal digital assistant, laptop computer, cordless phone, wireless local loop station, personal computer, tablet, set-top box, web appliance, network router, switch or bridge. The method allows the UEs to communicate using sidelink signaling and to be located using sidelink-related positioning in an effective manner. The method allows a user equipment (UE) to communicate with other UEs using the positioning service based on the positioning measurements obtained by the base station, so that positioning services can be provided to the UE in an efficient manner. 
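The layering this record describes — a ranging support layer exposing discovery, group-support, and ranging functions to an upper application layer while sending over a lower layer such as PC5 — can be sketched as follows. All class, method, and message names are illustrative assumptions, not API names from the patent:

```python
class RangingSupportLayer:
    """Sketch of a ranging support protocol layer between an upper (application)
    layer and a lower transport layer (e.g., a PC5/ProSe send function)."""

    def __init__(self, lower_send):
        self.lower_send = lower_send  # lower-protocol-layer send function (assumed)
        self.peers = {}               # discovered UEs: id -> supported services
        self.groups = {}              # group id -> member ids

    def discover(self, ue_id, services):
        # Discovery function: record a peer UE able to join SL positioning/ranging.
        self.peers[ue_id] = services
        self.lower_send(("DISCOVERY_ACK", ue_id))

    def establish_group(self, group_id, members):
        # Group support function: create a SL positioning and ranging service group
        # and hand group-local member IDs back to the upper layer.
        self.groups[group_id] = list(members)
        return {m: i for i, m in enumerate(members)}

    def range_on_demand(self, ue_id):
        # SL positioning and ranging protocol function (measurement left as a stub).
        self.lower_send(("RANGING_REQ", ue_id))
        return {"ue": ue_id, "range_m": None}

# Usage: the upper layer drives discovery and group setup through the layer.
sent = []
layer = RangingSupportLayer(sent.append)
layer.discover("ue2", {"sidelink_ranging"})
ids = layer.establish_group("g1", ["ue1", "ue2"])
```

The design point the claims make is that the upper layer never talks to the transport directly; it only consumes the positioning service this middle layer exposes.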
The drawing shows a flow diagram of a sidelink positioning supporting method. 1400 Sidelink positioning supporting method; 1402 Providing services to an upper layer of the architecture; 1404 Communicating by the ranging support elements", "instruction": "Please summarize the input" }, { "input": "METHOD AND APPARATUS FOR VEHICLE MANEUVER PLANNING AND MESSAGINGTechniques are provided which may be implemented using various methods and/or apparatuses in a vehicle to utilize vehicle external sensor data, vehicle internal sensor data, vehicle capabilities and external V2X input to determine, send, receive and utilize V2X information and control data, sent between the vehicle and a road side unit (RSU), to determine intersection access and vehicle behavior when approaching the intersection.|1. A method for an autonomous vehicle to enter an intersection, comprising: \ndetermining a braking distance for the autonomous vehicle based on a vehicle external sensor, a vehicle internal sensor, a vehicle capability, or an external V2X input, or a combination thereof; \nsending a first message from the autonomous vehicle, where the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle, and a braking distance data element for the autonomous vehicle; \nreceiving, from a roadside unit (RSU), a second message based at least in part on the braking distance for the autonomous vehicle, which includes one or more instructions regarding the autonomous vehicle's intersection entry; and \ncontrolling the autonomous vehicle to enter the intersection in response to the one or more instructions received from the RSU.\n | 2. The method of claim 1, further comprising sending a third message from the autonomous vehicle to the RSU before the second message, thereby requesting intersection entry.\n | 3. 
The method of claim 1, wherein the braking distance for the autonomous vehicle is determined based at least in part on a speed of the autonomous vehicle.\n | 4. The method of claim 3, wherein the braking distance for the autonomous vehicle is determined based at least in part on tire pressure or weather conditions or tire traction data for the autonomous vehicle or a combination thereof.\n | 5. The method of claim 1, wherein the braking distance for the autonomous vehicle is shorter in an automated control mode than in a manual mode.\n | 6. The method of claim 1, wherein the first message is a broadcast message.\n | 7. The method of claim 1, wherein the first message is a point-to-point message.\n | 8. The method of claim 1, wherein the first message is a basic safety message or a cooperative sensing message.\n | 9. 
An autonomous vehicle, comprising: \none or more wireless transceivers; \nvehicle internal sensors; \nvehicle external sensors; \na memory; and \none or more processors communicatively coupled to the one or more wireless transceivers, the vehicle internal sensors, the vehicle external sensors, and the memory, the one or more processors configured to: \ndetermine a braking distance for the autonomous vehicle based on the vehicle external sensors, the vehicle internal sensors, a vehicle capability or an external V2X input, or a combination thereof; \nsend a first message from the one or more wireless transceivers, wherein the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle, and a braking distance data element for the autonomous vehicle; \nreceive, at the one or more wireless transceivers from a roadside unit (RSU), a second message based at least in part on the braking distance for the autonomous vehicle, which includes one or more instructions regarding the autonomous vehicle's intersection entry; and \ncontrol the autonomous vehicle to enter the intersection in response to the one or more instructions received from the RSU.\n | 10. The autonomous vehicle of claim 9, wherein the one or more processors are further configured to send a third message from the one or more wireless transceivers to the RSU before the second message, thereby requesting intersection entry.\n | 11. The autonomous vehicle of claim 9, wherein the braking distance for the autonomous vehicle is determined based at least in part on the speed of the autonomous vehicle and empirical stopping distance data associated with the speed of the autonomous vehicle.\n | 12. 
The autonomous vehicle of claim 11, wherein the braking distance for the autonomous vehicle is determined based at least in part on tire pressure or weather conditions or tire traction data for the autonomous vehicle or a combination thereof.\n | 13. The autonomous vehicle of claim 9, wherein the braking distance for the autonomous vehicle is shorter in an automated control mode than in a manual mode.\n | 14. The autonomous vehicle of claim 9, wherein the first message is a broadcast message.\n | 15. The autonomous vehicle of claim 9, wherein the first message is a point-to-point message.\n | 16. The autonomous vehicle of claim 9, wherein the first message is a basic safety message or a cooperative sensing message.\n | 17. An autonomous vehicle, comprising: \nmeans for determining a braking distance for the autonomous vehicle based on a vehicle external sensor, a vehicle internal sensor, a vehicle capability, or an external V2X input, or a combination thereof; \nmeans for sending a first message from the autonomous vehicle, wherein the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle, and a braking distance data element for the autonomous vehicle; \nmeans for receiving, from a roadside unit (RSU), a second message based at least in part on the braking distance for the autonomous vehicle, the second message including one or more instructions regarding the autonomous vehicle's intersection entry; and \nmeans for controlling the autonomous vehicle's intersection entry in response to the one or more instructions received from the RSU.\n | 18. The autonomous vehicle of claim 17, further comprising means for sending a third message from the autonomous vehicle to the RSU before the second message to request intersection entry.\n | 19. 
The autonomous vehicle of claim 17, wherein the first message is a broadcast message.\n | 20. The autonomous vehicle of claim 17, wherein the first message is a point-to-point message.\n | 21. The autonomous vehicle of claim 17, wherein the first message is a basic safety message or a cooperative sensing message.\n | 22. A non-transitory computer-readable medium storing computer-readable instructions that cause one or more processors on an autonomous vehicle to: \ndetermine a braking distance for the autonomous vehicle based on a vehicle external sensor, a vehicle internal sensor, a vehicle capability, or an external V2X input, or a combination thereof; \nsend a first message from the autonomous vehicle, where the first message includes an identification data element or a vehicle type or a vehicle priority or a combination thereof for the autonomous vehicle, and a braking distance data element for the autonomous vehicle; \nreceive, from a roadside unit (RSU), a second message based at least in part on the braking distance for the autonomous vehicle, which includes one or more instructions regarding the autonomous vehicle's intersection entry; and \ncontrol the autonomous vehicle to enter the intersection in response to the one or more instructions received from the RSU.\n | 23. The non-transitory computer-readable medium of claim 22, further comprising instructions that cause the one or more processors to send a third message to the RSU before the second message, so as to request intersection entry.\n | 24. The non-transitory computer-readable medium of claim 22, wherein the first message is a broadcast message.\n | 25. The non-transitory computer-readable medium of claim 22, wherein the first message is a point-to-point message.\n | 26. 
The non-transitory computer-readable medium of claim 22, wherein the first message is a basic safety message or a cooperative sensing message.", "output": "The method involves determining a braking distance for the ego vehicle based upon vehicle external sensors, vehicle internal sensors, vehicle capabilities, or external V2X input, or a combination. A first message is sent from the ego vehicle. The first message includes an identification data element for the ego vehicle or a vehicle type or a vehicle priority or a combination thereof and a braking distance data element for the ego vehicle. A second message including instructions with respect to intersection access by the ego vehicle is received from a roadside unit (RSU) based upon the braking distance for the ego vehicle. The intersection access is controlled by the ego vehicle in response to the instructions received from the RSU. INDEPENDENT CLAIMS are included for the following:an ego vehicle; anda non-transitory computer-readable medium storing program for an ego vehicle. Method for intersection access by ego vehicle. Increased tire inflation decreases the tire surface in contact with the road, reducing traction, and thus increases vehicle turning radius at current speed and reduces maneuverability at current speed. The drawing shows a block diagram of a system level embodiment for an ego vehicle. 910Processor930Wireless transceiver935Camera940Car sensor950Lidar", "instruction": "Please summarize the input" }, { "input": "System and method for relative positioning based safe autonomous drivingDisclosed is a method and apparatus for managing a driving plan of an autonomous vehicle. The method may include obtaining observations of a neighboring vehicle using one or more sensors of the autonomous vehicle. The method may also include classifying one or more behavioral driving characteristics of the neighboring vehicle based on the observations.
Furthermore, the method may include updating the driving plan based on a classification of the one or more behavioral driving characteristics of the neighboring vehicle, and controlling one or more operations of the autonomous vehicle based on the updated driving plan.What is claimed is: \n | 1. A method for managing a driving plan of an autonomous vehicle, comprising: \nobtaining observations of a neighboring vehicle using one or more sensors of the autonomous vehicle, the observations including observed driving behaviors of the neighboring vehicle; \ngenerating a driving risk pattern of the neighboring vehicle based on the observations; \nupdating the driving plan based on the generated driving risk pattern of the neighboring vehicle; and \ncontrolling one or more operations of the autonomous vehicle based on the updated driving plan, \nwherein the generated driving risk pattern indicates one of a plurality of different risk levels, \nwherein generating the driving risk pattern of the neighboring vehicle comprises classifying the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and \nwherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability. \n | 2. The method of claim 1, further comprising: \ndetermining a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and \ncross-checking the second classification against the first classification. \n | 3. The method of claim 1, wherein the one or more sensors of the autonomous vehicle used to obtain the observations comprise a RADAR sensor, a LIDAR sensor, a GPS sensor, a proximity sensor, a visual sensor, or a combination thereof.\n | 4. 
The method of claim 1, wherein the generated driving risk pattern is generated using a machine learning model.\n | 5. The method of claim 1, wherein obtaining the observations comprises: \ncollecting, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle. \n | 6. The method of claim 5, wherein the one or more observable vehicle characteristics of the neighboring vehicle collected by the one or more sensors comprise one or more relative accelerations of the neighboring vehicle, a relative speed of the neighboring vehicle, a relative direction of travel of the neighboring vehicle, one or more visual characteristics of the neighboring vehicle, one or more visual characteristics of a driver of the neighboring vehicle, or a combination thereof.\n | 7. The method of claim 5, further comprising: \nsending, to a server, the one or more observable vehicle characteristics of the neighboring vehicle. \n | 8. The method of claim 7, further comprising: \nsending, to the server, a plurality of observable vehicle characteristics associated with a plurality of observed vehicles collected by the autonomous vehicle; and \nreceiving, from the server, a machine learning model trained to identify behavioral characteristics from observable vehicle characteristics using the plurality of observable vehicle characteristics and associated known behavioral characteristics. \n | 9. The method of claim 8, further comprising: \nperiodically sending, to the server, new observable vehicle characteristics associated with new observed vehicles collected by the autonomous vehicle; and \nperiodically receiving, from the server, an updated machine learning model. \n | 10. The method of claim 1, wherein the autonomous vehicle is an autonomous car.\n | 11. 
A system for managing a driving plan of an autonomous vehicle, the system comprising: \none or more sensors configured to obtain observations of a neighboring vehicle, the observations including observed driving behaviors of the neighboring vehicle; \na memory configured to store the observations; and \none or more processors communicably coupled with the memory and the sensors, the one or more processors configured to: \ngenerate a driving risk pattern of the neighboring vehicle based on the observations, \nupdate the driving plan based on the generated driving risk pattern of the neighboring vehicle, and \ncontrol one or more operations of the autonomous vehicle based on the updated driving plan, \nwherein the generated driving risk pattern indicates one of a plurality of different risk levels, \nwherein the one or more processors configured to generate the driving risk pattern of the neighboring vehicle are configured to classify the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and \nwherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability. \n | 12. The system of claim 11, wherein the one or more processors are further configured to: \ndetermine a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and \ncross-check the second classification against the first classification. \n | 13. The system of claim 11, wherein the one or more sensors used to obtain the observations comprise a RADAR sensor, a LIDAR sensor, a GPS sensor, a proximity sensor, a visual sensor, or a combination thereof.\n | 14. 
The system of claim 11, wherein the one or more processors are further configured to use a machine learning model to generate the driving risk pattern of the neighboring vehicle.\n | 15. The system of claim 11, wherein the one or more sensors are further configured to: \ncollect, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle. \n | 16. The system of claim 15, wherein the one or more observable vehicle characteristics of the neighboring vehicle collected by the one or more sensors comprise one or more relative accelerations of the neighboring vehicle, a relative speed of the neighboring vehicle, a relative direction of travel of the neighboring vehicle, one or more visual characteristics of the neighboring vehicle, one or more visual characteristics of a driver of the neighboring vehicle, or a combination thereof.\n | 17. The system of claim 15, further comprising: \na wireless subsystem configured to send to a server the one or more observable vehicle characteristics of the neighboring vehicle. \n | 18. The system of claim 17, wherein the wireless subsystem is further configured to: \nsend a plurality of observable vehicle characteristics associated with a plurality of observed vehicles to the server; and \nreceive, from the server, a machine learning model trained to identify behavioral characteristics from observable vehicle characteristics using the plurality of observable vehicle characteristics and associated known behavioral characteristics. \n | 19. The system of claim 18, wherein the wireless subsystem is further configured to: \nperiodically send, to the server, new observable vehicle characteristics associated with new observed vehicles collected by the autonomous vehicle; and \nperiodically receive, from the server, an updated machine learning model. \n | 20. The system of claim 11, wherein the autonomous vehicle is an autonomous car.\n | 21.
A non-transitory computer readable storage medium including instructions that, when executed by a processor, cause the processor to perform operations for managing a driving plan of an autonomous vehicle, the operations comprising: \nobtaining observations of a neighboring vehicle using one or more sensors of the autonomous vehicle, the observations including observed driving behaviors of the neighboring vehicle; \ngenerating a driving risk pattern of the neighboring vehicle based on the observations; \nupdating the driving plan based on the generated driving risk pattern of the neighboring vehicle; and \ncontrolling one or more operations of the autonomous vehicle based on the updated driving plan, \nwherein the generated driving risk pattern indicates one of a plurality of different risk levels, \nwherein generating the driving risk pattern of the neighboring vehicle comprises classifying the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and \nwherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability. \n | 22. The non-transitory computer readable storage medium of claim 21, wherein the operations further comprise: \ndetermining a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and \ncross-checking the second classification against the first classification. \n | 23. The non-transitory computer readable storage medium of claim 21, wherein obtaining the observations comprises: \ncollecting, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle; and \nsending to a server the one or more observable vehicle characteristics of the neighboring vehicle. \n | 24. 
An apparatus, comprising: \nmeans for obtaining observations of a neighboring vehicle using one or more sensors of an autonomous vehicle, the observations including observed driving behaviors of the neighboring vehicle; \nmeans for generating a driving risk pattern of the neighboring vehicle based on the observations; \nmeans for updating a driving plan based on the generated driving risk pattern of the neighboring vehicle; and \nmeans for controlling one or more operations of the autonomous vehicle based on the updated driving plan, \nwherein the generated driving risk pattern indicates one of a plurality of different risk levels, \nwherein the means for generating the driving risk pattern of the neighboring vehicle comprises means for classifying the neighboring vehicle as a vehicle that lacks autonomous driving capability based on the observed driving behaviors of the neighboring vehicle, and \nwherein the generated driving risk pattern comprises a first classification indicating the neighboring vehicle as a vehicle lacking autonomous driving capability. \n | 25. The apparatus of claim 24, further comprising: \nmeans for determining a second classification of the neighboring vehicle based on vehicle characteristics of the neighboring vehicle exchanged in a vehicle to vehicle communication; and \nmeans for cross-checking the second classification against the first classification. \n | 26. The apparatus of claim 24, wherein the means for obtaining the observations comprises: \nmeans for collecting, using the one or more sensors of the autonomous vehicle, one or more observable vehicle characteristics of the neighboring vehicle; and \nmeans for sending, to a server, the one or more observable vehicle characteristics of the neighboring vehicle. \n | 27. 
The method of claim 1, further comprising: \nobtaining observations of a second neighboring vehicle; \ngenerating a second driving risk pattern of the second neighboring vehicle based on the observations of the second neighboring vehicle; and \nupdating the driving plan based on a weighted average of the driving risk pattern of the neighboring vehicle and the second driving risk pattern of the second neighboring vehicle. \n | 28. The method of claim 1, further comprising: \ndetermining a second classification of the neighboring vehicle based on observed visual characteristics of the neighboring vehicle; and \ncross-checking the second classification against the first classification. \n | 29. The method of claim 1, wherein the observed driving behaviors comprise at least one of a relative speed, a relative acceleration, a relative deceleration, a relative position, or relative direction changes of the neighboring vehicle.\n | 30. The method of claim 4, wherein the machine learning model uses objective driving behavior information as truth data to analyze the observations.", "output": "The method (300) involves obtaining (302) observations of a neighboring vehicle using one or more sensors of an autonomous vehicle. One or more behavioral driving characteristics of the neighboring vehicle are classified (304) based on the observations. A driving plan is updated (306) based on a classification of the one or more behavioral driving characteristics of the neighboring vehicle. One or more operations of the autonomous vehicle are controlled (308) based on the updated driving plan. INDEPENDENT CLAIMS are included for the following:a system for managing driving plan of autonomous vehicle; anda non-transitory computer readable storage medium storing program for managing driving plan of autonomous vehicle. Method for managing driving plan of autonomous motor vehicles such as cars, trucks and trains using machine learning model.
The drive control system updates drive plan relative to the irregular behavioral driving characteristics of vehicle, causing autonomous vehicle to slow down, increase a distance between autonomous vehicle and other vehicle, activate an emergency system e.g. collision warning and brake support. Enables autonomous vehicle to operate in a safe and autonomous manner and continuously adjust and react to its environment. The drawing shows the flow diagram of a method for managing a driving plan of an autonomous vehicle. 300Method for managing driving plan of autonomous vehicle302Step for obtaining observations of a neighboring vehicle304Step for classifying one or more behavioral driving characteristic306Step for updating a driving plan308Step for controlling one or more operations of the autonomous vehicle", "instruction": "Please summarize the input" }, { "input": "Shape detecting autonomous vehicleAccording to various embodiments, there is provided a method for controlling a vehicle, the method including detecting a triggering event. The method further includes, in response to detecting the triggering event, determining updated dimensions of the vehicle. The method further includes adjusting operation of the vehicle based on the updated dimensions.What is claimed is: \n | 1. A method for controlling a vehicle, the method comprising: \ndetecting, by a sensor, a triggering event; \ndetermining updated dimensions of the vehicle in response to detecting the triggering event; and \nadjusting, by control electronics, at least one operation of the vehicle, wherein the at least one operation of the vehicle comprises an adjustment of at least one of a speed, a turn radius, a navigation path, a clearance allowance, or a parking behavior of the vehicle based at least in part on the updated dimensions. \n | 2. The method of claim 1, wherein the triggering event comprises a changed shape event.\n | 3.
The method of claim 2, wherein the changed shape event comprises detecting a parameter associated with the vehicle and determining whether the parameter exceeds a threshold.\n | 4. The method of claim 3, wherein the parameter corresponds to one or more of a weight parameter, a wind parameter, a drag parameter, or an engine torque value.\n | 5. The method of claim 2, further comprising: \ndetermining one or more surrounding conditions of the vehicle; and \ndetecting the changed shape event of the vehicle based at least in part on the one or more surrounding conditions. \n | 6. The method of claim 5, wherein the one or more surrounding conditions comprises at least one of a wind force, a road slope, a radius of curvature of a road, or road terrain conditions.\n | 7. The method of claim 1, wherein determining the updated dimensions of the vehicle comprises: \nsending a scan request to one or more proximate vehicles; \nreceiving one or more at least partial scans of at least one of the one or more proximate vehicles; and \nconstructing the updated dimensions of the vehicle based at least in part on at least one of the one or more at least partial scans. \n | 8. The method of claim 7, wherein the scan request is sent via vehicle-to-vehicle (V2V) communication.\n | 9. The method of claim 7, wherein the one or more at least partial scans comprises at least one Light Detection and Ranging (LIDAR) scan.\n | 10. The method of claim 1, wherein the at least one operation of the vehicle serves to control braking, to perform wireless communication, or to perform environment scanning.\n | 11. The method of claim 1, further comprising configuring at least one of an engine sensor, a weight sensor, a wind sensor, or a cargo sensor.\n | 12. 
The method of claim 1, wherein determining the updated dimensions of the vehicle comprises: \nsending a scan request to one or more scanning devices of the vehicle; \nreceiving an at least partial scan from at least one of the one or more scanning devices; and \nconstructing the updated dimensions of the vehicle based on at least one of the at least partial scan. \n | 13. The method of claim 12, wherein the at least partial scan is received from another vehicle.\n | 14. The method of claim 12, wherein the at least partial scan is received from an unmanned aerial vehicle.\n | 15. The method of claim 12, wherein the at least partial scan is received from a camera arranged on a fixed object.\n | 16. A controller in a vehicle, the controller comprising: \na processor; and \na memory storing instructions that, when executed by the processor, cause the vehicle to: \ndetect a triggering event; \ndetermine updated dimensions of the vehicle in response to detection of the triggering event; and \nadjust at least one operation of the vehicle, wherein the at least one operation of the vehicle comprises an adjustment of at least one of a speed, a turn radius, a navigation path, a clearance allowance, or a parking behavior of the vehicle based at least in part on the updated dimensions. \n | 17. The controller of claim 16, wherein the triggering event comprises detecting a changed shape event.\n | 18. The controller of claim 17, wherein execution of the instructions causes the vehicle to: \ndetect a parameter associated with the vehicle; and \ndetermine whether the parameter exceeds a threshold. \n | 19. The controller of claim 18, wherein the parameter corresponds to one or more of a weight parameter, a wind parameter, a drag parameter, or an engine torque parameter.\n | 20. 
The controller of claim 17, wherein execution of the instructions causes the vehicle to further: \ndetermine one or more surrounding conditions of the vehicle; and \ndetect the changed shape event of the vehicle based at least in part on the one or more surrounding conditions. \n | 21. The controller of claim 20, wherein the one or more surrounding conditions comprises at least one of a wind force, a road slope, a radius of curvature of a road, or road terrain conditions.\n | 22. The controller of claim 16, wherein execution of the instructions for determining the updated dimensions further causes the vehicle to: \nsend a scan request to one or more proximate vehicles; \nreceive one or more at least partial scans of at least one of the one or more proximate vehicles; and \nconstruct the updated dimensions of the vehicle based at least in part on at least one of the one or more at least partial scans. \n | 23. The controller of claim 22, wherein the one or more at least partial scans comprises at least one Light Detection and Ranging (LIDAR) scan.\n | 24. The controller of claim 16, wherein execution of the instructions causes the vehicle to control braking, to perform wireless communication, or to perform environment scanning.\n | 25. The controller of claim 16, wherein execution of the instructions causes the vehicle to: \nsend a scan request to one or more scanning devices of the vehicle; \nreceive an at least partial scan from at least one of the one or more scanning devices; and \nconstruct the updated dimensions of the vehicle based on the at least partial scan. \n | 26. The controller of claim 25, wherein at least a partial scan is received from another vehicle.\n | 27.
An apparatus for controlling a vehicle, the apparatus comprising: \nmeans for detecting a triggering event; \nmeans for determining updated dimensions of the vehicle in response to detecting the triggering event; and \nmeans for adjusting at least one operation of the vehicle, wherein the at least one operation of the vehicle comprises an adjustment of at least one of a speed, a turn radius, a navigation path, a clearance allowance, or a parking behavior of the vehicle based at least in part on the updated dimensions. \n | 28. The apparatus of claim 27, wherein the triggering event comprises a changed shape event.\n | 29. The apparatus of claim 28, wherein the changed shape event comprises detecting a parameter associated with the vehicle and determining whether the parameter exceeds a threshold.\n | 30. The apparatus of claim 29, wherein the parameter corresponds to one or more of a weight parameter, a wind parameter, a drag parameter, or an engine torque value.", "output": "The method involves detecting a triggering event, determining updated dimensions of the vehicle in response to detecting the triggering event, and adjusting operation of the vehicle based on the updated dimensions. The triggering event involves detecting a changed shape event of the vehicle. The changed shape event is detected by detecting a parameter associated with the vehicle and determining whether the parameter exceeds a threshold. The parameter corresponds to the weight, wind drag, or engine torque of the vehicle. INDEPENDENT CLAIMS are also included for the following:a controller in a vehicle; anda vehicle. Controlling method of vehicle. The vehicle accesses the updated shape information to determine the optimal turn radius for safely traversing the curvature in the road, by using the environment scanning information and the temperature sensor information. The drawing shows the flowchart of a method of controlling an autonomous vehicle. 
402Receiving sensor data404Detecting changed shape event406Continuing normal operation408Triggering shape scanning410Adjusting operation based on new shape", "instruction": "Please summarize the input" }, { "input": "VIRTUAL TRAFFIC LIGHT VIA C-V2XTechniques are provided for providing traffic intersection control information to vehicles via V2X communication links. An example method for providing traffic intersection control messages includes receiving vehicle information associated with a plurality of proximate vehicles, generating one or more vehicle groups based on the vehicle information, generating a traffic control plan based at least in part on the one or more vehicle groups, and transmitting one or more traffic intersection control messages to one or more of the plurality of proximate vehicles based at least in part on the traffic control plan.CLAIMS: \n | 1. A method for providing traffic intersection control messages, comprising: receiving vehicle information associated with a plurality of proximate vehicles; generating one or more vehicle groups based on the vehicle information; generating a traffic control plan based at least in part on the one or more vehicle groups; and transmitting one or more traffic intersection control messages to one or more of the plurality of proximate vehicles based at least in part on the traffic control plan.\n | 2. The method of claim 1 wherein the vehicle information includes basic safety messages transmitted by one or more vehicles in the plurality of proximate vehicles.\n | 3. The method of claim 1 wherein the one or more vehicle groups are based on a location of a vehicle, a number of vehicles in a proximate area, a traffic density flowing in a direction, a configuration of an intersection, a size associated with a vehicle, a priority value associated with one or more vehicles, or any combination thereof.\n | 4.
The method of claim 1 wherein receiving the vehicle information includes receiving vehicle group information from a network resource.\n | 5. The method of claim 1 wherein the traffic control plan is based at least in part on a time of day, a date, a current density of traffic, a turn lane configuration, or any combination thereof.\n | 6. The method of claim 1 wherein transmitting the one or more traffic intersection control messages includes unicasting a traffic control message including proceed information to one or more vehicles in the plurality of proximate vehicles.\n | 7. The method of claim 1 wherein transmitting the one or more traffic intersection control messages includes groupcasting a traffic control message including a list of vehicle identification values. \n | 8. The method of claim 1 wherein the one or more traffic intersection control messages are transmitted via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof.\n | 9. A method of receiving a traffic intersection control message, comprising: transmitting one or more basic safety messages; receiving one or more traffic intersection control messages including proceed information; and providing an indication to proceed or halt progress through an intersection based at least in part on the one or more traffic intersection control messages.\n | 10. The method of claim 9 further comprising transmitting vehicle priority information.\n | 11. The method of claim 9 wherein receiving the one or more traffic intersection control messages includes receiving a unicast message including the proceed information.\n | 12. The method of claim 9 wherein receiving the one or more traffic intersection control messages includes receiving a groupcast message including a list of vehicle identification values.\n | 13. 
The method of claim 9 wherein providing the indication to proceed or halt progress through the intersection includes providing an instruction to a controller in an autonomous or semi-autonomous vehicle.\n | 14. The method of claim 9 wherein providing the indication to proceed or halt progress through the intersection includes activating a driver alert device.\n | 15. The method of claim 9 wherein the one or more traffic intersection control messages are received via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof. \n | 16. An apparatus, comprising: a memory; at least one transceiver; at least one processor communicatively coupled to the memory and the at least one transceiver, and configured to: receive vehicle information associated with a plurality of proximate vehicles; generate one or more vehicle groups based on the vehicle information; generate a traffic control plan based at least in part on the one or more vehicle groups; and transmit one or more traffic intersection control messages to one or more of the plurality of proximate vehicles based at least in part on the traffic control plan.\n | 17. The apparatus of claim 16 wherein the vehicle information includes basic safety messages transmitted by one or more vehicles in the plurality of proximate vehicles.\n | 18. The apparatus of claim 16 wherein the one or more vehicle groups are based on a location of a vehicle, a number of vehicles in a proximate area, a traffic density flowing in a direction, a configuration of an intersection, a size associated with a vehicle, a priority value associated with one or more vehicles, or any combination thereof.\n | 19. The apparatus of claim 16 wherein the at least one processor is further configured to receive vehicle group information from a network resource as at least part of the vehicle information associated with the plurality of proximate vehicles.\n | 20.
The apparatus of claim 16 wherein the traffic control plan is based at least in part on a time of day, a date, a current density of traffic, a turn lane configuration, or any combination thereof.\n | 21. The apparatus of claim 16 wherein the at least one processor is further configured to unicast a traffic control message including proceed information to one or more vehicles in the plurality of proximate vehicles as the one or more traffic intersection control messages.\n | 22. The apparatus of claim 16 wherein the at least one processor is further configured to groupcast a traffic control message including a list of vehicle identification values as the one or more traffic intersection control messages.\n | 23. The apparatus of claim 16 wherein the one or more traffic intersection control messages are transmitted via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof.\n | 24. An apparatus, comprising: a memory; at least one transceiver; at least one processor communicatively coupled to the memory and the at least one transceiver, and configured to: transmit one or more basic safety messages; receive one or more traffic intersection control messages including proceed information; and provide an indication to proceed or halt progress through an intersection based at least in part on the one or more traffic intersection control messages.\n | 25. The apparatus of claim 24 wherein the at least one processor is further configured to transmit vehicle priority information.\n | 26. The apparatus of claim 24 wherein the at least one processor is further configured to receive a unicast message including the proceed information as the one or more traffic intersection control messages.\n | 27. The apparatus of claim 24 wherein the at least one processor is further configured to receive a groupcast message including a list of vehicle identification values as the one or more traffic intersection control messages. \n | 28.
The apparatus of claim 24 wherein the at least one processor is further configured to provide an instruction to a controller in an autonomous or semi-autonomous vehicle as the indication to proceed or halt progress through the intersection.\n | 29. The apparatus of claim 24 wherein the at least one processor is further configured to activate a driver alert device as the indication to proceed or halt progress through the intersection.\n | 30. The apparatus of claim 24 wherein the one or more traffic intersection control messages are received via a PC5 interface, a Uu interface, a device-to-device protocol, or any combinations thereof.", "output": "The method (1100) involves receiving vehicle information associated with multiple proximate vehicles (1102). The vehicle groups are generated (1104) based on the vehicle information. A traffic control plan is generated (1106) based in part on the vehicle groups. The traffic intersection control messages are transmitted (1108) to the proximate vehicles through a PC5 interface, a UMTS air interface (Uu), or a device-to-device protocol, based in part on the traffic control plan, where the vehicle information includes basic safety messages transmitted by one or more of the multiple proximate vehicles. INDEPENDENT CLAIMS are included for: (1) a method for receiving a traffic intersection control message; (2) an apparatus for providing traffic intersection control messages to vehicles through vehicle-to-everything communication links; (3) an apparatus for receiving a traffic intersection control message. Method for providing traffic intersection control messages to vehicles such as autonomous or semi-autonomous vehicles, i.e. cars, through vehicle-to-everything communication links using a user equipment (UE). Uses include but are not limited to mobile phone, router, tablet computer, laptop computer, consumer asset tracking device, Internet of Things (IoT) device, or on-board unit (OBU).
The traffic congestion and the potential for collisions at intersections can be reduced. The traffic control messages can be unicast or groupcast to the vehicles and the vehicles can proceed or halt at an intersection as a group. The vehicle groups are evaluated in view of a traffic control plan and the groups can be prioritized for proceeding through a traffic intersection. The positioning reference signal (PRS) muting can be used to reduce interference by muting PRS signals. The drawing shows a flow diagram illustrating a method for providing traffic intersection control information to vehicles.1100Method for providing traffic intersection control messages to vehicles 1102Receiving vehicle information associated with multiple proximate vehicles 1104Generating vehicle groups 1106Generating traffic control plan 1108Transmitting traffic intersection control messages to the proximate vehicles ", "instruction": "Please summarize the input" }, { "input": "IMPLEMENTING CONFIDENCE METRICS IN VEHICLE-TO-EVERYTHING (V2X) COMMUNICATIONSCertain aspects of the present disclosure provide techniques for enhancing vehicle operations safety using coordinating vehicle platooning or enhancing platooning safety against location spoofing attacks. In one example, a source user equipment (UE) detects a potential spoofing event associated with location information being altered in an unauthorized manner, the source UE may transmit a request to a platoon control system (PCS) to join a vehicle platoon. In another example, a first UE associated with a lead vehicle in an existing platoon may detect a potential spoofing event associated with location information being altered in an unauthorized manner. The lead vehicle may transmit to a second UE of another vehicle in the platoon an indication of the detection and a request to exchange the respective roles in the platoon. 
The PCS may also monitor the conditions of the first and the second UEs, and arrange for the platoon reorganization. WHAT IS CLAIMED IS: \n | 1. A source user equipment (UE) for wireless communications, comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: detect a potential spoofing event associated with location information being altered in an unauthorized manner; and transmit, in response to the detected potential spoofing event, a request to a platoon control system (PCS) to join a vehicle platoon, wherein the request includes an indication of the detected potential spoofing event.\n | 2. The source UE of claim 1, wherein the request to the PCS comprises a confidence metric that indicates a probability that the source UE is receiving spoofed location information.\n | 3. The source UE of claim 2, wherein the processor and the memory are configured to detect the potential spoofing event by detecting that the confidence metric is above a threshold value.\n | 4. The source UE of claim 2, wherein the confidence metric indicates one of a plurality of levels of accuracy of a corresponding level of certainty of the potential spoofing event, and wherein a threshold value corresponds to a predefined level of accuracy.\n | 5. The source UE of claim 2, wherein the processor and the memory are configured to detect the potential spoofing event by receiving one or more signals from at least one of a network entity or a second UE in one or more basic safety messages (BSMs).\n | 6. The source UE of claim 5, wherein the confidence metric is determined by comparing at least one characteristic indicated by the one or more signals and a characteristic indicated by received location information.\n | 7. 
The source UE of claim 2, wherein the processor and the memory are configured to detect the potential spoofing event by measuring, using at least one onboard sensor independent from the location information, a movement attribute of the source UE to examine a validity of the location information.\n | 8. The source UE of claim 2, wherein the request further comprises at least one of: a vehicle identifier, destination information, or a source positioning location.\n | 9. The source UE of claim 1, wherein the request further indicates at least one of: an occupancy parameter of a vehicle associated with the source UE; an autonomy level of a vehicle associated with the UE; or a travel preference parameter.\n | 10. The source UE of claim 1, wherein the processor and the memory are further configured to: receive a response indicating confirmation that the source UE is allowed to join a vehicle platoon assigned by the PCS; receive an invitation corresponding to the confirmation from a lead UE of a lead vehicle of the vehicle platoon; and abstain from transmitting vehicle-to-everything (V2X) messages upon receiving the response.\n | 11. The source UE of claim 1, wherein the processor and the memory are further configured to: receive an alert notice from the PCS when the PCS does not have an available vehicle platoon to assign, wherein the alert notice comprises alert messages requesting manual control.\n | 12. A network entity for wireless communications, comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: receive a request from a user equipment (UE), the request triggered by a detection of a potential spoofing event at the UE; and transmit, to the UE, an assignment of a vehicle platoon for the UE to join based on the request.\n | 13. 
The network entity of claim 12, wherein the request further includes a confidence metric that indicates a probability that the UE is receiving spoofed location information of the potential spoofing event.\n | 14. The network entity of claim 13, wherein the request comprises at least one of: a vehicle identifier, destination information, a source positioning location, or the confidence metric.\n | 15. The network entity of claim 12, wherein the assignment has a higher priority when the UE is associated with an autonomous vehicle than when the UE is associated with a non-autonomous vehicle.\n | 16. The network entity of claim 12, wherein the processor and the memory are configured to: transmit, to at least one platoon UE of a corresponding vehicle in the vehicle platoon, an instruction for the at least one platoon UE to transmit a beacon to the UE, wherein the beacon is to be measured by the UE.\n | 17. The network entity of claim 12, wherein the processor and the memory are configured to: transmit, to a roadside unit (RSU), an instruction for the RSU to measure a location of the UE for comparison with location information therein and assessment of the potential spoofing event; and confirm the assignment of the vehicle platoon based on the location measured by the RSU.\n | 18. A first user equipment (UE), comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: detect a potential spoofing event associated with location information being altered in an unauthorized manner; transmit, to a second UE, an indication of the detection of the potential spoofing event, wherein the first UE and the second UE are associated with vehicles in a platoon; and transmit, to the second UE, a request to exchange a role of a vehicle corresponding to the first UE in the platoon with a role of a vehicle corresponding to the second UE in the platoon.\n | 19. 
The first UE of claim 18, wherein the indication is carried in one or more basic safety messages (BSMs).\n | 20. The first UE of claim 18, wherein the role of the vehicle corresponding to the first UE is a lead vehicle managing UEs of other vehicles in the platoon, and wherein the role of the vehicle corresponding to the second UE is a secondary vehicle managed by the lead vehicle.\n | 21. The first UE of claim 18, wherein the processor and the memory are configured to: detect the potential spoofing event by determining a first confidence metric of the first UE, the first confidence metric associated with a position accuracy based on a verification of global navigation satellite system (GNSS) position information received at the first UE.\n | 22. The first UE of claim 21, wherein the processor and the memory are further configured to: detect the potential spoofing event by determining that the first confidence metric indicating a probability that the first UE is receiving spoofed location information is above a threshold value.\n | 23. The first UE of claim 21, wherein the processor and the memory are further configured to: receive from at least the second UE, data of sensors thereof, wherein the verification of the GNSS position information is based on the data of sensors.\n | 24. The first UE of claim 21, wherein the processor and the memory are further configured to: receive data from a roadside unit (RSU), wherein the verification of the GNSS position information is further based on the data of the RSU.\n | 25. The first UE of claim 21, wherein the processor and the memory are further configured to: transmit, an indication of the potential spoofing event, to a platoon control system (PCS) in control of the platoon when the confidence metric is above a threshold value.\n | 26. 
The first UE of claim 21, wherein the processor and the memory are further configured to: request, from the second UE in the platoon, a second confidence metric of the second UE, the second confidence metric indicating a probability that the second UE is receiving spoofed location information, wherein transmitting the request to exchange roles in the platoon is based on the second confidence metric being below a threshold value and indicating an absence of spoofing attack to the second UE; and transmit, to the second UE in the platoon, an indication for the second UE to leave the platoon based on the second confidence metric being greater than or equal to the threshold value.\n | 27. An apparatus for wireless communications, comprising: a memory; and a processor coupled with the memory, the processor and the memory configured to: receive an indication from a first user equipment (UE) of a first vehicle in a vehicle platoon, the indication triggered by the first UE detecting a first potential spoofing event associated with location information being altered in an unauthorized manner; and transmit, to a second UE in the vehicle platoon, an indication for the second UE to assume functionalities performed by the first UE in the vehicle platoon.\n | 28. The apparatus of claim 27, wherein the first UE is a lead UE configured to perform functionalities including management of other UEs in the vehicle platoon.\n | 29. The apparatus of claim 27, wherein the second UE and the first UE are in sidelink communication, and the second UE is managed by the first UE before the indication of the first potential spoofing event.\n | 30. The apparatus of claim 27, wherein the first potential spoofing event is determined based on a determination that a first confidence metric of the first UE indicating a probability that the first UE is receiving spoofed location information is above a threshold value.", "output": "The equipment has a processor coupled with a memory. 
The processor detects a potential spoofing event associated with location information altered in an unauthorized manner and transmits a request to a platoon control system (PCS) to join a vehicle platoon in response to the detected potential spoofing event, where the request includes an indication of the detected potential spoofing event and the confidence metric indicates levels of accuracy of a corresponding level of certainty of the potential spoofing event. The processor receives a response indicating confirmation that the source user equipment is allowed to join a vehicle platoon assigned by the PCS. INDEPENDENT CLAIMS are included for:(1) a network entity for performing wireless communication for coordinating vehicle platooning;(2) an apparatus for performing wireless communication for coordinating vehicle platooning. Source user equipment e.g. mobile station for performing wireless communication for coordinating vehicle platooning for use in telecommunication services e.g. telephony. Uses include but are not limited to video, data, messaging, broadcasts, a terminal, an access terminal, a subscriber unit, a station, a customer premises equipment, a cellular phone, an intelligent phone, a personal digital assistant (PDA), a wireless modem, a wireless communication device, a handheld device, a laptop computer, a cordless phone, a wireless local loop (WLL) station, a tablet computer, a camera, a gaming device, a netbook, an intelligent book, an ultrabook and a medical device. The equipment enhances vehicle operations safety using coordinated vehicle platooning or platooning safety from location spoofing attacks or attempts to alter location information in unauthorized manners. The equipment realizes improved spectral efficiency, reduced operation cost and increased reliability; it maintains a minimal distance or headway between moving vehicles at high speeds and avoids use of potentially spoofed location information. 
The drawing shows a schematic view of a source user equipment. 400Vehicle-to-everything system 402Vehicle 406Wireless communication link 408Vehicle-to-vehicle interface 410Roadside service unit ", "instruction": "Please summarize the input" }, { "input": "Method and apparatus for vehicle steering plan and messagingThe present invention provides techniques that may be implemented using various methods and/or devices in a vehicle to utilize vehicle external sensor data, vehicle internal sensor data, vehicle capability and external V2X input to determine, transmit; receiving and using the V2X information and control data sent between the vehicle and the roadside unit (RSU), so as to determine the intersection entrance and the vehicle behavior at the near intersection.|1. A method for entering a crossroad of a self-control vehicle, comprising: based on vehicle external sensor, vehicle internal sensor, vehicle capacity or external V2X input or a combination thereof to determine the braking distance of the self-control vehicle; sending a first message from the autonomous vehicle, wherein the first message comprises identification data element or vehicle type or vehicle priority or a combination thereof for the self-control vehicle and a braking distance data element for the self-control vehicle; receiving a second message from the roadside unit RSU at least partially based on the braking distance for the autonomous vehicle; the second message comprises one or more instructions related to the intersection of the autonomous vehicle; and controlling the intersection entry of the self-control vehicle in response to the one or more instructions received from the RSU.\n | 2. The method according to claim 1, further comprising sending a third message from the self-control vehicle to the RSU prior to the second message to request entry of a crossroad.\n | 3. 
The method of entering a crossroad according to claim 1, wherein the braking distance for the self-control vehicle is determined based at least in part on the speed of the self-control vehicle.\n | 4. The method for entering a crossroad according to claim 3, wherein the braking distance for the self-control vehicle is determined based at least in part on a tire pressure or weather condition or tire traction data for the self-control vehicle or a combination thereof.\n | 5. The method according to claim 1, wherein the braking distance for the self-control vehicle is shorter in the autonomous mode than in the manual mode.\n | 6. The method of entering a crossroad according to claim 1, wherein the first message is a broadcast message.\n | 7. The method according to claim 1, wherein the first message is a peer-to-peer message.\n | 8. The intersection entry method according to claim 1, wherein the first message is a basic security message or a cooperative awareness message.\n | 9. A self-control vehicle, comprising: one or more wireless transceivers; a vehicle internal sensor; a vehicle external sensor; a memory; and one or more processors, the one or more processors communicatively coupled to the one or more wireless transceivers, the vehicle internal sensor, the vehicle external sensor and the memory; wherein the one or more processors are configured to: determine a braking distance for the autonomous vehicle based on the vehicle external sensor, the vehicle internal sensor, vehicle capacity or external V2X input or a combination thereof; send a first message from the one or more transceivers, wherein the first message comprises identification data element or vehicle type or vehicle priority or a combination thereof for the self-control vehicle and a braking distance data element for the self-control vehicle; receive, at the one or more wireless transceivers, a second message from the roadside unit RSU based at least in part on the braking distance of the self-control vehicle, 
the second message comprising one or more instructions related to the intersection entry of the self-control vehicle; and control the intersection entry of the self-control vehicle in response to the one or more instructions received from the RSU.\n | 10. The self-control vehicle according to claim 9, wherein the one or more processors are further configured to send a third message from the one or more wireless transceivers to the RSU prior to the second message to request entry of a crossroad.\n | 11. The self-control vehicle according to claim 9, wherein the braking distance for the self-control vehicle is determined based at least in part on the speed of the self-control vehicle and empirical stopping distance data associated with the speed of the self-control vehicle.\n | 12. The self-control vehicle according to claim 11, wherein the braking distance for the self-control vehicle is determined based at least in part on a tire pressure or weather condition or tire traction data for the self-control vehicle or a combination thereof.\n | 13. The self-control vehicle according to claim 9, wherein the braking distance for the self-control vehicle is shorter in the autonomous mode than in the manual mode.\n | 14. The self-control vehicle according to claim 9, wherein the first message is a broadcast message.\n | 15. The self-control vehicle according to claim 9, wherein the first message is a peer-to-peer message.\n | 16. The self-control vehicle according to claim 9, wherein the first message is a basic security message or a cooperative sensing message.\n | 17. 
A self-control vehicle, comprising: means for determining a braking distance for the autonomous vehicle based on a vehicle external sensor, a vehicle internal sensor, a vehicle capability or an external V2X input, or a combination thereof; means for sending a first message from the self-control vehicle, wherein the first message comprises identification data element or vehicle type or vehicle priority or a combination thereof for the self-control vehicle and a braking distance data element for the self-control vehicle; means for receiving a second message from a roadside unit RSU based at least in part on the braking distance for the self-control vehicle, the second message comprising one or more instructions for entry at a crossroad of the self-control vehicle; and means for controlling the crossroad entry of the self-control vehicle in response to the one or more instructions received from the RSU.\n | 18. The self-control vehicle according to claim 17, further comprising means for sending a third message from the self-control vehicle to the RSU prior to the second message to request entry of a crossroad.\n | 19. The self-control vehicle according to claim 17, wherein the first message is a broadcast message.\n | 20. The self-control vehicle according to claim 17, wherein the first message is a peer-to-peer message.\n | 21. The self-control vehicle according to claim 17, wherein the first message is a basic security message or a cooperative sensing message.\n | 22. 
A non-transitory computer-readable medium having stored thereon computer-readable instructions for causing one or more processors on a self-control vehicle to perform the following operations: based on vehicle external sensor, vehicle internal sensor, vehicle capacity or external V2X input or a combination thereof to determine the braking distance of the self-control vehicle; sending a first message from the autonomous vehicle, wherein the first message comprises identification data element or vehicle type or vehicle priority or a combination thereof for the self-control vehicle and a braking distance data element for the self-control vehicle; receiving a second message from the roadside unit RSU at least partially based on the braking distance for the autonomous vehicle; the second message comprises one or more instructions related to the intersection of the autonomous vehicle; and controlling the intersection entry of the self-control vehicle in response to the one or more instructions received from the RSU.\n | 23. The non-transitory computer-readable medium according to claim 22, further comprising instructions that cause the one or more processors to send a third message to the RSU prior to the second message to request entry of a crossroad.\n | 24. The non-transitory computer readable medium according to claim 22, wherein the first message is a broadcast message.\n | 25. The non-transitory computer-readable medium according to claim 22, wherein the first message is a peer-to-peer message.\n | 26. The non-transitory computer-readable medium according to claim 22, wherein the first message is a basic security message or a cooperative awareness message.", "output": "The method involves receiving a first message from a first vehicle at an ego vehicle. 
The first message includes an identification data element for the first vehicle, an autonomous vehicle status data element for the first vehicle or a braking distance data element for the first vehicle, or a combination thereof. A second message is received from a second vehicle at the ego vehicle. The second message comprises an identification data element for the second vehicle. A target space is determined based upon a size of the ego vehicle, the autonomous vehicle status data element for the first vehicle, the autonomous vehicle status data element for the second vehicle, the braking distance data element for the first vehicle or the braking distance data element for the second vehicle, or a combination thereof. An INDEPENDENT CLAIM is included for an ego vehicle with wireless transceivers. Method for messaging an automotive device to facilitate maneuvering of an ego vehicle (claimed). Method for messaging an automotive device to facilitate vehicle maneuvering increases vehicle turning radius at current speed and reduces maneuverability at current speed, and avoids collisions during an emergency stop of the vehicles. The drawing shows a block diagram of a device for determination and communication of a Vehicle-to-everything (V2X) capability data element value based on vehicle internal and external sensors. 
100Vehicle external sensors110Vehicle internal sensors120Vehicle capabilities910Processor", "instruction": "Please summarize the input" }, { "input": "Methods and systems for managing interactions between vehicles with varying levels of autonomyMethods, devices and systems enable controlling an autonomous vehicle by identifying a vehicle that is within a threshold distance of the autonomous vehicle, determining an autonomous capability metric (ACM) the identified vehicle, determining whether the ACM of the identified vehicle is greater than a first threshold, determining whether the ACM of the identified vehicle is less than a second threshold, and adjusting a driving parameter of the autonomous vehicle so that the autonomous vehicle is more or less reliant on the capabilities of the identified vehicle based on whether the ACM of the identified vehicle exceeds the thresholds.What is claimed is: \n | 1. A method of controlling an autonomous vehicle, comprising: \nidentifying, via a processor of the autonomous vehicle, a vehicle that is within a threshold distance of the autonomous vehicle; \ndetermining an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications; \ndetermining whether the ACM of the identified vehicle is greater than a first threshold; and \nadjusting a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold. \n | 2. 
The method of claim 1, wherein adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle exceeds the first threshold comprises decreasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.\n | 3. The method of claim 1, further comprising: \ndetermining whether the ACM of the identified vehicle is less than a second threshold in response to determining that the ACM of the identified vehicle is not greater than the first threshold; and \nadjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold. \n | 4. The method of claim 3, wherein adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold comprises increasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold.\n | 5. The method of claim 1, wherein identifying the vehicle that is within the threshold distance of the autonomous vehicle comprises identifying a vehicle that is in front of the autonomous vehicle and within the threshold distance of the autonomous vehicle.\n | 6. 
The method of claim 1, wherein determining the ACM of the identified vehicle comprises determining a value that identifies: \na current level of autonomy of the identified vehicle; \nan autonomous capability of the identified vehicle; or \nwhether the identified vehicle includes an advanced autonomous control system. \n | 7. The method of claim 1, wherein: \ndetermining whether the ACM of the identified vehicle is greater than the first threshold comprises applying the plurality of values to a plurality of decision nodes that each evaluate a different feature, factor or data point. \n | 8. The method of claim 7, wherein applying the plurality of values to the plurality of decision nodes that each evaluate the different feature, factor or data point comprises applying one or more of the plurality of values to a decision node that evaluates: \nwhether vehicle-to-vehicle (V2V) communication circuitry is present in the identified vehicle; \nwhether an accuracy range of a sensor in the identified vehicle is greater than a threshold value; or \nwhether a thickness of each brake pad in the identified vehicle exceeds a threshold thickness of friction material. \n | 9. 
A processor for an autonomous vehicle, wherein the processor is configured with processor executable instructions to: \nidentify a vehicle that is within a threshold distance of the autonomous vehicle; \ndetermine an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications; \ndetermine whether the ACM of the identified vehicle is greater than a first threshold; and \nadjust a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold. \n | 10. The processor of claim 9, wherein the processor is configured with processor executable instructions to adjust the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle exceeds the first threshold by decreasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.\n | 11. The processor of claim 9, wherein the processor is further configured with processor executable instructions to: \ndetermine whether the ACM of the identified vehicle is less than a second threshold in response to determining that the ACM of the identified vehicle is not greater than the first threshold; and \nadjust the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold. \n | 12. 
The processor of claim 11, wherein the processor is configured with processor executable instructions to adjust the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold by increasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold.\n | 13. The processor of claim 9, wherein the processor is further configured with processor executable instructions to identify the vehicle that is within the threshold distance of the autonomous vehicle by identifying a vehicle that is in front of the autonomous vehicle and within the threshold distance of the autonomous vehicle.\n | 14. The processor of claim 9, wherein the processor is configured with processor executable instructions to determine the ACM of the identified vehicle by determining a value that identifies: \na current level of autonomy of the identified vehicle; \nan autonomous capability of the identified vehicle; or \nwhether the identified vehicle includes an advanced autonomous control system. \n | 15. The processor of claim 9, wherein the processor is configured with processor executable instructions to: \ndetermine whether the ACM of the identified vehicle is greater than the first threshold by applying the plurality of values to a plurality of decision nodes that each evaluate a different feature, factor or data point. \n | 16. 
The processor of claim 15, wherein the processor is configured with processor executable instructions to apply the plurality of values to the plurality of decision nodes that each evaluate the different feature, factor or data point by applying one or more of the plurality of values to a decision node that evaluates: \nwhether vehicle-to-vehicle (V2V) communication circuitry is present in the identified vehicle; \nwhether an accuracy range of a sensor in the identified vehicle is greater than a threshold value; or \nwhether a thickness of each brake pad in the identified vehicle exceeds a threshold thickness of friction material. \n | 17. A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of an autonomous vehicle to perform operations comprising: \nidentifying a vehicle that is within a threshold distance of the autonomous vehicle; \ndetermining an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications; \ndetermining whether the ACM of the identified vehicle is greater than a first threshold; and \nadjusting a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold. \n | 18. 
The non-transitory processor-readable storage medium of claim 17, wherein the stored processor-executable instructions are configured to cause the processor of the autonomous vehicle to perform the operations such that adjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle exceeds the first threshold comprises decreasing a minimum following distance to be maintained between the autonomous vehicle and the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.\n | 19. The non-transitory processor-readable storage medium of claim 17, wherein the stored processor-executable instructions are configured to cause the processor of the autonomous vehicle to perform operations further comprising: \ndetermining whether the ACM of the identified vehicle is less than a second threshold in response to determining that the ACM of the identified vehicle is not greater than the first threshold; and \nadjusting the driving parameter of the autonomous vehicle based on the capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is not greater than the first threshold and is less than the second threshold. \n | 20. 
An autonomous vehicle, comprising: \nmeans for identifying a vehicle that is within a threshold distance of the autonomous vehicle; \nmeans for determining an autonomous capability metric (ACM) of the identified vehicle, wherein the ACM is a vector data structure including a plurality of values each representing a capability of the identified vehicle, the ACM being dynamically determined based on real-time data from the identified vehicle and certificates received via cellular vehicle-to-everything (C-V2X) communications; \nmeans for determining whether the ACM of the identified vehicle is greater than a first threshold; and \nmeans for adjusting a driving parameter of the autonomous vehicle based on capabilities of the identified vehicle in response to determining that the ACM of the identified vehicle is greater than the first threshold.", "output": "The method (1100) involves identifying (902) a vehicle that is within a threshold distance of an autonomous vehicle by a processor of the autonomous vehicle. An autonomous capability metric (ACM) of the identified vehicle is determined (1104). A determination is made (1106) to check whether the ACM is greater than a first threshold. A driving parameter of the autonomous vehicle is adjusted (1108) based on capabilities of the identified vehicle in response to determining that the determined ACM exceeds the first threshold by decreasing a minimum following distance to be maintained between the vehicle and an identified vehicle e.g. car. The vehicle is in front of and within the threshold distance. INDEPENDENT CLAIMS are included for: (1) a processor for an autonomous vehicle; (2) a non-transitory processor-readable storage medium for storing processor-executable instructions; (3) an autonomous vehicle comprises a unit for identifying a vehicle that is within a threshold distance of the autonomous vehicle. Method for controlling an autonomous vehicle, such as a car. 
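The two-threshold rule described above (decrease the minimum following distance when the ACM exceeds the first threshold, increase it when the ACM is not greater than the first threshold and is below the second) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the scalar score, threshold values and scaling factors are assumptions, and the claimed vector-valued ACM would first have to be reduced to such a comparable score.

```python
def adjust_following_distance(acm_score, base_distance_m,
                              first_threshold=0.8, second_threshold=0.4):
    """Sketch of the claimed two-threshold adjustment.

    A high ACM (above the first threshold) indicates a highly capable
    vehicle, so the minimum following distance may be decreased; a low
    ACM (below the second threshold) indicates a poorly capable vehicle,
    so the distance is increased; otherwise the default gap is kept.
    """
    if acm_score > first_threshold:
        return base_distance_m * 0.8   # decrease the minimum following distance
    if acm_score < second_threshold:
        return base_distance_m * 1.5   # increase the minimum following distance
    return base_distance_m             # between the thresholds: keep the default
```

The 0.8 and 1.5 factors stand in for whatever capability-dependent adjustment the controller would actually apply.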
The method enables utilizing vehicle-based communications for safer and more efficient use of motor vehicles and transportation resources. The method allows the autonomous vehicle to determine the autonomous capability metric of the identified vehicles and adjust the driving parameter of the autonomous vehicles based on the determined autonomous capability metrics of the vehicles, thus improving safety and performance of the vehicle in an efficient manner. The drawing shows a flow chart of the method for controlling an autonomous vehicle.902Identifying a vehicle that is within a threshold distance of the autonomous vehicle 1100Method for controlling an autonomous vehicle 1104Determining ACM of the identified vehicle 1106Determining whether the ACM of the identified vehicle is greater than a first threshold 1108Adjusting a driving parameter of the autonomous vehicle ", "instruction": "Please summarize the input" }, { "input": "RESOURCE MANAGEMENT FOR COMMUNICATION AND SENSING SERVICESVarious aspects of the present disclosure generally relate to wireless communication. In some aspects, a first core network entity may receive a first request associated with initiation of a sensing service associated with a user equipment (UE). The first core network entity may receive, from at least one other core network entity and based at least in part on the first request, one or more sensing session parameters associated with the sensing service. The first core network entity may provide, to a second core network entity and based at least in part on the sensing session parameters, a second request to establish a virtual communication session with the UE, the request including one or more communication session parameters. Numerous other aspects are described.WHAT IS CLAIMED IS: \n | 1. 
A first core network entity, comprising: one or more memories; and one or more processors, coupled to the one or more memories, configured to: receive a first request associated with initiation of a sensing service associated with a user equipment (UE); receive, from at least one other core network entity and based at least in part on the first request, one or more sensing session parameters associated with the sensing service; and provide, to a second core network entity and based at least in part on the sensing session parameters, a second request to establish a virtual communication session with the UE, the second request including one or more communication session parameters.\n | 2. The first core network entity of claim 1, wherein the one or more processors are further configured to: map the one or more sensing session parameters to the one or more communication session parameters.\n | 3. The first core network entity of claim 1, wherein the one or more sensing session parameters include at least one of: a sensing type parameter, a range parameter, a range resolution parameter, a velocity parameter, a velocity resolution parameter, an azimuth field of view parameter, an angular resolution parameter, a maximum number of detected targets, a data rate parameter, or a latency parameter.\n | 4. The first core network entity of claim 1, wherein the one or more communication session parameters include at least one of: a quality of service parameter, a signal-to-interference-plus-noise ratio (SINR) parameter, a data rate parameter, or a latency parameter.\n | 5. The first core network entity of claim 1, wherein the one or more sensing session parameters are based at least in part on at least one of: subscription information associated with the UE and the sensing service, or policy information associated with the sensing service.\n | 6. 
The first core network entity of claim 1, wherein the first core network entity comprises a non-communication session management function (N-SMF) entity and the second core network entity comprises a session management function (SMF) entity.\n | 7. The first core network entity of claim 1, wherein the at least one other core network entity comprises at least one of: a non-communication policy control function (N-PCF) entity, or a unified data management (UDM) entity.\n | 8. The first core network entity of claim 1, wherein the first request is received from an access and mobility management function (AMF) entity.\n | 9. The first core network entity of claim 1, wherein the first request is associated with a sensing network slice that indicates the first request is for the sensing service.\n | 10. The first core network entity of claim 1, wherein the first request is associated with a data network name (DNN) or access point name (APN) that indicates the first request is for the sensing service.\n | 11. The first core network entity of claim 1, wherein the one or more processors, to receive the one or more sensing session parameters, are configured to: receive information indicating one or more policies for managing the sensing service.\n | 12. The first core network entity of claim 11, wherein the information indicating the one or more policies is based at least in part on information that identifies a location of the UE.\n | 13. The first core network entity of claim 11, wherein the information indicating the one or more policies is received from a non-communication policy control function (N-PCF) entity. \n | 14. 
The first core network entity of claim 1, wherein the one or more processors are further configured to: determine, based at least in part on the sensing session parameters, a communication service type for the virtual communication session; and indicate, to the second core network entity, that the virtual communication session is associated with the communication service type.\n | 15. The first core network entity of claim 14, wherein the communication service type is associated with at least one of: vehicle-to-everything (V2X) communications, or unmanned autonomous vehicle (UAV) communications.\n | 16. The first core network entity of claim 1, wherein the one or more processors are further configured to: provide, for a network node associated with the UE, embedded radio level operation configuration information in a sensing session specific container.\n | 17. The first core network entity of claim 1, wherein the one or more processors are further configured to: provide, to a network node, a third request to establish the sensing service between the network node and the UE.\n | 18. The first core network entity of claim 17, wherein the third request includes information indicating whether the sensing service is for a radar service or positioning service.\n | 19. The first core network entity of claim 17, wherein the third request includes information indicating the one or more sensing session parameters.\n | 20. The first core network entity of claim 17, wherein the third request includes information indicating whether the sensing service is for UE-based sensing or network node-based sensing.\n | 21. The first core network entity of claim 17, wherein the third request includes information indicating a priority associated with the sensing service.\n | 22. 
A first core network entity, comprising: one or more memories; and one or more processors, coupled to the one or more memories, configured to: receive a first request associated with a communication session associated with a user equipment (UE), wherein the first request is associated with one or more first communication session parameters; receive, from a second core network entity, a second request to establish a virtual communication session associated with the UE, wherein the second request is associated with one or more second communication session parameters, and wherein the virtual communication session corresponds to a sensing service; and provide, to a network node, information indicating the one or more first communication session parameters for the communication session and the one or more second communication session parameters for the virtual communication session.\n | 23. The first core network entity of claim 22, wherein the one or more processors are further configured to: receive, from the second core network entity and embedded in a sensing session specific container, radio level operation configuration information for the network node; and provide, to the network node, the radio level operation configuration information.\n | 24. The first core network entity of claim 22, wherein the first core network entity comprises a session management function (SMF) entity and the second core network entity comprises a non-communication session management function (N-SMF) entity.\n | 25. The first core network entity of claim 22, wherein the one or more processors are further configured to: provide, to the network node, sensing information indicating that the virtual communication session is for the sensing service, to be established between the network node and the UE.\n | 26. The first core network entity of claim 25, wherein the sensing information includes information indicating one or more sensing session parameters.\n | 27. 
A network node, comprising: one or more memories; and one or more processors, coupled to the one or more memories, configured to: receive, from a first core network entity, a first request associated with initiation of a sensing service associated with a user equipment (UE), wherein the first request includes information identifying one or more sensing session parameters associated with the sensing service; determine, based at least in part on the one or more sensing session parameters and one or more communication session parameters associated with a communication session between the UE and the network node, one or more resources for the sensing service; and transmit, to the UE, information identifying the one or more resources for the sensing service.\n | 28. The network node of claim 27, wherein the one or more processors are further configured to: receive, from a second core network entity, a second request including the one or more communication session parameters.\n | 29. The network node of claim 27, wherein the one or more processors, to determine the one or more resources, are configured to: determine a first portion of joint communication and sensing resources for the sensing service; determine a second portion of the joint communication and sensing resources for the communication session; and determine, as the one or more resources, the first portion of the joint communication and sensing resources.\n | 30. 
A method of wireless communication performed by a first core network entity, comprising: receiving a first request associated with initiation of a sensing service associated with a user equipment (UE); receiving, from at least one other core network entity and based at least in part on the first request, one or more sensing session parameters associated with the sensing service; and providing, to a second core network entity and based at least in part on the sensing session parameters, a second request to establish a virtual communication session with the UE, the second request including one or more communication session parameters.", "output": "The entity has a processor coupled to a memory configured to receive a first request associated with initiation of a sensing service associated with a user equipment (UE) (120) and sensing session parameters associated with the sensing service based on the first request, where the processor provides a second request including communication session parameters to establish a virtual communication session with the UE and the sensing session parameters include one of sensing type parameter, range parameter, range resolution parameter, velocity parameter, velocity resolution parameter, azimuth field of view parameter and angular resolution parameter and the communication session parameters include one of quality-of-service parameter, signal-to-interference-plus-noise ratio (SINR) parameter, data rate parameter or latency parameter. The processor maps the sensing session parameters to the communication session parameters. An INDEPENDENT CLAIM is also included for a method for performing wireless communication by a core network entity. Core network entity for performing wireless communication for facilitating resource management for communication and sensing services. Uses include but are not limited to telephony, video, data, messaging and broadcasts. 
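The mapping from sensing session parameters to communication session parameters recited in claim 2 (with the parameter lists of claims 3 and 4) could look like the following sketch. The field names, formulas and constants are illustrative assumptions; the claims deliberately leave the concrete mapping open.

```python
def map_sensing_to_comm(sensing):
    """Map sensing session parameters onto communication session
    parameters (illustrative sketch; the claims fix no mapping)."""
    comm = {}
    # A finer range resolution needs wider bandwidth, hence a higher data rate.
    comm["data_rate_mbps"] = max(10.0, 100.0 / sensing["range_resolution_m"])
    # Faster targets tolerate less measurement staleness, hence lower latency.
    comm["latency_ms"] = min(100.0, 1000.0 / max(sensing["velocity_mps"], 1.0))
    # A dedicated QoS class marks the virtual session as sensing traffic.
    comm["qos_class"] = "sensing"
    return comm
```

A session management function receiving such a request could then treat the derived values like ordinary QoS, data rate and latency parameters of a communication session.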
The entity effectively supports mobile broadband internet access by improving spectral efficiency, lowering costs, improving services, making use of new spectrum and/or providing new-radio (NR) services. The drawing shows a block diagram of a core network for facilitating resource management for communication and sensing services.100Wireless network120UE420Unified data repository455Message bus605Core network", "instruction": "Please summarize the input" }, { "input": "Location of a suspicious vehicle using V2X communication Location of a suspicious vehicle (SV), implemented within a detection entity (DE), such as another vehicle, the detection entity being distinct from the suspect vehicle, based on a V2X communication. FIG. 1|1. [Claim 1] Method for locating a suspect vehicle (SV), implemented within a detection entity (DE), the detection entity being distinct from the suspect vehicle, comprising the steps of:\n- reception (3) from a remote server (SRV) of a request for the location of the suspect vehicle, the request including an identification number of the suspect vehicle; \n- generation (5) of an identification request message; \n- transmission (7) of the message on a V2X network, that is to say by a direct link between the entity and the suspect vehicle; \n- upon receipt of a positive response from the suspect vehicle, transmission (17) to the remote server of geolocation information of the entity and/or the suspect vehicle. \n | 2. [Claim 2] Method according to claim 1, in which the suspect vehicle is geolocated and in which the identification request message further comprises a request for geolocation of the suspect vehicle, so that the positive response from the suspect vehicle further comprises the geolocation of the suspect vehicle.\n | 3. 
[Claim 3] Method according to one of the preceding claims, in which the detection entity is a motorized land vehicle and in which the suspect vehicle is geolocated, and further comprising a step of: - calculation (19) of a tracking route for the suspect vehicle by the detection vehicle, the tracking route being configured so that the detection vehicle remains within V2X range of the suspect vehicle while being out of visual range of the suspect vehicle.\n | 4. [Claim 4] Method according to claim 3, in which the ability to be out of visual range of the suspect vehicle is determined from at least one of the following elements: - a first predetermined distance; - a plurality of second predetermined distances, each second predetermined distance corresponding to a type of geographical area; - data acquired by a sensor of the detection vehicle;\n- data acquired by a sensor of the suspect vehicle. \n | 5. [Claim 5] Method according to one of claims 3 or 4, further comprising a step of: - display of the tracking route on a navigation aid system of the vehicle.\n | 6. [Claim 6] Method according to one of claims 3 to 5, further comprising a step of: - generation of an autonomous driving instruction configured for autonomous driving to follow the tracking route.\n | 7. [Claim 7] Method according to one of claims 1 or 2, in which the detection entity is a road infrastructure element.\n | 8. [Claim 8] Computer program comprising instructions for implementing the method according to any one of the preceding claims, when these instructions are executed by a processor (200).\n | 9. 
[Claim 9] Device for locating a suspect vehicle, included in a detection entity, the detection entity being distinct from the suspect vehicle, and comprising at least one memory and at least one processor arranged to perform the operations of: - reception from a remote server of a request for localization of the suspect vehicle, the request including an identification number of the suspect vehicle; - generation of an identification request message; - transmission of the message on a V2X network, that is to say by a direct link between the entity and the suspect vehicle; - upon receipt of a positive response from the suspect vehicle, transmission to the remote server of geolocation information of the entity and/or the suspect vehicle.\n | 10. [Claim 10] Motorized land vehicle, corresponding to the detection entity, and comprising the device according to claim 9.", "output": "The method involves receiving request for location of a suspect vehicle from a remote server (3), where the request includes an identification number of the suspect vehicle. An identification request message is generated (5). The message is transmitted (7) on a vehicle-to-everything network by a direct link between a detection entity and the suspect vehicle. Geolocation information of the entity and the suspect vehicle is transmitted (17) to the remote server upon receipt of a positive response from the suspect vehicle. A tracking route for the suspect vehicle is calculated (19) by the detection vehicle, while being out of visual range of the suspect vehicle. INDEPENDENT CLAIMS are also included for: a computer program comprising a set of instructions for locating a suspect vehicle implemented within a detection entity; a device for locating a suspect vehicle implemented within a detection entity; and a motorized land vehicle. Method for locating a suspect vehicle implemented within a detection entity i.e. road infrastructure element, of a motorized land vehicle (all claimed). 
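The flow of claims 1 and 9 (receive a request carrying an identification number, issue an identification request over the direct V2X link, and report geolocation to the server on a positive response) can be sketched as below. The message shapes are illustrative assumptions, not anything defined by the patent.

```python
def locate_suspect(request, v2x_responses):
    """One pass of the claimed flow inside the detection entity.

    v2x_responses maps a vehicle identification number to the positive
    response received over the direct V2X link (carrying geolocation,
    as in claim 2); the key is absent when no vehicle answered.
    """
    suspect_id = request["vehicle_id"]        # identification number from the server
    reply = v2x_responses.get(suspect_id)     # outcome of the identification request
    if reply is None:
        return None                           # no positive response: nothing to report
    return {"vehicle_id": suspect_id,         # geolocation forwarded to the remote server
            "position": reply["position"]}
```

Returning `None` corresponds to the suspect vehicle being out of V2X range of this detection entity.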
Uses include but are not limited to a motor vehicle, a moped, a motorcycle and a storage robot in a warehouse. The method enables locating the suspect vehicle within the detection entity, so that interactions of the location method with the suspect vehicle can be reduced to a strict minimum. The drawing shows a flowchart illustrating a method for locating a suspect vehicle implemented within a detection entity.3Step for receiving request for location of a suspect vehicle from a remote server5Step for generating identification request message7Step for transmitting message on a vehicle-to-everything network by a direct link between a detection entity and the suspect vehicle17Step for transmitting geolocation information of the entity and the suspect vehicle to the remote server upon receipt of a positive response from the suspect vehicle19Step for calculating tracking route for the suspect vehicle by the detection vehicle, while being out of visual range of the suspect vehicle", "instruction": "Please summarize the input" }, { "input": "Immobilization of a suspicious vehicle using V2X communication Method and device for immobilizing a suspect vehicle (SV), implemented within a detection entity (DE), the detection entity being distinct from the suspect vehicle, based on V2X communication. FIG. 1|1. Claims\n[Claim 1] Method for immobilizing a suspect vehicle (SV), implemented within a detection entity (DE), the detection entity being distinct from the suspect vehicle, comprising the steps of: - reception (3) from a remote server (SRV) of a request for immobilization of the suspect vehicle, the request including an identification number of the suspect vehicle; - transmission (7) to the suspect vehicle of an immobilization message, the immobilization message being configured so that the suspect vehicle is immobilized, the message being transmitted over a V2X network, that is to say by a direct link between the entity and the suspect vehicle. \n | 2. 
[Claim 2] Method according to claim 1, further comprising, still at the level of the detection entity, the steps of: - receipt (17) of an acknowledgment of receipt of immobilization from the suspect vehicle; - upon receipt of acknowledgment of receipt, transmission (17) to the remote server of immobilization information of the suspect vehicle.\n | 3. [Claim 3] Method according to one of the preceding claims, further comprising, at the level of the suspect vehicle, the step of: - upon receipt of the immobilization message, activation of a mode of inhibiting at least one function of a powertrain, the inhibition of the function being configured so that the suspect vehicle is immobilized.\n | 4. [Claim 4] Method according to one of the preceding claims, further comprising, at the level of the suspect vehicle, the step of: - upon receipt of the immobilization message, activation of a locking mode of an electronic trajectory corrector, the locking being configured so that the suspect vehicle is immobilized by activation of at least one brake of the vehicle.\n | 5. [Claim 5] Method according to one of the preceding claims, further comprising, at the level of the suspect vehicle, the step of: - upon receipt of the immobilization message, activation of an immediate stopping mode of at least one autonomous driving function; - upon activation of the immediate stop mode, generation of an autonomous driving instruction configured to make the suspect vehicle stop.\n | 6. [Claim 6] Computer program comprising instructions for implementing the method according to any one of claims 1 or 2, when these instructions are executed by a processor (200).\n | 7. 
[Claim 7] Device for locating a suspect vehicle, included in a detection entity, the detection entity being distinct from the suspect vehicle, and comprising at least one memory and at least one processor arranged to perform the operations of: - reception from a remote server of a request for immobilization of the suspect vehicle, the request including an identification number of the suspect vehicle; - transmission of an immobilization message to the suspect vehicle, the immobilization message being configured so that the suspect vehicle is immobilized, the message being transmitted over a V2X network, that is to say by a direct link between the entity and the suspect vehicle.\n | 8. [Claim 8] Motorized land vehicle, corresponding to the detection entity, and comprising the device according to claim 7.", "output": "The method involves performing reception (3) of request from a remote server (SRV) for immobilization of suspect vehicle. The request includes identification number of the suspect vehicle, and an immobilization message is transmitted (7) to the suspect vehicle in response to received request. The immobilization message is configured so that the suspect vehicle is immobilized, and the message is transmitted over V2X network via direct link between entity and suspect vehicle. INDEPENDENT CLAIMS are included for the following: a computer program comprising instructions for immobilizing suspect vehicle; a device for locating suspect vehicle; and a motorized land vehicle. Method for immobilizing suspect vehicle such as motorized land vehicle e.g. motor vehicle, moped, motorcycle. Can also be used in storage robot in warehouse. The stolen vehicle itself detects suspicious activity and transmits alert so that theft of vehicle is prevented. The interactions of immobilization method with the suspect vehicle are reduced to the strict minimum. The drawing shows a flow diagram illustrating the process for immobilizing suspect vehicle. 
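On the suspect-vehicle side, claims 1 to 3 amount to a small message handler: on a matching immobilization message received over the direct V2X link, inhibit a powertrain function so the vehicle is immobilized, then acknowledge. A minimal Python sketch with illustrative message fields, not the patent's implementation:

```python
class SuspectVehicleController:
    """Sketch of the receiving side of the claimed immobilization flow."""

    def __init__(self, vehicle_id):
        self.vehicle_id = vehicle_id
        self.powertrain_inhibited = False

    def on_v2x_message(self, message):
        # Ignore anything that is not an immobilization order for this vehicle.
        if message.get("type") != "IMMOBILIZE":
            return None
        if message.get("vehicle_id") != self.vehicle_id:
            return None
        # Claim 3: inhibit at least one powertrain function so the vehicle
        # is immobilized, then acknowledge (the receipt used in claim 2).
        self.powertrain_inhibited = True
        return {"type": "IMMOBILIZE_ACK", "vehicle_id": self.vehicle_id}
```

The returned acknowledgment is what the detection entity would forward to the remote server as immobilization information.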
(Drawing includes non-English language text) 1Step for generating request to immobilize suspect vehicle3Step for reception of request from remote server5Step for generating immobilization message7Step for transmitting immobilization message to suspect vehicle9Step for providing direct exchanges possible between entities", "instruction": "Please summarize the input" }, { "input": "Communication method for reinforcing the safety of autonomous driving of a vehicle, vehicle provided with means for implementing this method Communication method for strengthening the safety of autonomous driving of a vehicle (1), this method comprising an exchange of perception data which is limited to what is necessary. In this process, the V2X communication network is only used when the uncertainty on an environmental element (object external to the vehicle (1)) is too great. On the other hand, if the V2X communication network is requested, the corresponding perception request calls for a response which is limited by attributes specifically correlated to the environmental element. Vehicle (1) provided with means (2, 3, 4, 5, 7) for implementing this method. Figure 1|1. 
Claims\n[Claim 1] Communication method for strengthening the safety of autonomous driving of a vehicle (1), this method comprising a) a step of acquiring perception data on an environmental element of the vehicle (1), b) a step of determining an uncertainty on the environmental element, c) a step of evaluating the level of uncertainty by comparing this level of uncertainty to a safety threshold, d) a step of creating a perception request, in which the perception request calls for a response limited by attributes specifically correlated to the environment element, e) a step of transmitting this request to a V2X communication network, f) a step of receiving a response to the request from the V2X communication network, or taking into account the absence of a response to the request from the V2X communication network, g) a step of merging, in the event of a response, the collective perception data obtained in response to the perception request with at least some of the perception data initially obtained in step a), h) a step of returning to step b) to determine an uncertainty updated with the perception data obtained by the merging step of step g). \n | 2. [Claim 2] A method according to claim 1, wherein the attributes specifically correlated to the environment element are defined by a perception query identity, a relevance region and a type of the environment element.\n | 3. [Claim 3] Method according to one of the preceding claims in which step h) of returning to step b) is carried out as part of a fault detection, isolation and correction logic.\n | 4. [Claim 4] Method according to one of the preceding claims, in which the safety threshold is determined as a function of at least one of the following parameters: the severity of the situation, the recurrence of the situation and the controllability of the situation.\n | 5. 
[Claim 5] Method according to one of the preceding claims, in which step e) of transmitting the perception request to the V2X communication network is carried out before activating an autonomous driving process in minimal risk maneuvering mode.\n | 6. [Claim 6] Method according to one of the preceding claims, comprising, prior to step a), a step of filtering for relevant environmental elements.\n | 7. [Claim 7] Vehicle (1) equipped with a driving assistance system (2), sensors (3) configured to collect perception data, means for recording and storing perception data (4), means for calculating and processing perception data (5) configured to determine an uncertainty on an environmental element, and means for exchanging perception data (7) with a V2X network, in which at least some of the perception data are used to control the driving assistance system (2), characterized in that the means for exchanging perception data (7) are implemented in relation to the environmental element only if the uncertainty on this environmental element is greater than or equal to a predetermined safety threshold.\n | 8. [Claim 8] Vehicle according to claim 7, in which the means for calculating and processing perception data (5), as well as the means for exchanging perception data (7), are configured to transmit a perception request comprising attributes specifically correlated to the environmental element on which the uncertainty is greater than or equal to the predetermined safety threshold.\n | 9. [Claim 9] Computer program comprising program code instructions for executing the method according to one of claims 1 to 6, when said program is executed on a computer, and for triggering an action of said vehicle (1) depending on the updated uncertainty on the environmental element.\n | 10. 
[Claim 10] Distributed computer system comprising an on-board computer (5) for executing the program according to claim 9, as well as on-board data processing means (I) configured to process said perception request.", "output": "The method involves acquiring perception data on an environmental element of the vehicle (1). The uncertainty on the environmental element is determined. The level of uncertainty is evaluated by comparing the level of uncertainty to a safety threshold. A perception request, calling for a response limited by attributes specifically correlated to the environment element, is created. The request is transmitted to a V2X communication network. A response to the request is received from the V2X communication network, or the absence of a response from the V2X communication network is taken into account. The collective perception data is obtained in response to the perception request. The uncertainty updated with the perception data obtained by the fusion step is determined. INDEPENDENT CLAIMS are included for the following: a vehicle; a computer program for strengthening safety of autonomous driving of vehicle; and a distributed computer system for strengthening safety of autonomous driving of vehicle. Communication method for strengthening safety of autonomous driving of vehicle such as motor vehicle. The controllability is the ability of users to maintain control in the event of a failure and an indicator is identified based on knowledge of user behavior in the situation. The drawing shows a schematic view of the vehicle. 
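One pass of the claimed loop (steps b through g of claim 1) can be sketched as below. Inverse-variance weighting is just one possible choice for the merging step, and representing the uncertainty as a single number is an assumption; the claims leave both open.

```python
def refine_estimate(estimate, uncertainty, safety_threshold, query_v2x):
    """One pass of steps b)-g): solicit the V2X network only when the
    local uncertainty on the environment element is too high, then fuse."""
    if uncertainty < safety_threshold:
        return estimate, uncertainty          # step c): network not solicited
    response = query_v2x()                    # steps d)-e): perception request,
                                              # limited by correlated attributes
    if response is None:
        return estimate, uncertainty          # step f): absence of response
    remote_estimate, remote_uncertainty = response
    # Step g): inverse-variance weighted fusion of local and collective data.
    w_local = 1.0 / uncertainty
    w_remote = 1.0 / remote_uncertainty
    fused = (w_local * estimate + w_remote * remote_estimate) / (w_local + w_remote)
    return fused, 1.0 / (w_local + w_remote)  # updated uncertainty for step h)
```

The caller would loop on this function (step h) until the updated uncertainty falls below the safety threshold or a fallback such as a minimal risk maneuver is triggered.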
(Drawing includes non-English language text) 1Vehicle2Driving assistance system3Sensor4Perception data5Processing perception data", "instruction": "Please summarize the input" }, { "input": "Method and device for parking a motor vehicle A method of parking a motor vehicle comprises the steps of: acquisition (31) of the environment of a parking space, comprising a sub-step of detection (35) of at least one connected autonomous vehicle parked at the edge of the parking space; sending (37) of a movement command to the connected autonomous vehicle, said movement command comprising movement instructions making it possible to widen the parking space; execution (45) of a parking maneuver after widening the parking space. A parking device and a motor vehicle comprising the device are also described. Figure to be published with the abstract: Fig2|1. Claims\n[Claim 1] Method of parking a motor vehicle comprising the steps of: - acquisition (31) of the environment of a parking space, comprising a sub-step of detection (35) of at least one connected autonomous vehicle parked at the edge of the parking space; - sending (37) a movement command to the connected autonomous vehicle, said movement command comprising movement instructions making it possible to widen the parking space; - execution (45) of a parking maneuver after widening the parking space. \n | 2. [Claim 2] Method according to claim 1, in which the movement command is included in a DENM type V2X message.\n | 3. [Claim 3] Method according to claim 1 or 2, wherein the step of acquiring the environment comprises acquiring (39) the environment of the at least one parked autonomous vehicle.\n | 4. [Claim 4] Method according to claim 3, wherein the step of acquiring the environment of the at least one parked vehicle comprises receiving a message from said parked vehicle containing possible movement information.\n | 5.
[Claim 5] Method according to claim 3 or 4, in which the movement command is only sent after determining that the environment of the at least one parked autonomous vehicle allows movement to enlarge the parking space sufficiently to allow parking of the vehicle.\n | 6. [Claim 6] Method according to one of claims 1 to 5, further comprising a step of sending an information message to the owner of the parked autonomous vehicle.\n | 7. [Claim 7] Method according to any one of the preceding claims, comprising a preliminary step of determining the size of the parking space and comparing it with the minimum size necessary to allow parking of the vehicle.\n | 8. [Claim 8] Device for parking a motor vehicle (1) comprising: - means of acquiring (5) the environment of a parking space, comprising detection of at least one connected autonomous vehicle parked at the edge of the parking space;\n- a transmitter (7) of a movement command to the connected autonomous vehicle, said movement command comprising movement instructions making it possible to widen the parking space; \n- a controller (3) adapted to control a parking maneuver after widening the parking space. \n | 9. [Claim 9]\nMotor vehicle comprising a device according to claim 8. \n | 10. [Claim 10] Computer program product downloadable from a communications network and/or recorded on a computer-readable medium and/or executable by a processor, characterized in that it comprises program code instructions for implementing the method according to at least one of claims 1 to 7.", "output": "The method involves acquiring (31) environment of parking space, and performing sub-step detection (35) of at least one connected autonomous vehicle parked at edge of the parking space. A movement command is sent (37) to the vehicle, which comprises movement instructions to widen the parking space, and a parking maneuver is executed (45) after widening the parking space.
The movement command is included in DENM type V2X message, and the environment of the parked autonomous vehicle is acquired by receiving message from parked vehicle containing possible movement information. INDEPENDENT CLAIMS are included for the following:a device for parking motor vehicle; anda computer program product for parking motor vehicle. Method for parking motor vehicle (claimed) such as car and van. The available location calculation is carried out on sides of vehicles and the widening maneuver by parked vehicles is improved by automation of autonomous vehicles. The drawing shows a flow diagram illustrating the process for parking motor vehicle. 31Step for acquiring environment of parking space35Step for detecting connected autonomous vehicle parked at edge of parking space37Step for sending movement command to vehicle39Step for determining whether movement is possible45Step for executing parking maneuver
Claims\n[Claim 1] Method for securing a vehicle (10), said vehicle (10) being configured to travel in an autonomous driving mode, said method comprising the following steps: - determination (21), by a communication infrastructure control device (1), of a current state of said vehicle (10) representative of a stop of said vehicle on a road; - transmission (22), by said control device, of first data representative of a first request intended for said vehicle (10), said first request requiring a response from a driver of said vehicle (10), said transmission being carried out according to a vehicle-to-infrastructure communication mode, known as V2I; - transmission (23), by said control device according to said V2I communication mode, of second data representative of at least one driving instruction towards a safety position intended for said vehicle (10) in the event of non-response from said driver to said first request. \n | 2. [Claim 2] Method according to claim 1, for which said second data correspond to: - data representative of coordinates of a geolocation system corresponding to said safety position to be reached; and/or - data representative of at least one image of said safety position to be reached; and/or - data representative of a voice command to be given in said vehicle (10) to control said vehicle (10) towards said safety position to be reached.\n | 3. [Claim 3] Method according to claim 1 or 2, further comprising a step of transmission (36), by said control device according to said V2I communication mode, of a second request intended for said vehicle (10), said second request requesting a passage of said vehicle (10) from a current level of autonomy to a determined level of autonomy, said determined level of autonomy being greater than said current level of autonomy.\n | 4.
[Claim 4] Method according to any one of Claims 1 to 3, for which the said first data correspond to: - data representative of a voice message to be delivered in said vehicle (10) requiring an action from said driver in response to said voice message; and/or - data representing graphic content to be displayed on a display screen on board said vehicle (10) requiring an action from said driver in response to said displayed graphic content; and/or - data representative of an alarm to be returned by an alarm system on board said vehicle and requiring an action from said driver in response to said alarm.\n | 5. [Claim 5] Method according to any one of claims 1 to 4, further comprising a step of transmission, by said control device according to said V2I communication mode, of at least one command intended for at least one control system of said vehicle (10) in the event of non-execution of said at least one driving instruction by said vehicle (10).\n | 6. [Claim 6] Method according to claim 5, for which said at least one command belongs to a set of commands comprising: - a command to start said vehicle; - a stop command of said vehicle; - a speed control of said vehicle; - a steering control of said vehicle; - a braking control of said vehicle; and - a path control of said vehicle.\n | 7. [Claim 7] Method according to one of claims 1 to 6, for which said current state of said vehicle (10) is determined from information representative of the environment of said vehicle (10).\n | 8. [Claim 8] Method according to claim 7, for which said information representative of the environment is obtained by said control device from: - at least one sensor (112) of said communication infrastructure (1) configured to acquire data representative of said environment; and/or - said vehicle according to said V2I communication mode; and/or - at least one other vehicle (11) according to said V2I communication mode.\n | 9.
[Claim 9] Device (4) for securing a vehicle, said device (4) comprising a memory (41) associated with at least one processor (40) configured for the implementation of the steps of the method according to any one of the claims 1 to 8.\n | 10. [Claim 10] System comprising the device (4) according to claim 9 and at least one vehicle (10) configured to travel in an autonomous driving mode.", "output": "The method involves configuring vehicle (10) to travel in an autonomous driving mode. The current state of vehicle representative of a stop of vehicle on a road is determined by a communication infrastructure control device (1). The first data representative of a first request intended for vehicle is transmitted by a control device, where the first request requires a response from a driver of vehicle and the transmission works according to a vehicle-to-infrastructure communication mode. The second data representative of driving instruction is transmitted towards a safety position intended for vehicle in the event of non-response from driver to first request, by the control device according to the vehicle-to-infrastructure communication mode. INDEPENDENT CLAIMS are included for the following:a device for securing vehicle; anda system comprising device for securing vehicle. Method for securing vehicle e.g. autonomous vehicle such as motor vehicle, and land vehicle such as truck, bus and motorcycle. The safety of vehicles on the roads is improved. The vehicle is secured, and the safety of vehicles and passengers is increased. The risk of collision with another vehicle is avoided, and the risk of accident or additional accident linked to the presence of stationary vehicle on the road is reduced. The drawing shows a schematic view of communication infrastructure and vehicle. 
1Communication infrastructure control device10Vehicle100Cloud of network110,111Communication devices112Camera", "instruction": "Please summarize the input" }, { "input": "Method for managing a convoy comprising at least two motor vehicles in an autonomous driving mode The invention relates to a method for managing a convoy grouping together at least two motor vehicles (1, 4, 5) in an autonomous driving mode, the convoy traveling on a road (3) and being formed by at least one lead vehicle (1) and at least one follower vehicle (4, 5), the method comprising the following steps: - determining, according to a programmed route, if the leading vehicle (1) must change direction and if the time or the distance remaining before a planned change of direction is less than a time or distance threshold ; - send an information message to the vehicles (4, 5, 6) located near the lead vehicle (1) indicating that a change of direction of the lead vehicle (1) must occur soon. Figure for the abstract: Fig. 3|1. Claims\n[Claim 1] Method for managing a convoy grouping together at least two motor vehicles (1, 4, 5) in an autonomous driving mode, the convoy traveling on a road (3) and being formed by at least one leading vehicle (1) and at least one follower vehicle (4, 5), the method comprising the following steps: - determining (40), according to a programmed route, if the leading vehicle (1) must change direction and if the time or the distance remaining before a planned change of direction is less than a time threshold or distance; - Transmit (42) an information message to vehicles (4, 5, 6) located close to the lead vehicle (1) indicating that a change of direction of the lead vehicle (1) is due soon. \n | 2. 
[Claim 2] Method according to the preceding claim, in which, when the information message is received by a follower vehicle (4, 5) of the convoy, this vehicle determines (44) whether the change of direction of the lead vehicle (1) is compatible with a programmed route of the following vehicle (4, 5).\n | 3. [Claim 3] Method according to the preceding claim, in which, if it is determined that the programmed route of a following vehicle is incompatible with the change of direction of the leading vehicle, the following vehicle performs (46) at least one of the following actions: - search for a compatible lead vehicle; - maintenance of autonomous driving mode; - stopping autonomous driving mode.\n | 4. [Claim 4] Method according to the preceding claim, in which the stopping of the autonomous driving mode is preceded by an alert message sent to the attention of the driver of the following vehicle (4, 5) concerned, the message being, for example, of the visual and/or audible type.\n | 5. [Claim 5] Method according to one of the preceding claims, in which the step of transmitting (42) an information message is implemented by means of a wireless communication module on board the lead vehicle and compatible with the V2X standard.\n | 6. [Claim 6] Method according to the preceding claim, in which the wireless communication module on board the lead vehicle is compatible with one or more of the following wireless communication protocols: IEEE 802.11p, ETSI ITS-G5, Wi-Fi, Bluetooth, GSM 3G-4G-5G, C, LTE.\n | 7. [Claim 7] Method according to one of the preceding claims, in which the step of determining (40) whether the lead vehicle (1) must change direction is carried out by a computer on board the lead vehicle (1), as a function of information provided by a geolocation module (14) on board the lead vehicle (1).\n | 8.
[Claim 8] Method according to one of the preceding claims, in which the steps of determining (40) whether the lead vehicle (1) should change direction and of transmitting (42) an information message are implemented so that the information message can be transmitted before the leading vehicle (1) begins the change of direction maneuver.\n | 9. [Claim 9] Method according to the preceding claim, in which the duration between the transmission of an information message and the start of the maneuver to change direction of the leading vehicle (1) is greater than or equal to a threshold duration, the threshold duration being at least equal to 10 seconds.\n | 10. [Claim 10] Computer program product comprising instructions which, when the program is executed by one or more processor(s), cause the latter(s) to implement the steps of the method in accordance with one of Claims 1 to 9.\n1/2", "output": "The method involves determining if a leading vehicle (1) changes direction and if the time or distance remaining before a planned change of direction is less than a time or distance threshold. An information message is transmitted to vehicles (4-6) located close to the lead vehicle indicating that change of direction of the lead vehicle is due soon, where the vehicle determines whether the change of direction of the lead vehicle is compatible with a programmed route of the following vehicle when the information message is received by the follower vehicle of a convoy. The leading vehicle is compatible with wireless communication protocols e.g. IEEE 802.11p, ETSI ITS-G5, Wi-Fi, Bluetooth, GSM 3G-4G-5G and LTE protocols. An INDEPENDENT CLAIM is also included for a computer program product comprising a set of instructions for managing a convoy of motor vehicles. Method for managing a convoy of motor vehicles i.e. autonomous or partially autonomous motor vehicles, traveling on a road.
The method enables the convoy to handle, in complete safety, a situation in which the leading vehicle must change direction, so as to improve the management of driving in a convoy of autonomous vehicles. The drawing shows a schematic view of a portion of a road. 1Leading vehicle4-6Vehicles18Sensors30, 32Traffic lanes38Exit lane
Claims\n[Claim 1] A method of controlling an autonomous vehicle (10), said method comprising the following steps: - obtaining (51) first information representative of an environment (1) of the vehicle (10), said environment (1) comprising a set of elements (11, 12, 13, 14) comprising at least one element, at least a part of the first information being representative of said at least one element; - subdivision (52) of at least part of said environment into a plurality of cells (211 to 217, 221 to 225, 231 to 238); - for each cell, determination (53) of a value representative of a level of nuisance weighing on said autonomous vehicle (10) from said first information and second information representative of said autonomous vehicle (10), said determination of a value representative of a level of nuisance weighing on said autonomous vehicle (10) further being a function of third information representative of a level of nuisance associated with each element (11, 12) of at least one part of said set of elements; - control of said autonomous vehicle (10) according to said values representative of the level of nuisance. \n | 2. [Claim 2] A method according to claim 1, further comprising a step of representing a dynamic environment of said autonomous vehicle (10) based on said cells (211 to 217, 221 to 225, 231 to 238) and said values associated with said cells (211 to 217, 221 to 225, 231 to 238).\n | 3. [Claim 3] Method according to claim 1 or 2, for which said set of elements comprises at least one element among the following elements: - static object; and/or - moving object; and/or - floor markings; and/or - traffic information; and/or - signaling device; and/or - hole in the road.\n | 4.
[Claim 4] Method according to claim 1, for which said nuisance level is a function of: - a kinetic energy resulting in the event of a collision of said autonomous vehicle (10) with an element of said assembly; and/or - a braking force resulting from braking of said autonomous vehicle (10); and/or - a centrifugal force resulting from a change of direction of said autonomous vehicle (10); - a set of traffic rules; and/or - information representative of a determined path for said autonomous vehicle (10).\n | 5. [Claim 5] Method according to claim 1 or 2, for which said first information belongs to a set of information comprising: - position representative information; - information representative of a type of element; - representative size information; - kinematic information of the associated element; - information representative of meteorological conditions; - information representative of traffic rules; - information representative of trajectory; - traffic information; - information representative of traffic conditions.\n | 6. [Claim 6] Method according to any one of Claims 1 to 3, for which the said second information belongs to a set of information comprising: - information representative of position; - kinematic information of said autonomous vehicle; - information representative of the trajectory of said autonomous vehicle.\n | 7. [Claim 7] Method according to any one of claims 1 to 4, for which said first information is obtained from at least one sensor of a detection system on board said autonomous vehicle (10) and/or from at least one element of said assembly of elements (11, 12, 13, 14) according to a vehicle-to-everything type communication mode, called V2X.\n | 8.
[Claim 8] Method according to one of claims 1 to 7, for which said step of controlling the autonomous vehicle comprises determining information representative of the trajectory of said autonomous vehicle (10) as a function of said cells (211 to 217, 221 to 225, 231 to 238) and associated representative nuisance values.\n | 9. [Claim 9] Device (4) configured to control an autonomous vehicle, said device (4) comprising a memory (41) associated with at least one processor (40) configured to implement the steps of the method according to any one of Claims 1 to 8.\n | 10. [Claim 10] Autonomous vehicle (10) comprising the device (4) according to claim 9.", "output": "The method involves obtaining first information representative of an environment of an autonomous vehicle (10), the environment comprises a set of elements e.g. Hole (13) and road sign (14). A part of the environment is subdivided into a set of cells (211-217). A value representative of a level of nuisance weighing is determined on the autonomous vehicle from the information and second information representative of the autonomous vehicle. The value representative of the level of nuisance weighing is determined on the autonomous vehicle from a function of third information representative of a level of nuisance associated with each element of the set of elements. The autonomous vehicle is controlled according to the values representative of the level of nuisance. INDEPENDENT CLAIMS are also included for:a device for controlling an autonomous vehicle; andan autonomous vehicle. Method for controlling a kinematic parameter e.g. speed and acceleration, a trajectory, a braking system and a safety system of an autonomous vehicle (claimed) i.e. autonomous land motor vehicle. The method enables controlling the autonomous vehicle so as to improve representation of the environment of the vehicle and to improve the decision-making of the vehicle in the context of autonomous driving. 
The drawing shows a schematic view representing a spatial subdivision of an environment.10Autonomous vehicle13Hole14Road sign101-103Three lanes of traffic211-217Cells", "instruction": "Please summarize the input" }, { "input": "Method for updating road signs by an autonomous vehicle The invention relates to a method and device for updating road signs by an autonomous vehicle traveling in a road environment, comprising the steps of: - detection (42), by the autonomous vehicle, of a presence or of an absence of a road sign element from the road environment; - emission (43), by the autonomous vehicle, of information representing a result of the detection, - reception (44) of said information by a device remote from an infrastructure of a communication network and; - updating (45) of the road signs by the remote device according to said information received. Figure for abstract: Figure 4|1. Claims\n[Claim 1] Method for updating road signs by an autonomous vehicle traveling in a road environment, comprising steps of: - Detection (42), by the autonomous vehicle, of a presence or absence of a road sign element of the road environment; - emission (43), by the autonomous vehicle, of information representing a result of the detection, - reception (44) of said information by a device remote from an infrastructure of a communication network and; - updating (45) of the road signs by the remote device as a function of said information received. \n | 2. [Claim 2] Method according to claim 1, which further comprises a step of transmitting said information by the autonomous vehicle to another autonomous vehicle.\n | 3. [Claim 3] A method according to claim 2, wherein said information is transmitted from the autonomous vehicle to the other autonomous vehicle according to vehicle-to-vehicle communication.\n | 4. 
[Claim 4] Method according to claim 2 or 3, which further comprises a step of transmitting said information by said other autonomous vehicle and intended for the remote device.\n | 5. [Claim 5] Method according to one of claims 1 to 4, for which said information is transmitted from an autonomous vehicle to the remote device according to a vehicle-to-infrastructure communication.\n | 6. [Claim 6] Method according to one of claims 1 to 5, which further comprises, prior to the steps of detecting and transmitting by the autonomous vehicle, a step of transmitting (41), by the remote device, of a request for obtaining an update of the road signs and a step of receiving said request by the autonomous vehicle.\n | 7. [Claim 7] A method according to claim 6, wherein said request informs the autonomous vehicle that a road sign element is present at a particular location.\n | 8. [Claim 8] Device for updating road signs by an autonomous vehicle traveling in a road environment, comprising a memory associated with at least one processor configured for implementing the steps of the method according to any one of claims 1 to 7.\n | 9. [Claim 9] Computer program product comprising instructions suitable for executing the steps of the method according to one of claims 1 to 7, when the computer program is executed by at least one processor.\n | 10. [Claim 10] Computer-readable recording medium on which is recorded a computer program comprising instructions for carrying out the steps of the method according to one of claims 1 to 7.
Information is transmitted by the autonomous vehicle to another autonomous vehicle. INDEPENDENT CLAIMS are included for the following:a device for updating road signs by an autonomous vehicle traveling in a road environment; anda computer program product for updating road signs by an autonomous vehicle traveling in a road environment. Method for updating road signs by an autonomous vehicle traveling in a road environment. Method avoids the congestion of the communication network and overloading of the remote device by avoiding the systematic sending of information representing detection results by autonomous vehicles as soon as they detect an element of road signs along their route. The drawing shows a flow chart of a method for updating road signs by an autonomous vehicle traveling in a road environment. (Drawing includes non-English language text). 42Detection of a presence or absence of a road sign element of the road environment by the autonomous vehicle43Emitting the information that represents a result of the detection by the autonomous vehicle44Receiving the information by a device remote from an infrastructure of a communication network45Updating the road signs by the remote device as a function of information received400Method", "instruction": "Please summarize the input" }, { "input": "Vehicle communication method and device The invention relates to a communication method and device for vehicles (10 and 11). To this end, information representative of the arrival of a second vehicle (11), traveling on a traffic lane, is transmitted to the first vehicle (10) via a wireless link of the vehicle-to-vehicle type. At least one guidance instruction is then determined so that the first vehicle (10) allows the second vehicle (11) to pass. Figure for abstract: Figure 1|1. 
Claims [Claim 1] Communication method for a vehicle, said method being implemented by a first vehicle (10), said method comprising the following steps: - reception (31) of information representative of an approach of a second vehicle (11) on a traffic lane of said first vehicle (10) according to a vehicle-to-vehicle type communication mode, V2V; - determination (32) of at least one instruction for guiding said first vehicle (10) in order to allow passage to said second vehicle (11).\n | 2. [Claim 2] The method of claim 1, further comprising a step of rendering said at least one guidance instruction, to rendering means associated with said first vehicle (10).\n | 3. [Claim 3] A method according to claim 1 or 2, wherein said first vehicle (10) implements said at least one guidance instruction in autonomous driving level 3 or higher.\n | 4. [Claim 4] Method according to any one of claims 1 to 3, for which said at least one guidance instruction comprises at least one command representative of a movement of said first vehicle (10) with respect to said traffic lane.\n | 5. [Claim 5] Method according to any one of the preceding claims, for which said at least one guidance instruction is further dependent on information representative of traffic conditions and / or on information representative of the speed of said first vehicle.\n | 6. [Claim 6] Method according to any one of the preceding claims, for which said second vehicle (11) corresponds to: - a priority vehicle; and or - a two-wheeled vehicle.\n | 7. [Claim 7] Method according to any one of the preceding claims, for which said information representative of an approach of a second vehicle (11) is included in at least one message of CAM and / or DENM type.\n | 8. [Claim 8] Device (2) comprising a memory (21) associated with at least one processor (20) configured for implementing the steps of the method according to any one of claims 1 to 7.\n | 9. 
[Claim 9] Vehicle (10) comprising the device (3) according to claim 8.\n | 10. [Claim 10] Computer program product comprising instructions suitable for executing the steps of the method according to one of claims 1 to 7, when the computer program is executed by at least one processor.\n1/2", "output": "The method involves receiving the information representative of an approach of second vehicle (11) on a traffic lane of first vehicle (10) according to a vehicle-to-vehicle (V2V) type communication mode. The instruction is determined for guiding first vehicle in order to allow passage to second vehicle. The guidance instruction is rendered to a rendering unit associated with first vehicle. The first vehicle implements the guidance instruction in autonomous driving level 3 or higher. The guidance instruction includes a command representative of a movement of first vehicle with respect to the traffic lane. The guidance instruction is dependent on information representative of traffic conditions and/or on information representative of the speed of first vehicle. INDEPENDENT CLAIMS are included for the following:a communication device; anda computer program product. Communication method for a vehicle (claimed), such as an ambulance, fire engine and police vehicle. The safety on the roads is improved. The free passage of priority vehicles is facilitated. The drawing shows a schematic view of a first vehicle traveling on a traffic lane of a road. 1Road environment10First vehicle11Second vehicle101,102Communication devices1000Road", "instruction": "Please summarize the input" }, { "input": "System and method for projecting a trajectory of an autonomous vehicle onto a road surfaceThe invention claims a system and a method for projecting a current track of an autonomous vehicle onto a road surface In some embodiments, claims an autonomous vehicle with light projector, wherein the light projector is on the top surface the autonomous vehicle. 
In addition, in some embodiments, the autonomous vehicle may include an electronic control unit, which is used for controlling the operation of the light projector, wherein the electronic control unit detects whether the autonomous vehicle is started. In other embodiments, the electronic control unit receives data of the environment condition around the autonomous vehicle and receives the imminent trajectory of the autonomous vehicle. The electronic control unit can further project light from the light projector onto the road surface so that the trajectory of the autonomous vehicle appears.|1. A system for projecting a trajectory of an autonomous vehicle onto a road surface, the system comprising: a light projector on the top surface of the autonomous vehicle, and an electronic control unit, which is used for controlling the operation of the light projector, wherein the electronic control unit: detects whether the autonomous vehicle is activated; if the autonomous vehicle is detected: receives data of the environment condition around the autonomous vehicle, wherein the environment condition is based on the shape of a part of an object in an image corresponding to a pedestrian, pixel intensity, and lines to indicate the presence of an upcoming turning track and the pedestrian; receives data of an imminent track of the autonomous vehicle; adjusts the upcoming trajectory based on the environmental condition; and instructs the light projector to project a light beam and text onto the surface of the road indicating a specified distance in front of the autonomous vehicle and the turning direction, the projected light beam indicating the upcoming track of the autonomous vehicle, and instructs the light projector to project a three-dimensional fence after turning, the three-dimensional fence being perpendicular to the light beam and the text.\n | 2.
The system according to claim 1, wherein the data of the imminent track of the autonomous vehicle comprises at least one of GPS data and the received data of the environmental condition.\n | 3. The system according to claim 2, wherein the data of the environment condition comprises: traffic information, road sign information, object detection, and road condition information.\n | 4. The system according to claim 3, wherein the data of the environmental condition is collected by at least one of a camera, a sensor, a navigation system, a vehicle communication system, a vehicle-to-infrastructure communication system, and a laser scanner.\n | 5. The system according to claim 2, further comprising a vehicle communication module, wherein the vehicle communication module is configured to transmit and receive GPS data and data of the environmental condition to different autonomous vehicles.\n | 6. The system according to claim 1, wherein each of the projected beams comprises a projection of at least one of a straight arrow, a turning arrow, an inclined arrow, a character, and a number.\n | 7. The system according to claim 6, wherein each of the projected beams further comprises a projection of the current speed of the autonomous vehicle.\n | 8. The system according to claim 1, wherein at least one of the projected beams comprises a projection of a fence for indicating the parking area of the autonomous vehicle on the surface of the road.\n | 9. The system according to claim 1, wherein the light projector comprises a light source comprising a light-emitting diode or a laser diode.\n | 10. The system according to claim 1, wherein, when the autonomous vehicle detects the presence of a pedestrian, the light beam is projected onto the surface of the road.\n | 11. The system according to claim 10, wherein the system comprises a vehicle-to-vehicle communication system for alerting a nearby vehicle of the possibility of the presence of the pedestrian.\n | 12.
The system according to claim 1, wherein the electronic control unit provides an audible notification when the vehicle determines that the upcoming track will collide with an object or a pedestrian.\n | 13. The system according to claim 12, wherein the audible notification is provided when the electronic control unit determines that a collision with an object or pedestrian within a certain distance of the autonomous vehicle is imminent.\n | 14. The system according to claim 1, wherein the light beam from the light projector is projected from 2 feet to 20 feet in front of the autonomous vehicle.\n | 15. The system according to claim 1, wherein the light beam from the light projector is projected from 2 feet to 20 feet on the side of the autonomous vehicle.\n | 16. A method for projecting a trajectory of an autonomous vehicle onto a road surface, the method comprising: detecting whether the autonomous vehicle is activated; if the autonomous vehicle is detected: receiving data of the environment condition around the autonomous vehicle, wherein the environment condition is based on the shape of a part of an object in an image corresponding to a pedestrian, pixel intensity, and lines to indicate the presence of an upcoming turning track and the pedestrian; receiving data of an imminent track of the autonomous vehicle; adjusting the upcoming trajectory based on the environmental condition; and instructing the light projector to project a light beam and text onto the surface of the road indicating a specified distance in front of the autonomous vehicle and the turning direction, the projected light beam indicating the upcoming track of the autonomous vehicle, and instructing the light projector to project a three-dimensional fence after turning, the three-dimensional fence being perpendicular to the light beam and the text.\n | 17. The method according to claim 16, wherein the imminent track comprises an expected path of the autonomous vehicle.\n | 18.
The method according to claim 16, wherein the data of the imminent track of the autonomous vehicle is determined by at least one of the GPS data and the received data of the environment condition, the data of the environment condition comprising: traffic information, road sign information, object detection, and road condition information.\n | 19. The method according to claim 16, wherein each of the projected beams comprises a projection of at least one of a straight arrow, a turning arrow, an inclined arrow, a character, and a number.\n | 20. The method according to claim 16, wherein at least one of the projected beams comprises a projection of a fence for indicating the parking area of the autonomous vehicle on the surface of the road.\n | 21. The method according to claim 16, further comprising: displaying the upcoming track on a screen in front of the driver of the autonomous vehicle.\n | 22. The method according to claim 16, wherein the light beam from the light projector is projected on the surface of the road from 2 feet to 20 feet in front of the autonomous vehicle.", "output": "The system (100) has a light projector (120) arranged on a top surface of an autonomous vehicle. An electronic control unit (140) controls operation of the light projector, and detects whether the autonomous vehicle is turned on, receives data of an environmental condition surrounding the autonomous vehicle, receives an upcoming trajectory path of the autonomous vehicle and projects a light from the light projector onto a surface of a road indicating the upcoming trajectory path of the autonomous vehicle, where the upcoming trajectory path of the autonomous vehicle is determined by one of received global positioning system data and the received data of the environmental condition. An INDEPENDENT CLAIM is also included for a method for projecting current trajectory path of an autonomous vehicle on a surface of road. 
System for projecting current trajectory path of an autonomous vehicle on a surface of road. The system allows autonomous vehicles to decrease traffic collisions caused by human error. The system provides the autonomous vehicles with enhanced driving control systems and safety mechanisms to ensure reliability and safety of the autonomous vehicles. The drawing shows a block diagram of a system for projecting current trajectory path of an autonomous vehicle on a surface of road. 100System for projecting current trajectory path of autonomous vehicle on surface of road120Light projector140Electronic control unit160Switch162Camera", "instruction": "Please summarize the input" }, { "input": "AN IMPROVED PERFORMANCE AND COST GLOBAL NAVIGATION SATELLITE SYSTEM ARCHITECTURESignificant, cost-effective improvement is introduced for Position, Navigation, and Timing (PNT) on a global basis, particularly enhancing the performance of Global Navigation Satellite Systems (GNSS), an example of which is the Global Positioning System (GPS). The solution significantly improves performance metrics including the accuracy, integrity, time to acquire, interference rejection, and spoofing protection. A constellation of small satellites employing a low-cost architecture combined with improved signal processing yields an affordable enabler for spectrum-efficient transportation mobility. As air traffic management modernization transitions to a greater dependence on satellite positioning, the solution provides aviation users new protections from both intentional and unintentional interference to navigation and surveillance. And in response to an era in which intelligent transportation is under development for automobiles, reliable where-in-lane positioning enables new applications in connected and autonomous vehicles. New military capability increases PNT availability.I claim: \n | 1.
A method for supporting resilient carrier phase positioning of user devices connected by respective communication links to at least one service data processor, measurements received from Global Navigation Satellite System (GNSS) satellites, and measurements received from low Earth orbit (LEO) satellites, said measurements including carrier phase pseudorange information, comprising the steps of: (a) the at least one service data processor accepting said measurements received from (i) at least one of said GNSS satellites by at least one LEO satellite, (ii) at least one of said GNSS satellites and the at least one LEO satellite by at least one ground reference station, and/or (iii) at least one other LEO satellite by the at least one LEO satellite via a LEO-to-LEO crosslink transmission; (b) the at least one service data processor generating precise orbit and clock predictions for the at least one LEO satellite from available said pseudorange information; and (c) the at least one service data processor disseminating said predictions over said communications links to the user devices to enable the user devices to take into account the precise orbit and clock predictions when computing respective positions of the user devices upon receiving measurements from GNSS and LEO satellites.\n | 2.
A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein (a) the at least one service data processor accepts said measurements received from (i) at least one of said GNSS satellites by the at least one LEO satellite and (ii) at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station and (b) the at least one service data processor (i) generates the orbit predictions from said pseudorange information received from at least one of said GNSS satellites by the at least one LEO satellite and (ii) generates the clock predictions from said pseudorange information received from at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station.\n | 3. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein said measurements received from the at least one other LEO satellite by the at least one ground reference station are from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.\n | 4. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein measurements received from LEO satellites by ground reference stations are unavailable.\n | 5. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein measurements received from GNSS satellites by LEO satellites are unavailable.\n | 6.
A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other said LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are received. Date Recue/Date Received 2021-04-28\n | 7. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein said at least one service data processor is integrated into a WAAS master station or a precise point positioning network operations center.\n | 8. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 1, wherein said disseminating step is accomplished using SBAS satellites, Inmarsat Narrowband, NDGPS data broadcast, VHF aviation radio, 4G LTE, DOT ITS V2I 5.9 GHz standard broadcast, or said LEO satellites.\n | 9. 
A method for supporting resilient carrier phase positioning of user devices utilizing at least one service data processor connected to the user devices by respective communication links, measurements received from GNSS satellites, and measurements received from LEO satellites, said measurements including carrier phase pseudorange information, comprising the steps of: (a) the user devices accepting precise orbit and clock predictions disseminated by the at least one service data processor for at least one LEO satellite, said precise orbit and clock predictions being generated from available pseudorange information accepted by the at least one service data processor received from (i) at least one GNSS satellite by at least one LEO satellite, (ii) at least one GNSS satellite and the at least one LEO satellite by at least one ground reference station, and/or (iii) LEO-to-LEO crosslink transmissions between at least one other LEO satellite and the at least one LEO satellite; and (b) the user devices taking into account the precise orbit and clock predictions disseminated by the at least one service data processor when computing respective positions of the user devices upon receiving respective said measurements from GNSS and LEO satellites.\n | 10. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein (i) the precise orbit predictions are generated from pseudorange information accepted by the at least one service data processor received from at least one GNSS satellite by the at least one LEO satellite and (ii) the precise clock predictions are generated from pseudorange information received from at least one GNSS satellite and the at least one LEO satellite by the at least one ground reference station.\n | 11.
A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein the pseudorange information accepted from the at least one other LEO satellite by the at least one ground reference station is from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.\n | 12. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein pseudorange information received from LEO satellites by ground reference stations is unavailable.\n | 13. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein pseudorange information received from GNSS satellites by LEO satellites is unavailable.\n | 14. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other said LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are received.\n | 15. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the step of employing Receiver Autonomous Integrity Monitoring (RAIM) to weight a fusion of other sensors selected from at least one camera, lidar receiver, or radar receiver.\n | 16.
A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the step of forming coherent cross-correlations across at least one pair of satellites to combat potential interference.\n | 17. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein said GNSS and LEO satellites have known oscillator stabilities, and further comprising the step of receiving precise clock predictions of the GNSS and LEO satellites from the at least one service data processor and enduring subsequent loss of one or more clock predictions due to disablement of ground reference stations, service data processors, or data dissemination means via which the clock predictions are received.\n | 18. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein the method is carried out despite enduring subsequent loss of one or more clock predictions due to disablement of ground reference stations, service data processors, or data dissemination means therebetween or therefrom.\n | 19. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, wherein said at least one LEO satellite is included in a constellation of said LEO satellites that minimize the number of required PRN codes through PRN code re-use.\n | 20. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the steps of: (a) the user device, at such time as it is moving, receiving broadcasting signals from one or more terrestrial, free-running, pre-surveyed pseudolites of known oscillator stability and measuring carrier phase pseudorange information therefrom, and (b) incorporating the pre-surveyed locations and known oscillator stability of said pseudolites in said position computation.\n | 21.
A method for supporting resilient carrier phase positioning of user devices as claimed in claim 20, wherein said pseudolites broadcast in the 5.9 GHz band.\n | 22. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 20, wherein some or all of said pseudolites are mounted at street level.\n | 23. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 20, wherein some or all of said pseudolites are mounted at an elevated position relative to said at least one user device.\n | 24. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 9, further comprising the steps of: (a) receiving pseudorange information from multi-band LEO, single-band LEO, and GNSS satellites; (b) collecting service data processor precise orbit and clock predictions of both the LEO and GNSS satellites and road-specific ionosphere and troposphere estimates; (c) applying said road-specific estimates to correct said single-band LEO satellite pseudoranges.\n | 25. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 24, wherein one or more of said single-band LEO satellite signals are broadcast in the band centered at 1,575,420,000 Hz.\n | 26. A method for supporting resilient carrier phase positioning of user devices as claimed in claim 24, wherein one or more of said single-band LEO satellite signals are broadcast in the band spanning 1,616,000,000 to 1,626,500,000 Hz.\n | 27.
A service data processor for supporting resilient carrier phase positioning of user devices utilizing at least one service data processor connected to the user devices by respective communication links, measurements received from GNSS satellites, and measurements received from LEO satellites, said measurements including carrier phase pseudorange information, comprising: (a) means for accepting said measurements from (i) at least one of said GNSS satellites by at least one LEO satellite, (ii) at least one of said GNSS satellites and said at least one LEO satellite by at least one ground reference station, and/or (iii) at least one other LEO satellite to the at least one LEO satellite via a LEO-to-LEO crosslink transmission; (b) means for generating precise orbit and clock predictions for the at least one LEO satellite from available said pseudorange information received by the at least one LEO satellite; and (c) means for disseminating said predictions to the user devices over the communications links to enable the user devices to take into account the precise orbit and clock predictions when computing respective positions of the user devices upon receiving respective said measurements from GNSS and LEO satellites.\n | 28. 
A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein (a) the accepted measurements are received from (i) at least one of said GNSS satellites by the at least one LEO satellite and (ii) at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station; and (b) the generated orbit predictions are from said pseudorange information received from at least one of said GNSS satellites by the at least one LEO satellite, and the generated clock predictions are from said pseudorange information received from at least one of said GNSS satellites and the at least one LEO satellite by the at least one ground reference station.\n | 29. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein the measurements received from the at least one other LEO satellite by the at least one ground reference station are from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.\n | 30. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein measurements received from LEO satellites by ground reference stations are unavailable.\n | 31. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein measurements received from GNSS satellites by LEO satellites are unavailable.\n | 32.
A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are channeled.\n | 33. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein said service data processor is spaceborne.\n | 34. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 33, further including coupled transmitters and receivers provided in an integrated circuit chipset hosted by said LEO satellite.\n | 35. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein said at least one service data processor is integrated into a WAAS master station or a precise point positioning network operations center.\n | 36. A service data processor for supporting resilient carrier phase positioning of user devices as claimed in claim 27, wherein said disseminating means utilizes SBAS satellites, Inmarsat Narrowband, NDGPS data broadcast, VHF aviation radio, 4G LTE, DOT ITS V2I 5.9 GHz standard broadcast, or said LEO satellites.\n | 37. 
A user device supported by at least one service data processor, the at least one service data processor connected to a plurality of user devices by respective communication links, to utilize measurements received from GNSS satellites and measurements received from LEO satellites in order to compute a position of the user device, said measurements including carrier phase pseudorange information, comprising: (a) accepting means for accepting precise orbit and clock predictions disseminated by the at least one service data processor for at least one LEO satellite, the precise orbit and clock predictions being generated from available pseudorange information accepted by the at least one service data processor received from (i) at least one GNSS satellite by the at least one LEO satellite, (ii) at least one GNSS satellite and the at least one LEO satellite by at least one ground reference station, and/or (iii) at least one other LEO satellite by the at least one LEO satellite as a LEO-to-LEO crosslink transmission; and (b) computing means for computing the position of the user device by taking into account the precise orbit and clock predictions when computing the position upon receiving said measurements from GNSS and LEO satellites.\n | 38. A user device supported by at least one service data processor as claimed in claim 37, wherein (i) the precise orbit predictions are generated from pseudorange information accepted by the at least one service data processor received from at least one GNSS satellite by the at least one LEO satellite and (ii) the precise clock predictions are generated from pseudorange information accepted by the at least one service data processor received from the at least one LEO satellite by the at least one ground reference station.\n | 39.
A user device supported by at least one service data processor as claimed in claim 37, wherein the pseudorange information received from the at least one other LEO satellite by the at least one ground reference station is from configurations wherein the at least one ground reference station is outside the footprint of the at least one LEO satellite.\n | 40. A user device supported by at least one service data processor as claimed in claim 37, wherein pseudorange information received from LEO satellites by ground reference stations is unavailable.\n | 41. A user device supported by at least one service data processor as claimed in claim 37, wherein pseudorange information received from GNSS satellites by LEO satellites is unavailable.\n | 42. A user device supported by at least one service data processor as claimed in claim 37, wherein (i) said at least one LEO satellite includes an oscillator of known stability coupled coherently to a receiver for use in measuring carrier phase pseudorange information from said GNSS satellites or from other LEO satellites and a transmitter for use in broadcasting carrier phase to be received by said ground reference stations and (ii) the at least one user device endures loss of one or more clock predictions due to disablement of satellites, ground reference stations, service data processors, or data dissemination means via which the clock predictions are received.\n | 43. A user device supported by at least one service data processor as claimed in claim 37, wherein said computing means is coupled to a Receiver Autonomous Integrity Monitoring (RAIM) device.\n | 44. A user device supported by at least one service data processor as claimed in claim 37, wherein said computing means is coupled to means for employing said RAIM device to weight the fusion of other sensors.\n | 45.
A user device supported by at least one service data processor as claimed in claim 44, wherein said other sensors include at least one of a camera and a lidar or radar receiver.\n | 46. A user device supported by at least one service data processor as claimed in claim 37, wherein LEO signals broadcast from each said LEO satellite to each said ground reference station and said user device use frequency bands that are the same as those used by GNSS satellites.\n | 47. A user device supported by at least one service data processor as claimed in claim 46, wherein said LEO signals are consistent with legacy or modern GNSS PRN codes.\n | 48. A user device supported by at least one service data processor as claimed in claim 47, wherein said GNSS PRN codes are selected from the following GNSS PRN codes: GPS C/A, GPS P(Y), GPS M, GPS M', GPS L5, GPS L2C, GPS L1C, Galileo E1, Galileo E5a, Galileo E5b, Galileo E5, and Galileo E6.\n | 49. A user device supported by at least one service data processor as claimed in claim 46, wherein said LEO satellite signals are codes generated by a 128-bit AES counter producing a chipping rate of an integer multiple of 1,023,000 chips per second.\n | 50. A user device supported by at least one service data processor as claimed in claim 37, further comprising means for: (a) the user device in motion receiving signals broadcast by one or more terrestrial, free-running, pre-surveyed pseudolites of known oscillator stability, the signals from the pseudolites including carrier phase pseudorange information and (b) incorporating the pre-surveyed locations and oscillator stabilities of said pseudolites in said position calculation.\n | 51. A user device supported by at least one service data processor as claimed in claim 50, wherein said pseudolites broadcast in the 5.9 GHz band.\n | 52.
A user device supported by at least one service data processor as claimed in claim 50, wherein some or all of said pseudolites are mounted at street level.\n | 53. A user device supported by at least one service data processor as claimed in claim 50, wherein some or all of said pseudolites are mounted above where street vehicles operate.", "output": "The method involves providing a service data processor accepting measurements received from one of the global navigation satellite system (GNSS) satellites and one of the low earth orbit (LEO) satellites. The precise orbit and clock predictions for the LEO satellite are generated. The predictions are disseminated to the user device to enable the user device to take into account the precise orbit and clock predictions, when computing the position upon receiving signals and measuring additional carrier phase pseudo-ranges from GNSS and LEO satellites. INDEPENDENT CLAIMS are included for the following:a service data processor for supporting resilient carrier phase of user device;a user device;a method for carrier phase positioning of user device;a method for localizing emitter;a method for GNSS signal authentication;a method for user position authentication;a method for fielding positioning service for one or more users;a method for generating regional, high-power navigation signals;a system for thermal control of high-power regional navigation satellite system;a method for providing agile, robust, and cost-effective services; anda method of beam forming space-borne distributed aperture. Method for supporting resilient carrier phase of user device such as subscriber vehicle (claimed), manned aircraft and unmanned aircraft. The performance of the GNSS is improved. The performance metrics including the accuracy, integrity, time to acquire, interference rejection, and spoofing protection for supporting resilient carrier phase of user device is improved. 
The disseminating process conforms to 4G LTE. The drawing shows a schematic view illustrating the process for supporting resilient carrier phase positioning of the user device.", "instruction": "Please summarize the input" }, { "input": "METHOD AND APPARATUS FOR IDENTIFYING AUTONOMOUS VEHICLES IN AUTONOMOUS DRIVING ENVIRONMENT. Disclosed is an autonomous vehicle identification method and apparatus for identifying autonomous vehicles using V2X communication data in an autonomous driving environment. An autonomous vehicle identification method performed by an autonomous vehicle identification device installed in a roadside device includes determining whether a data frame or a data element representing an autonomous driving state or an autonomous driving level exists in a probe vehicle data (PVD) message received from a first vehicle that has entered the V2I communication area of the roadside device and, if the data frame or data element does not exist, identifying the vehicle that transmitted the PVD message as a general vehicle.|1. An autonomous vehicle identification method performed by an autonomous vehicle identification device installed in a roadside device, comprising: receiving a probe vehicle data (PVD) message from a vehicle that has entered a vehicle-to-infrastructure (V2I) communication area of the roadside device;\ndetermining whether a data element indicating an autonomous driving level exists in the PVD message; and, if the data element does not exist in the PVD message, identifying the vehicle that transmitted the PVD message as a general vehicle, wherein the data element indicating the autonomous driving level is DE_AutonomousLevel, the data frame including the data element is a specific data frame representing an autonomous driving state, and the specific data frame includes any one of a data element (DE_ODDinfo) defining operation design domain (ODD) information and a data element (DE_FallbackStatus) defining fallback information.
An autonomous vehicle identification method further comprising the above.\n | 2. The method of claim 1, further comprising identifying the vehicle that transmitted the PVD message as an autonomous vehicle when the data element indicates an autonomous driving level of one or more.\n | 3. The autonomous vehicle identification method according to claim 2, further comprising identifying the vehicle that transmitted the PVD message as a non-autonomous vehicle or a connected car without an autonomous driving function if the data element indicates an autonomous driving level of less than 1.\n | 4. The method of claim 1, wherein the specific data frame is an AutonomousStatus indicating an autonomous driving state.\n | 5. An autonomous vehicle identification method performed by an autonomous vehicle identification device installed in a roadside device, comprising: receiving a probe vehicle data (PVD) message from a vehicle that has entered a vehicle-to-infrastructure (V2I) communication area of the roadside device;\ndetermining whether a data element indicating an autonomous driving level exists in the PVD message;\ndetermining whether the data element indicates an autonomous driving level of one or more; and identifying the vehicle transmitting the PVD message as an autonomous vehicle when the data element indicates an autonomous driving level of one or more,\nwherein the data element indicating the autonomous driving level is DE_AutonomousLevel, the data frame including the data element is a specific data frame representing an autonomous driving state, and the specific data frame further includes any one or more of a data element (DE_ODDinfo) defining operation design domain (ODD) information and a data element (DE_FallbackStatus) defining fallback information.\n | 6.
The method according to claim 5, further comprising: identifying the vehicle transmitting the PVD message as a non-autonomous vehicle or a connected car without an autonomous driving function when the data element indicates an autonomous driving level of less than 1; and, if the data element does not exist in the PVD message, identifying the vehicle transmitting the PVD message as a normal vehicle.\n | 7. The method of claim 5, wherein the specific data frame is an AutonomousStatus indicating an autonomous driving state.\n | 8. An autonomous vehicle identification device installed in a roadside device to identify an autonomous vehicle, comprising: a wireless communication module supporting vehicle-to-everything (V2X) communication; and at least one processor connected to the wireless communication module, wherein the at least one processor receives a probe vehicle data (PVD) message through the wireless communication module from a first vehicle that has entered a vehicle-to-infrastructure (V2I) communication area of the roadside device, determines whether a data element indicating an autonomous driving level exists in the PVD message, determines whether the data element indicates an autonomous driving level of one or more, and, when the data element indicates an autonomous driving level of one or more, identifies the vehicle that transmitted the PVD message as an autonomous vehicle, wherein the data element indicating the autonomous driving level is DE_AutonomousLevel, the data frame including the data element is a specific data frame representing the autonomous driving state, and the specific data frame further includes any one or more of a data element (DE_ODDinfo) defining operation design domain (ODD) information and a data element (DE_FallbackStatus) defining fallback information.\n | 9.
The method according to claim 8, wherein the at least one processor, when the data element indicates an autonomous driving level of less than 1, identifies the vehicle transmitting the PVD message as a non-autonomous vehicle or a connected car not equipped with an autonomous driving function, and, if the data element does not exist in the PVD message, identifies the vehicle that transmitted the PVD message as a normal vehicle.\n | 10. The apparatus of claim 8, wherein the specific data frame is an AutonomousStatus indicating an autonomous driving state.", "output": "The method involves receiving a probe vehicle data (PVD) message from a vehicle, and determining whether a data element indicating an autonomous driving level exists in the PVD message. A specific data frame is provided for representing the autonomous driving state, where the data frame includes a data element that defines operation design domain (ODD) information and a data element that defines fallback status. The data element is provided with a data frame that represents an autonomous status. An INDEPENDENT CLAIM is included for an autonomous vehicle identification device for identifying an autonomous vehicle. Method for identifying an autonomous vehicle, i.e. a car, using an autonomous vehicle identification device installed in a roadside device in an autonomous driving environment. The method enables effectively identifying whether a target vehicle within a communication area is an autonomous vehicle or a vehicle equipped with an autonomous driving function. The method enables defining an identification factor in a message frame in compliance with the V2X communication data standard defined in SAE J2735, so that scenarios for future driving negotiations can be effectively responded to. The drawing shows a block diagram of the method for identifying autonomous vehicles in an autonomous driving environment.
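The classification logic of claims 1 through 3 can be sketched as follows. The dict-based message layout is an illustrative assumption, since real PVD messages are ASN.1-encoded per SAE J2735:

```python
def classify_vehicle(pvd: dict) -> str:
    """Classify the sender of a PVD message per claims 1-3:
    no DE_AutonomousLevel element -> general vehicle;
    level >= 1 -> autonomous vehicle;
    level < 1  -> non-autonomous vehicle or connected car without
                  an autonomous driving function.
    The dict layout is an illustrative assumption; real PVD messages
    are ASN.1-encoded per SAE J2735."""
    level = pvd.get("DE_AutonomousLevel")
    if level is None:
        return "general vehicle"
    if level >= 1:
        return "autonomous vehicle"
    return "non-autonomous or connected car"
```

The key design point the claims rely on is that the mere absence of the data element is itself a classification signal, so the roadside device never needs to query the vehicle a second time.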
100 Roadside device, 110 Processor, 120 Memory, 130 Transceiver, 150 Communication module", "instruction": "Please summarize the input" }, { "input": "The driving negotiation method and an apparatus. PROBLEM TO BE SOLVED: To provide a driving negotiation method and apparatus for supporting stability against blind spots, unexpected situations, etc., and rapid judgement and response of an autonomous vehicle in various driving environments. \n\nSOLUTION: A driving negotiation apparatus includes a wireless communication module configured to support V2X (vehicle to everything) communication and at least one processor connected to the wireless communication module. The at least one processor receives a cooperative request message from a first vehicle, broadcasts a cooperative request message and additional information required for a negotiation to surrounding vehicles, receives a cooperative response message from at least one second vehicle among the surrounding vehicles, and transmits a message indicating that the negotiation is possible or impossible to the first vehicle on the basis of the cooperative response message. \n\nSELECTED DRAWING: Figure 3|1. A travel negotiation device comprising: a wireless communication module which supports V2X (vehicle to everything) communication; and at least one processor connected to the wireless communication module, wherein the at least one processor receives a cooperation request message (CooperativeRequestMsg) from a first vehicle, and broadcasts to surrounding vehicles a cooperation relay message including a type code of the additional information required for negotiation in the cooperation request message.
The at least one processor further receives a cooperative response message (CooperativeReponseMsg) corresponding to the cooperative relay message from at least one second vehicle among the surrounding vehicles, and transmits a message indicating a negotiable or non-negotiable state to the first vehicle based on the cooperative response message.\n | 2. The travel negotiation device according to claim 1, wherein the cooperation request message or the cooperative response message includes data frames (DFs) for a vehicle speed, a lane change, and lane merging; a first data frame for the vehicle speed, as negotiation message information about a speed adjustment plan, includes first data elements for overtaking, deceleration, and stopping; a second data frame for the lane change, as negotiation message information for a lane change plan, includes second data elements for avoidance, an accident, a cut-in (interruption), and a pedestrian; and a third data frame for the lane merging, as negotiation message information for a merging plan, includes third data elements for a merging path, an intersection, and a roundabout (rotation intersection).\n | 3. The travel negotiation device according to claim 1, wherein the cooperation request message or the cooperative response message includes data elements for a time stamp (timestamp), a vehicle identifier (id or TemporaryID), a message identifier (UniqueMSG_ID, messangeid or MsgID), a previous message identifier (previousmessageid or preMsgID), and message information (infoMsg).\n | 4. The message identifier that is a data element of the cooperation request message is the identification value of the present travel negotiation session, and is used to define the process from the travel negotiation request to the response as one session.
The travel negotiation device according to claim 3.\n | 5. The message information that is a data element of the cooperation request message is a sequence of message lists or message types required for the negotiation information (SequenceofMessageType), and is received from the first vehicle in a null state in the first travel negotiation; the travel negotiation device according to claim 3.\n | 6. The at least one processor, before broadcasting the cooperative relay message to the surrounding vehicles, generates a related message associated with the message type of the negotiation request message in relation to the additional information, and a preset type code of the related message is included in the cooperation request message; when broadcasting the cooperative relay message, the at least one processor broadcasts the related message together with the cooperative relay message; the travel negotiation device according to claim 1.\n | 7. The cooperation response message includes a response value to the negotiation by the second vehicle itself and a data field of a response type including information necessary for the negotiation, and includes data elements for the message identifier of the first vehicle and the previous message identifier; the travel negotiation device according to claim 1.\n | 8. The response value for the negotiation includes a data element for consent (agree) or rejection (refuse); the wireless communication module is installed in a roadside base station or a roadside unit (road side unit, RSU); the travel negotiation device according to claim 7.\n | 9. The message indicating the negotiable or non-negotiable state is transmitted to the first vehicle in the form of a broadcast; the travel negotiation device according to claim 1.\n | 10.
The at least one processor, after transmitting the message indicating the negotiable or non-negotiable state to the first vehicle, further receives from the first vehicle a renegotiation message, or a further cooperation request message, in which the identification value of the present travel negotiation session is set to the identification value of the previous travel negotiation session; the travel negotiation device according to claim 1.\n | 11. A travel negotiation method comprising: receiving a cooperation request message (CooperativeRequestMsg) from a first vehicle; generating a cooperative relay message including a type code of additional information required for negotiation in the cooperation request message; broadcasting the cooperative relay message to surrounding vehicles; receiving a cooperative response message (CooperativeReponseMsg) corresponding to the cooperative relay message from at least one second vehicle among the surrounding vehicles; and transmitting a message indicating a negotiable or non-negotiable state to the first vehicle based on the cooperative response message.\n | 12. The travel negotiation method according to claim 11, wherein the cooperation request message or the cooperative response message includes data frames (DFs) for a vehicle speed, a lane change, and lane merging; a first data frame for the vehicle speed, as negotiation message information about a speed adjustment plan, includes first data elements for overtaking, deceleration, and stopping; and a second data frame for the lane change, as negotiation message information for a lane change plan, includes second data elements for avoidance, an accident, a cut-in (interruption), and a pedestrian.
A third data frame for the lane merging, as negotiation message information for a merging plan, includes third data elements for a merging path, an intersection, and a roundabout (rotation intersection); the travel negotiation method according to claim 11.\n | 13. The travel negotiation method according to claim 11, wherein the cooperation request message or the cooperative response message includes data elements for a time stamp (timestamp), a vehicle identifier (id or TemporaryID), a message identifier (UniqueMSG_ID, messangeid or MsgID), a previous message identifier (previousmessageid or preMsgID), and message information (infoMsg).\n | 14. The message identifier that is a data element of the cooperation request message is the identification value of the present travel negotiation session, and is used to define the process from the travel negotiation request to the response as one session; the travel negotiation method according to claim 13.\n | 15. The message information that is a data element of the cooperation request message is a sequence of message lists or message types required for the negotiation information (SequenceofMessageType), and is received from the first vehicle in a null state during the first travel negotiation; the travel negotiation method according to claim 13.\n | 16. The method further comprises, before the cooperative relay message is broadcasted to the surrounding vehicles, generating a related message associated with the message type of the cooperation request message in relation to the additional information; when the cooperative relay message is broadcasted to the surrounding vehicles, the related message is broadcasted together with the cooperative relay message; the travel negotiation method according to claim 11.\n | 17.
The cooperation response message includes a response value to the negotiation by the second vehicle itself and a data field of a response type including information necessary for the negotiation, and includes data elements for the message identifier of the first vehicle and the previous message identifier; the travel negotiation method according to claim 11.\n | 18. The response value for the negotiation includes a data element for consent (agree) or rejection (refuse); the travel negotiation method according to claim 17.\n | 19. The message indicating the negotiable or non-negotiable state is transmitted to the first vehicle in the form of a broadcast; the travel negotiation method according to claim 11.\n | 20. The method further comprises, after transmitting the message indicating the negotiable or non-negotiable state to the first vehicle, receiving from the first vehicle a renegotiation message, or a further cooperation request message, in which the identification value of the current travel negotiation session is set to the identification value of the previous travel negotiation session; the travel negotiation method according to claim 11.", "output": "The device (100) has a wireless communication module supporting vehicle to everything (V2X) communication. A processor (110) is connected to the wireless communication module and receives a cooperation request message (CooperativeRequestMsg) from the first vehicle, broadcasts a cooperative relay message including a type code of additional information required for negotiation in the cooperation request message to surrounding vehicles, receives a cooperative response message (CooperativeReponseMsg) corresponding to the cooperative relay message from a second vehicle among the surrounding vehicles, and negotiates based on the cooperative response message.
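The request/relay/response exchange summarized above can be sketched as a message flow. Field names follow the data elements listed in claim 3, while the Python layout, the all-agree decision policy, and the helper names are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CooperativeRequestMsg:
    """Field names follow the data elements of claim 3; the exact
    encodings (ints vs. strings) are assumptions for illustration."""
    timestamp: int
    vehicle_id: str               # id / TemporaryID
    msg_id: str                   # UniqueMSG_ID: current negotiation session
    pre_msg_id: Optional[str]     # previous session id, used for renegotiation
    info_msg: List[str] = field(default_factory=list)  # SequenceofMessageType

def negotiate(request: CooperativeRequestMsg, responses: List[str]) -> str:
    """RSU-side decision sketch: per claim 8, each response carries an
    agree/refuse data element. 'All responders agree' is an illustrative
    policy; the claims do not fix the aggregation rule."""
    if not responses:
        return "non-negotiable"
    return "negotiable" if all(r == "agree" for r in responses) else "non-negotiable"
```

On renegotiation (claims 10 and 20), the first vehicle reuses the previous session: it sends a new request whose `msg_id` is set to the identification value of the prior session via `pre_msg_id`.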
The cooperation request message or the cooperation response message includes a timestamp, a vehicle identifier (id or TemporaryID), and a message identifier (UniqueMSG_ID, messangeid, or MsgID). The message identifier is a data element of the cooperation request message. An INDEPENDENT CLAIM is included for a driving negotiation method. Driving negotiation device for supporting stability against blind spots or unexpected situations. The stability and ability of autonomous vehicles to cope with blind spots or unexpected situations are improved. The driving stability of vehicles such as autonomous vehicles is improved by providing a message set and negotiation process definition for V2I communication-based driving negotiation in a C-ITS environment. The drawing shows a block diagram for explaining the main configuration of a driving negotiation apparatus for executing the driving negotiation method. (Drawing includes non-English language text) 100 Driving negotiation device, 110 Processor, 120 Memory, 130 Transceiver, 150 Wireless communication module", "instruction": "Please summarize the input" }, { "input": "Systems And Methods Using Artificial Intelligence For Routing Electric Vehicles. The present invention provides specific systems, methods and algorithms based on artificial intelligence expert system technology for determination of preferred routes of travel for electric vehicles (EVs). The systems, methods and algorithms provide such route guidance for battery-operated EVs en route to a desired destination but lacking sufficient battery energy to reach the destination from the current location of the EV. The systems and methods of the present invention disclose use of one or more specifically programmed computer machines with artificial intelligence expert system battery energy management and navigation route control.
Such specifically programmed computer machines may be located in the EV and/or cloud-based or remote computer/data processing systems for the determination of preferred routes of travel, including intermediate stops at designated battery charging or replenishing stations. Expert system algorithms operating on combinations of expert defined parameter subsets for route selection are disclosed. Specific fuzzy logic methods are also disclosed based on defined potential route parameters with fuzzy logic determination of crisp numerical values for multiple potential routes and comparison of those crisp numerical values for selection of a particular route. Application of the present invention systems and methods to autonomous or driver-less EVs is also disclosed.The embodiments of the invention in which an exclusive property or privilege is claimed are defined as follows: \n | 1. An artificial intelligence (AI) Electric Vehicle (EV) route optimization method comprising: \nan electronic, specifically programmed, communication computer AI system performing EV route optimization for travel of said EV from a designated origin location or EV present location to an EV designated destination location with intermediate stops at intervening battery charging stations to maintain battery charge levels; \nstoring in memory one or more EV attribute parameters comprising EV operational status parameters, EV location parameters, or EV battery status parameters; \nderivation of EV potential route condition parameters for said EV based on information exchanges with at least two of: (1) communication network connections with application servers, (2) communication network connections with other motor vehicles, (3) communication network connections with pedestrians, and (4) communication network connections with roadside monitoring and control units; \nstoring in memory expert defined propositional logic inference rules specifying multiple multidimensional conditional relationships between 
two or more of said EV attribute parameters and EV potential route condition parameters, and with expert defined individual parameter degree of danger value ranges; \nAI evaluation and assignment of expert defined value ranges to selected of said EV attribute parameters and selected of said EV potential route condition parameters and wherein said expert defined value ranges depend on individual parameter importance to EV route optimization; \nstoring in memory expert defined propositional logic inference rules defining multiple range dependent conditional relationships between two or more interrelated multidimensional parameters comprising selected said EV attribute parameters and selected said EV potential route condition parameters; \nAI evaluation of EV potential routes of travel from said EV designated origin location or EV present location to said EV designated destination location based on said EV attribute parameters, said EV potential route condition parameters and said expert defined propositional logic inference rules, and further wherein EV potential routes of travel include visiting battery charging stations as necessary to maintain proper EV battery charge levels to reach said EV designated destination location; and, \nAI expert system optimization of selection of a particular route of travel based on said AI evaluation of said EV potential routes of travel comprising expert system analysis of one or more multidimensional combinations of said two or more interrelated multidimensional parameters of said EV attribute parameters and said EV potential route condition parameters. \n | 2. The AI EV route optimization method of claim 1 further comprising accessing said EV potential route condition parameters using Internet telecommunications technology.\n | 3. 
The AI EV route optimization method of claim 1 further comprising accessing of said EV potential route condition parameters using cellular communication technology to receive or transmit information between said EV and said external information sources.\n | 4. The AI EV route optimization method of claim 1 further comprising exchanging selected of said EV attribute parameters of said EV with other motor vehicles or via remote information source facilities.\n | 5. The AI EV route optimization method of claim 1 wherein said EV route optimization is further based upon battery charging station usage and actual or probable requests for route guidance from other EVs traveling within a defined distance from said EV present location, and further wherein such information that is accessed from said other EV's affects the expected waiting times or queues encountered at battery charging stations on possible routes of travel.\n | 6. The AI EV route optimization method of claim 1 wherein said EV present location information is derived from motor vehicle GPS (Global Positioning System) signal sensors or from determination of the distance of said EV from cellular telephone towers or other known fixed locations transmitting signals received by one of the EV receivers.\n | 7. The AI EV route optimization method of claim 1 wherein said potential route condition parameters comprise dynamic roadway conditions further comprising one or more of traffic congestion, weather conditions, police reported concerns, or other dynamic roadway condition information received from external information source database or data processing units.\n | 8. The AI EV route optimization method of claim 1 wherein EV route selection decisions comprise consideration of potential dynamically changing charging requirements from other vehicles within a defined radius or distance from said EV present location.\n | 9. 
The AI EV route optimization method of claim 1 wherein communicating with said external information sources further comprises operating an RFID (radio frequency identification) tag device used to identify the EV and communicate information with RFID tag readers located along highways tollways or roadways along which the EV is traveling.\n | 10. The AI EV route optimization method of claim 7 wherein said external information source database or data processing units are cloud based and are accessed through the Internet or cellular telephone communication networks.\n | 11. The AI EV route optimization method of claim 1 further comprising Bluetooth, Wi-Fi or other voice or data telecommunication capabilities for communicating with charging stations or other nearby vehicles present in ongoing traffic or waiting for use of charging stations.\n | 12. The AI EV route optimization method of claim 1 wherein said EV potential route condition parameters from external information sources comprise pedestrian or crowd information.\n | 13. The AI EV route optimization method of claim 1 wherein said EV accesses information from communication network applications.\n | 14. The AI EV route optimization method of claim 13 wherein EV access of said communication network applications further comprising one or more of a Navigation System Application, Traffic Database Application, EV Account Application, Battery Charger/Replacement Station Application, Weather Data Application, Police Report Application, Special Event Application, or Road Condition Application.\n | 15. The AI EV route optimization method of claim 14 wherein said Traffic Database Application comprises EV vehicle traffic congestion or density data.\n | 16. The AI EV route optimization method of claim 14 wherein said Special Event Application comprises traffic or crowd congestion arising from special events along potential routes of travel.\n | 17. 
The AI EV route optimization method of claim 1 wherein said EV attribute parameters and said EV potential route condition parameters from external information sources are stored in a remote database and wherein said remote database may be accessed and updated via vehicle-to-network connections.\n | 18. The AI EV route optimization method of claim 1 wherein said EV is a driverless or autonomous driving vehicle.\n | 19. An artificial intelligence (AI) Electric Vehicle (EV) route optimization system comprising: \nan electronic, specifically programmed, communication computer AI system performing EV route optimization for travel of said EV from an EV designated origin location or EV present location to an EV designated destination location with intermediate stops at intervening battery charging stations to maintain battery charge levels; \na memory for storing one or more EV attribute parameters comprising EV operational status parameters, EV location parameters, or EV battery status parameters; \nderivation of EV potential route condition parameters for said EV based on information exchanges with at least two of: (1) communication network connections with application servers, (2) communication network connections with other motor vehicles, (3) communication network connections with pedestrians, and (4) communication network connections with roadside monitoring and control units; \nevaluating and assigning AI expert defined value ranges to selected of said EV attribute parameters and selected of said EV potential route condition parameters and wherein said expert defined value ranges depend on individual parameter importance to EV route optimization; \na memory for storing expert defined propositional logic inference rules defining multiple range dependent conditional relationships between two or more interrelated multidimensional parameters comprising selected said EV attribute parameters and selected said EV potential route condition parameters; \nAI evaluation of EV 
potential routes of travel from said EV designated origin location or EV present location to said EV designated destination location based on said EV attribute parameters, said EV potential route of travel parameters and said expert defined propositional logic inference rules, and further wherein said EV potential routes of travel include visiting battery charging stations as necessary to maintain proper EV battery charge levels to reach said EV designated destination location; and, \nAI expert system optimization of selection of a particular route of travel based on said AI evaluation of said EV potential routes of travel comprising expert system analysis of one or more multidimensional combinations of said two or more interrelated multidimensional parameters of said EV attribute parameters and said EV potential route condition parameters. \n | 20. The artificial intelligence (AI) Electric Vehicle (EV) route optimization system of claim 19 further comprising accessing said EV potential route condition parameters using cellular or Internet telecommunications technology.", "output": "The method involves performing electric vehicle (EV) route optimization for travel of EV from a designated origin location or EV present location to an EV designated destination location with intermediate stops at intervening battery charging stations (104) to maintain battery charge levels. One or more EV attribute parameters comprising EV operational status parameters, EV location parameters, or EV battery status parameters are stored in memory. The EV potential route condition parameters for EV are derivation based on information exchanges with at least two of communication network connections with application servers. The AI expert system optimization of selection of a particular route of travel based on artificial intelligence (AI) evaluation of EV potential routes of travel is performed. An INDEPENDENT CLAIM is included for an AI EV route optimization system. 
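The abstract mentions fuzzy-logic determination of crisp numerical values for multiple potential routes, followed by comparison of those crisp values to select a route. A minimal sketch of that pattern follows; the parameter names, weights, and triangular membership function are illustrative assumptions, not the patent's expert-defined rule base:

```python
def triangular(x, a, b, c):
    """Triangular membership function, a common fuzzy-set primitive."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def route_crisp_score(params):
    """Weighted defuzzification over route parameters (lower = better).

    params: dict with 'congestion', 'charge_wait_min', 'detour_km',
    each normalized to [0, 10]. The weights are illustrative."""
    weights = {"congestion": 0.4, "charge_wait_min": 0.35, "detour_km": 0.25}
    # membership of each parameter in the fuzzy set "high danger"
    danger = {k: triangular(v, 0.0, 10.0, 20.0) for k, v in params.items()}
    return sum(weights[k] * danger[k] for k in weights)

def pick_route(routes):
    """Compare crisp scores across candidate routes and pick the lowest."""
    return min(routes, key=lambda r: route_crisp_score(routes[r]))
```

The crisp score collapses several incommensurable conditions (traffic, charger queues, detour length) into one comparable number per route, which is what makes the final selection a simple minimum.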
AI EV route optimization method. The method enables providing efficient routing algorithms that can be employed in real-time without excessive and complex computation and that consider multiple factors such as battery charging-replacement station locations, required time of travel, roadway conditions, traffic congestion, weather conditions and/or emergency traffic considerations, thus improving EV operational usefulness through determination of preferred routes of travel where the preferred routes include intermediate charging or replacement of EV batteries as required. The drawing shows a schematic diagram illustrating configuration of a driving situation with recharging stations benefiting from a routing and control system without limitation.101Driving area 102GPS satellite 103Destination 104Charging station 105Particular automotive vehicle ", "instruction": "Please summarize the input" }, { "input": "Method and apparatus for vehicle-mounted enhanced visualization of sensor range and field of viewSome embodiments of the methods disclosed herein may include: receiving the predicted driving route, the sensor range of the sensor on the autonomous vehicle (AV) and the sensor field of view (FOV) data; determining whether a minimum sensor visibility requirement is met along the predicted driving route; predicting a blind area along the predicted driving route, wherein the predicted blind area is determined to have potentially reduced sensor visibility; and AR visualization using augmented reality (AR) display devices to display blind areas.|1. 
A method, comprising: receiving sensor range and sensor field-of-view (FOV) data of the sensor on the first vehicle; receiving blind area information from the second vehicle; predicting a blind area along the predicted driving path, wherein the predicted blind area is determined to have potentially weakened sensor visibility; and using an augmented reality (AR) display device to display the AR visualization of the blind area, the method further comprising: receiving the predicted driving route; and determining whether the minimum sensor visibility requirement is satisfied along the predicted driving route, wherein determining whether the minimum sensor visibility requirement is satisfied comprises: determining a percentage of a minimum visibility region covered by the field of view (FOV) of the one or more sensors; and determining whether the percentage exceeds a visibility region threshold.\n | 2. The method according to claim 1, wherein the first vehicle is a partially autonomous vehicle having at least one of a manual mode or a driver-assisted mode.\n | 3. The method according to claim 1 or 2, wherein the first vehicle is a fully autonomous vehicle.\n | 4. The method according to claim 1, further comprising: receiving map data; and updating the blind area, wherein updating the blind area comprises comparing the received map data with local dynamic map data.\n | 5. The method according to claim 1 or 2, further comprising: determining, based on the blind area information, that the second vehicle is in a blind area within the predicted blind area.\n | 6. The method according to claim 5, further comprising: in response to determining that the second vehicle is in the blind area within the predicted blind area, displaying an indication that the second vehicle is in the blind area.\n | 7. 
The method according to claim 1 or 2, further comprising: identifying a blind area reduction technique; and moving the first vehicle from a first position to a second position in response to identifying the blind area reduction technique.\n | 8. The method according to claim 7, wherein the blind area reduction technique includes at least one of repositioning the first vehicle or adjusting an orientation of one of the sensors.\n | 9. The method according to claim 1 or 2, wherein predicting a blind area comprises determining a visibility region at a plurality of locations along the predicted driving route.\n | 10. The method according to claim 9, wherein determining a visibility region comprises simulating sensor visibility at the plurality of locations along the predicted driving route using three-dimensional (3D) map data.\n | 11. The method according to claim 1 or 2, wherein predicting the blind area comprises continuously estimating the location of the blind area based on a plurality of sensor readings.\n | 12. The method according to claim 1 or 2, further comprising: tracking the orientation of the first vehicle, wherein predicting the blind area is based on the orientation of the first vehicle.\n | 13. The method according to claim 1 or 2, wherein displaying the AR visualization of the blind area comprises projecting the AR visualization using a vehicle-mounted augmented reality projection system to display the AR visualization.\n | 14. The method according to claim 1 or 2, wherein displaying the AR visualization of the blind area comprises overlaying a highlighted display indicating the blind area on a map.\n | 15. The method according to claim 1 or 2, wherein displaying the AR visualization of the blind area comprises displaying a contour of an area indicative of the blind area on a map.\n | 16. 
A device, the device comprising: a processor; and a non-transitory computer-readable medium storing instructions that, when executed by the processor, are operable to cause the device to perform the method according to any one of claims 1 to 15.\n | 17. The apparatus according to claim 16, further comprising: a group of sensors; a blind area prediction module configured to identify a potential blind area; a driving mode selection module configured to select a driving mode; a communication module configured to receive a vehicle-to-vehicle (V2V) message; and an augmented reality (AR) display device.\n | 18. A method, comprising: receiving the predicted driving route, the sensor range of the sensor on the vehicle and the sensor field-of-view (FOV) data; determining whether the minimum sensor visibility requirement is satisfied along the driving route; predicting a blind area along the predicted driving route, wherein the predicted blind area is determined to have potentially weakened sensor visibility; and displaying an AR visualization of the blind area using an augmented reality (AR) display device, wherein determining whether a minimum sensor visibility requirement is met comprises: determining a percentage of a minimum visibility region covered by the field of view (FOV) of the one or more sensors; and determining whether the percentage exceeds a visibility region threshold.\n | 19. The method according to claim 18, wherein predicting a blind area along the driving route comprises determining an area where a sensor visibility range along the driving route is less than a minimum sensor visibility range requirement, and wherein the minimum sensor visibility requirement includes the minimum sensor visibility range requirement.\n | 20. 
The method according to claim 18, further comprising determining the minimum sensor visibility requirement along the driving route of the autonomous vehicle (AV).\n | 21. A device, the device comprising: a processor; and a non-transitory computer-readable medium storing instructions that, when executed by the processor, are operable to cause the device to perform the method according to any one of claims 18 to 20.\n | 22. A method, comprising: predicting a blind area along the predicted driving route of the autonomous vehicle (AV) based on the expected limits of the sensors; and when the AV travels along the driving route, using an augmented reality (AR) visualization to provide an indication of the predicted blind area, the method further comprising: receiving the predicted driving route; and determining whether the minimum sensor visibility requirement is satisfied along the predicted driving route, wherein determining whether the minimum sensor visibility requirement is satisfied comprises: determining a percentage of a minimum visibility region covered by the field of view (FOV) of the one or more sensors; and determining whether the percentage exceeds a visibility region threshold.", "output": "The method (1700) involves receiving (1702) a predicted driving route, sensor ranges of sensors on a vehicle, and sensor field-of-view data, determining (1704) whether minimum sensor visibility requirements are met along the predicted driving route, predicting (1706) blind areas along the predicted driving route, in which the predicted blind areas are determined to have potentially diminished sensor visibility, and displaying (1708) an augmented reality visualization of the blind areas using an augmented reality display device. An INDEPENDENT CLAIM is also included for an apparatus for in-vehicle augmented reality visualization of sensor range and field of view. Method for in-vehicle augmented reality visualization of sensor range and field of view. 
The method provides blind area reduction techniques: responsive to identifying a blind area reduction technique, the vehicle is moved from a first position to a second position, which can help reduce traffic collisions. The drawing shows the flow diagram illustrating an example process for predicting blind areas and displaying a visualization corresponding to the predicted blind areas. 1700Method 1702Receiving predicted driving route and sensor ranges of sensors 1704Determining whether minimum sensor visibility requirements are met along predicted driving route 1706Predicting blind areas along predicted driving route 1708Displaying augmented reality visualization of blind areas
The method of claim 1, wherein the leading end comprises an abutment surface of a bumper which is suspended relative to a chassis of the autonomous rail vehicle.\n | 4. The method of claim 1, wherein determining the coordinated braking event comprises receiving a braking command via a vehicle-to-vehicle (V2V) communication from a rail vehicle within the rail platoon.\n | 5. The method of claim 4, further comprising: in response to receiving the V2V communication, relaying the braking command to a second autonomous rail vehicle within the platoon.\n | 6. The method of claim 1, wherein the coordinated braking event is determined based on the compressive force.\n | 7. The method of claim 1, further comprising: separating from the platoon based on a track geometry or a location of a level crossing.\n | 8. The method of claim 1, wherein the compressive force is determined with a load cell.\n | 9. A method, comprising: \ndetermining, at a rail vehicle within a rail platoon, a contact force at a leading end of the rail vehicle in a direction of traversal; and \nautonomously controlling the rail vehicle within the rail platoon, comprising: \ndetermining a platoon control target; and \nbased on the contact force, controlling an independent powertrain of the rail vehicle to achieve the platoon control target. \n | 10. The method of claim 9, wherein the independent powertrain comprises a battery-electric powertrain.\n | 11. The method of claim 10, wherein autonomously controlling the rail vehicle comprises regeneratively braking with the battery-electric powertrain.\n | 12. The method of claim 9, wherein the leading end comprises an abutment surface of a bumper which is damped relative to a chassis of the autonomous rail vehicle.\n | 13. The method of claim 9, wherein the platoon control target comprises a target contact force.\n | 14. The method of claim 13, wherein the target contact force is based on a relative energy distribution of the rail platoon.\n | 15. 
The method of claim 9, wherein the platoon control target comprises a speed setpoint.\n | 16. The method of claim 9, further comprising: \nreceiving, at the autonomous rail vehicle, a set of dispatch instructions associated with a coordinated separation of the rail platoon; and \nbased on the set of dispatch instructions, controlling the independent powertrain of the rail vehicle to separate from a leading portion of the rail platoon at the leading end. \n | 17. The method of claim 16, wherein the coordinated separation is based on a track geometry or a location of a level crossing.\n | 18. The method of claim 16, wherein the set of dispatch instructions comprises a warrant for the autonomous rail vehicle, wherein the control target is determined based on the warrant.\n | 19. The method of claim 9, wherein autonomously controlling the rail vehicle within the rail platoon comprises: at the leading end of the rail vehicle, pushing an adjacent rail vehicle.\n | 20. The method of claim 19, wherein no components span between the rail vehicle and the adjacent rail vehicle.", "output": "The method involves determining a compressive force at a leading end of an autonomous rail vehicle within the rail platoon in a traversal direction. A coordinated braking event is determined. Autonomous rail vehicle braking process is performed while maintaining compression at the leading end in response to determining the coordinated braking event. An independent set of brakes of the autonomous vehicle is autonomously controlled based on the compressive force at the leading end. An abutment surface of a bumper is suspended relative to a chassis of the autonomous rail vehicle. Method for realizing coordinated braking of self-propelling rail cars within rail platoon. 
The method enables reducing the risk of injury to the driver of the vehicle, preventing the vehicle from colliding with other vehicles, and improving operating reliability by eliminating components subject to failure and removing possibilities for human error. The drawing shows a schematic view of a structure for realizing coordinated braking of self-propelling rail cars within a rail platoon. 100Self-propelling rail cars coordinated braking system 110Dispatcher 120Cars 
The method of claim 2, wherein the second railway vehicle is controlled with a feedback controller based on a push force of the second railway vehicle applied on the first railway vehicle in the direction of transit.\n | 5. The method of claim 1, wherein the second set of instructions is received from a remote dispatch system.\n | 6. The method of claim 5, wherein controlling the second railway vehicle to push the first railway vehicle in the direction of transit comprises receiving, at the second railway vehicle, a third set of instructions from the remote dispatch system.\n | 7. The method of claim 1, wherein the second set of instructions corresponds to both the first and second rail vehicles.\n | 8. The method of claim 1, wherein positioning a first railway vehicle along a track comprises: controlling a powertrain of the first railway vehicle with an autonomous controller of the first railway vehicle based on a location of the railway vehicle.\n | 9. The method of claim 1, wherein the second railway vehicle initially contacts the first railway vehicle while the first railway vehicle is substantially stationary.\n | 10. The method of claim 1, wherein the first railway vehicle is autonomous vehicle.\n | 11. The method of claim 1, wherein the second railway vehicle comprises an autonomous electric bogie.\n | 12. A method comprising: \nforming a platoon of rail vehicles comprising: independently controlling each rail vehicle of the platoon to arrange the rail vehicles in series along a track, with abutment between each pair of adjacent rail vehicles of the platoon; and \ncontrolling traversal of the platoon in a first direction, comprising: for each pair of adjacent rail vehicles, controlling a trailing rail vehicle of the pair to push against a leading vehicle of the pair in the first direction. \n | 13. 
The method of claim 12, wherein the abutment between each pair of adjacent rail vehicles of the platoon is continuously maintained during traversal of the platoon.\n | 14. The method of claim 12, wherein forming the platoon comprises simultaneously maneuvering a plurality of the rail vehicles within a rail yard.\n | 15. The method of claim 12, wherein controlling traversal of the platoon comprises, at a forwardmost rail vehicle of the platoon relative to the first direction: controlling traversal of the forwardmost rail vehicle according to a set of commands, wherein the set of commands are propagated rearwardly through the rail vehicles in series based on the motion of the forwardmost rail vehicle.\n | 16. The method of claim 15, wherein the forwardmost rail vehicle is controlled via a velocity controller or torque controller.\n | 17. The method of claim 12, further comprising: while controlling traversal of the platoon in the first direction, executing a coordinated deceleration of the platoon.\n | 18. The method of claim 17, wherein the coordinated deceleration is based on a plurality of wireless vehicle-to-vehicle (V2V) communications.\n | 19. The method of claim 12, wherein each rail vehicle of the platoon comprises a pair of electric bogies.\n | 20. The method of claim 19, wherein each electric bogie is autonomous and configured to be independently maneuverable.", "output": "The method involves positioning first railway vehicle (120) along a track, and providing first set of instructions to second railway vehicle. Traversal of the second railway vehicle is controlled based on the first set of instructions until the second railway vehicle abuts the first railway vehicle. Second set of instructions is provided to the first railway vehicle. Traversal of the first railway vehicle is controlled in direction of transit based on the second set of instructions. 
Abutment is maintained between the second railway vehicle and the first railway vehicle, while controlling traversal of the first railway vehicle based on the second set of instructions, by controlling the second railway vehicle to push the first railway vehicle in the direction of transit. The first railway vehicle is an autonomous vehicle. The second railway vehicle comprises an autonomous electric bogie. Method for platooning a vehicle, e.g. an autonomous vehicle such as a self-propelling railcar, and an autonomous electric bogie (all claimed), in the payload transportation field. The method enables reducing the risk of injury to the driver of the vehicle and preventing the vehicle from colliding with other vehicles. The method enables maintaining an energy source of the lead vehicle to facilitate continuous autonomous protection at the lead vehicle and maintain a continuous energy supply without power contributions from a powertrain of the lead vehicle. The drawing shows a schematic block diagram of a system for platooning a vehicle. 100System for platooning vehicle 110Dispatcher 120Railway vehicle V2IVehicle-to-infrastructure V2VVehicle-to-vehicle communications 
A vehicle-mounted terminal (4), mounted on a vehicle (1) that has finished passing through the object section of the road and serving as an information providing source, sends a photographic image and additional information related to an event of interest, detected while passing through the object section, to a roadside machine (6) at the end point of the object section. The roadside machine (6) at the end point sends the photographic image and additional information related to the event of interest to a roadside machine (6) at the start point of the object section. The roadside machine (6) at the start point sends the photographic image and additional information related to the event of interest to a vehicle-mounted terminal, serving as the information providing destination, that is about to start passing through the object section. The vehicle-mounted terminal as the information providing destination performs processing related to the driving control of its vehicle based on the photographic image related to the event of interest and the additional information.|1. A driving assistance method, wherein a vehicle-mounted device mounted on a vehicle passing through an object section of a road, as an information providing source, transmits to a road-side device vehicle-mounted information indicating that the vehicle is equipped with a driving recorder and an image publishing function. 
the vehicle-mounted device sends a photographic image and additional information related to the attention event to the road side device when the attention event is detected during the passage of the object interval; when the road side device does not receive the vehicle-mounted information within a predetermined time from the last reception of the vehicle-mounted information, the detection of the attention event is started; when the attention event is detected, the photographic image and additional information related to the attention event are collected, and the collected photographic image and the additional information are transmitted, directly or via other road side devices, to a vehicle-mounted device, serving as an information providing destination, mounted on a vehicle that is about to start passing through the object interval; when the road side device receives the vehicle-mounted information from the vehicle-mounted device, it stops the detection of the attention event and the transmission of the photographic image and additional information collected by the device, and sends the photographic image and additional information related to the attention event received from the vehicle-mounted device, directly or via other road side devices, to the vehicle-mounted device serving as the information providing destination of the vehicle that is about to start passing through the object interval. The vehicle-mounted device as the information providing destination performs processing related to the travel control of the vehicle based on the photographic image related to the attention event and the additional information.\n | 2. The travel assistance method according to claim 1, wherein the additional information comprises the interval information of the interval where the vehicle is currently located, the information related to the travel direction of the vehicle, the information related to the content of the attention event, and the position information of the vehicle on the map when the attention event is detected. 
Position information of a place where an event of interest occurs on a photographic image, and time information of when the event of interest is detected.\n | 3. The driving assistance method according to claim 1, wherein the vehicle-mounted device as the information providing destination makes a determination that the vehicle is approaching the place of occurrence of the event of interest based on the similarity between the shot image related to the attention event and the shot image at the current time point.\n | 4. The travel assistance method according to claim 3, wherein the vehicle-mounted device as the information providing destination outputs a risk avoidance operation instruction to the travel control device when it is determined that the vehicle is approaching the occurrence point of the event of interest.\n | 5. The travel assistance method according to claim 1, wherein the on-vehicle device as the information supply source transmits the photographic image and additional information related to the event of interest to a road side device located at the end point side of the object section; the road side device at the end point side sends the photographic image and additional information related to the attention event to the road side device at the starting point side of the object interval; and the road side device at the starting point side sends the photographic image and additional information related to the attention event to the vehicle-mounted device as the information providing destination.\n | 6. The travel assistance method according to claim 1, wherein the vehicle-mounted device as an information supply source is mounted on a vehicle passing in the first direction in the object section. 
The vehicle-mounted device as an information providing destination is mounted on a vehicle passing in a second direction opposite to the first direction in the object section, the road side device extracts the photographic image of the moving object at the front side of the shielding object from the photographic image obtained from the vehicle-mounted device as the information providing source passing to the first direction, calculating the position of the moving object on the photographic image in the second direction obtained by the vehicle-mounted device of the vehicle passing from the past to the second direction, generating a composite image obtained by superposing the photographic image of the moving object on the photographic image in the second direction based on the calculated position, The roadside device transmits the composite image as a photographic image related to the event of interest to the on-vehicle device as an information providing destination.\n | 7. The driving assistance method according to claim 1, wherein the attention event is a traffic accident, the road side device accumulates the captured image and additional information related to the attention event obtained in the vehicle-mounted device or the device as the information providing source in the device. In addition to the latest photographic image and additional information related to the event of interest, the roadside device also transmits the photographic image and additional information related to the event of interest accumulated in the local device to the on-vehicle device as an information providing destination.\n | 8. 
A road photographic image collecting method, wherein the vehicle-mounted device loaded on the vehicle passing in the object interval of the road sends the vehicle-mounted information indicating that the vehicle is loaded with the driving recorder and the image issuing function to the road side device, the vehicle-mounted device sends the shot image and additional information related to the interest event to the road side device when the interest event is detected during the passage of the object interval, when the road side device does not receive the vehicle-mounted information within a predetermined time from the vehicle-mounted information received by the vehicle-mounted device, the detection of the attention event is started, when the attention event is detected, the photographic image and additional information related to the attention event are collected, the collected photographic image and the additional information are transmitted to the server device, the road side device stops the detection of the attention event and the transmission of the photographic image and the additional information collected by the device under the condition that the vehicle carrying information is received from the vehicle-mounted device, sending the photographic image and additional information related to the event of interest received from the vehicle-mounted device to the server device, the server device accumulates the photographic image and additional information related to the event of interest, the server device according to the browsing request of the designated place from the user, A photographic image and additional information relating to the event of interest of the designated location is presented to the user.\n | 9. 
A road side device, comprising: a road-to-vehicle communication part for communicating with the vehicle-mounted device; and a processor, wherein the processor receives vehicle-mounted information indicating that the vehicle is mounted with a driving recorder and an image publishing function from a vehicle-mounted device which is used as an information providing source and is mounted on a vehicle which has passed in an object interval of the road, when the processor does not receive the vehicle loading information within a predetermined time from the last receiving of the vehicle loading information, the detection of the attention event is started, when the attention event is detected, the photographic image and additional information related to the attention event are collected, sending the collected photographic image and the additional information directly or via other road side device to the vehicle-mounted device loaded on the vehicle to be started to pass in the object interval as the information providing destination, the processor receives the vehicle loading information, stopping the detection of the attention event and the transmission of the photographic image and additional information collected by the device, receiving a photographic image and additional information related to an event of interest detected by the vehicle-mounted device as the information providing source during the passage of the object interval from the vehicle-mounted device as the information providing source through the road-to-vehicle communication part, The processor transmits the received photographic image and additional information related to the attention event to the vehicle-mounted device as the information providing destination of the vehicle to be started to pass in the object interval directly or via other road-side devices via the road-room communication part.\n | 10. 
The road side device according to claim 9, wherein, when the device is located at the end point of the target section, the road-to-vehicle communication part receives the photographic image and additional information related to the event of interest from the vehicle-mounted device serving as the information providing source, and, when the device is located at the starting point of the target section, the road-to-vehicle communication part sends the photographic image and additional information related to the event of interest to the vehicle-mounted device serving as the information providing destination.\n | 11. The roadside device according to claim 9, wherein the road-to-vehicle communication part receives a photographic image and additional information related to the event of interest from the vehicle-mounted device serving as the information providing source, mounted on a vehicle passing through the target section in a first direction, and sends the photographic image and additional information related to the event of interest to the vehicle-mounted device serving as the information providing destination on a vehicle passing through the target section in a second direction opposite to the first direction; the processor extracts a photographic image of a moving object currently in front of a shielding object from the photographic image obtained from the information-providing-source vehicle-mounted device passing in the first direction, calculates the position of that moving object on a second-direction photographic image obtained in the past by the vehicle-mounted device of a vehicle passing in the second direction, and generates a composite image in which the photographic image of the moving object is superimposed on the second-direction photographic image based on the calculated position; and the processor transmits the composite image, as a photographic image related to the event of interest, to the vehicle-mounted device serving as the information providing destination.\n | 12. 
The roadside device according to claim 9, wherein the event of interest is a traffic accident, and the processor accumulates, in the storage unit of the device, the photographic images and additional information related to the event of interest obtained from the vehicle-mounted device serving as the information providing source; in addition to the latest photographic image and additional information related to the event of interest, the processor also transmits the photographic images and additional information related to the event of interest accumulated in the storage unit to the vehicle-mounted device serving as the information providing destination.\n | 13. A road side device, comprising: a road-to-vehicle communication part for communicating with vehicle-mounted devices; and a processor, wherein the processor receives vehicle-mounted information, indicating that the vehicle is equipped with a driving recorder and an image publishing function, from a vehicle-mounted device serving as an information providing source and mounted on a vehicle that has passed through a target section of the road; when the processor has not received the vehicle-mounted information within a predetermined time from the last reception of the vehicle-mounted information, it starts its own detection of events of interest, collects a photographic image and additional information related to any detected event of interest, and sends the collected photographic image and additional information to the server device; when the processor receives the vehicle-mounted information, it stops its own detection of events of interest and its transmission of the photographic images and additional information collected by the device, and receives, through the road-to-vehicle communication part, a photographic image and additional information related to an event of interest detected by the vehicle-mounted device serving as the information providing source during its passage through the target section; and the
processor transmits the received photographic image and additional information relating to the event of interest to a server device.", "output": "The method involves the vehicle-mounted device transmitting the photographed image and additional information regarding a noteworthy event to a roadside device when the event is detected while passing through the target section. The captured image and additional information related to the event are transmitted, directly or through another roadside device, to the information-providing-destination in-vehicle device mounted on a vehicle (1) that is starting its passage through the target section. Processing related to travel control of the own vehicle is performed based on the photographed image and additional information related to the event. INDEPENDENT CLAIMS are included for the following: a road picked-up image collection method; and a roadside apparatus. Driving assistance method for assisting traveling control of a vehicle. The safe driving of an automatic drive vehicle can be assisted appropriately, and the control load of performing danger-avoidance movements can be reduced. The drawing shows a schematic view of the driving assistance system. 
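The claimed hand-off between the roadside device's own event detection and the vehicle-provided images reduces to a small timer-driven state machine. The sketch below is illustrative only: `RoadsideUnit`, its method names, and the polling scheme are assumptions, not from the patent or any real API; only the rule itself (self-detect only while no recorder-equipped vehicle has announced itself within a predetermined time) comes from the claims.

```python
class RoadsideUnit:
    """Sketch of the claimed roadside-device behavior: the device detects
    attention events itself only while no recorder-equipped vehicle has
    announced itself within a predetermined time window. All names and the
    polling scheme are illustrative assumptions."""

    def __init__(self, timeout_s=60.0):
        self.timeout_s = timeout_s        # the claims' "predetermined time"
        self.last_vehicle_info = None     # time of last vehicle-mounted info
        self.own_detection_active = True  # device-side detection enabled?

    def on_vehicle_info(self, now):
        # A passing vehicle announced a driving recorder with an image
        # publishing function: stop own detection and rely on the vehicle.
        self.last_vehicle_info = now
        self.own_detection_active = False

    def relay_vehicle_image(self, image, additional_info, server):
        # Forward vehicle-captured material for the event to the server.
        server.append((image, additional_info))

    def poll(self, now):
        # Resume own detection once no announcement arrived in the window.
        if (self.last_vehicle_info is None
                or now - self.last_vehicle_info > self.timeout_s):
            self.own_detection_active = True
        return self.own_detection_active
```

In this sketch the same object can sit at either end of the target section; the claims distinguish the source-side (receiving) and destination-side (sending) roles, which here would just be two `RoadsideUnit` instances.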
(Drawing includes non-English language text.) 1 Vehicle; 2 Camera; 4 Vehicle-mounted terminal; 6 Traveling control apparatus", "instruction": "Please summarize the input" }, { "input": "METHOD FOR DETERMINING TRAFFIC VIOLATION VEHICLE WHICH EXISTS ON DRIVING ROUTE OF AUTONOMOUS VEHICLE AND THE DEVICE USING THE SAME. According to the present invention, there is provided a method for determining whether a vehicle violating traffic laws is present on the driving path of an autonomous vehicle in motion, comprising: (a) in a state in which (i) a first information acquisition module including at least one of a camera, RADAR, and LIDAR, (ii) a second information acquisition module including a V2X (Vehicle to Everything) communication module, and (iii) a third information acquisition module including a GPS (Global Positioning System) module are mounted in the autonomous vehicle, the offending vehicle determination device corresponding to the autonomous vehicle performing (i) a process of analyzing the first data acquired by the first information acquisition module, or causing the first information acquisition module to analyze it, to acquire detection information about at least one other vehicle driving in the vicinity, (ii) a process of analyzing the second data acquired by the second information acquisition module, or causing the second information acquisition module to analyze it, to acquire signal information of at least one traffic light existing in the vicinity of the autonomous vehicle, and (iii) a process of analyzing the third data acquired by the third information acquisition module, or causing the third information acquisition module to analyze it, to acquire first location information of the autonomous vehicle; (b) the violating vehicle judging device determining, by referring to at least a part of the detection information, the signal information, and the first location information, whether a traffic law violating vehicle exists on the driving path of the autonomous vehicle; and (c) when
there is a vehicle violating traffic laws, the violating vehicle judging device, among the detection information, the signal information, and the first location information, specific detection information related to the traffic law violation vehicle, specific signal information and allowing the traffic law violation evidence collection module to store and manage traffic law violation evidence information including at least specific first location information; There is provided a method comprising a.|1. A method for determining whether a vehicle violating traffic laws is present on a driving route of an autonomous driving vehicle in driving, comprising: (a) a first information acquisition module including (i) a camera module, a RADAR module, and a LIDAR module in the autonomous driving vehicle , (ii) a second information acquisition module including a V2X (Vehicle to Everything) communication module, and (iii) a third information acquisition module including a GPS (Global Positioning System) module are mounted, the autonomous driving The offending vehicle judging device corresponding to the vehicle (i) analyzes the first data acquired by the first information acquisition module or causes the first information acquisition module to analyze at least while driving in the vicinity of the autonomous vehicle A process of acquiring detection information for one other vehicle, (ii) analyzing the second data acquired by the second information acquisition module or causing the second information acquisition module to analyze, a process of obtaining signal information of at least one existing traffic light; and (iii) analyzing the third data acquired by the third information acquisition module or performing a process of acquiring the first location information of the autonomous vehicle by causing the third information acquisition module to analyze;\n(b) the violating vehicle judging device determines whether the traffic law violating vehicle exists on the driving path of the 
autonomous vehicle by referring to at least a part of the detection information, the signal information, and the first location information to do; and (c) when there is a vehicle violating traffic laws, the violating vehicle judging device, among the detection information, the signal information, and the first location information, specific detection information related to the traffic law violation vehicle, specific signal information and allowing the traffic law violation evidence collection module to store and manage traffic law violation evidence information including at least specific first location information; \nand, when image data photographing the surrounding conditions of the autonomous vehicle is obtained from the camera module, the offending vehicle determination device causes the deep learning module interlocked with the offending vehicle determination device with respect to the image data to perform a predetermined to detect the other vehicle using an image object detection algorithm of If present, support to use a Single Shot Multibox Detector (SSD) as the image object detection algorithm, and (ii) the specific unit area area in which the other vehicle exists over the first threshold in the entire area of the image data It is characterized in that it supports YOLO (You Only Look Once) to be used as the image object detection algorithm when it does not exist, in the step (a), The offending vehicle determination device fuses each of the detection information of the other vehicle obtained from each of the first information acquisition module using a Kalman filter-based sensor fusion algorithm, (i) an environment factor applying the sensor fusion algorithm In a situation where it is determined that the weight of the time factor is high, the EKF (Extended Kalman Filter) algorithm is supported to be used, and (ii) it is determined that the weight of the accuracy factor is high among the environmental factors to which the sensor fusion algorithm is applied. 
is characterized in that it supports the UKF (Unscented Kalman Filter) algorithm to be used, and in step (b), the offending vehicle determination device includes (i) the detection information, the signal information, and the first location information a process of generating a first LDM (Local Dynamic Map) in which dynamic driving information of each of the autonomous vehicle and the other vehicle is linked with predetermined map data with reference to at least a part of the information; (ii) a process of obtaining each of the second LDMs generated by each of the other vehicles using the V2X communication module; and (iii) combining each of the first LDM and the second LDM to generate one expanded map; characterized in that to perform, characterized in that the violating vehicle determination device, with additional reference to the expansion map, to determine whether the traffic law violation vehicle exists, in the step (a), the violating vehicle determination The device allows the V2X communication module to transmit each SPAT (Signal Phase and Timing) message from each RSU (Rode Side Unit) interlocked with each of the at least one traffic light located in the vicinity of the autonomous vehicle as 2-1 data. It is characterized in that the signal information is obtained by receiving and analyzing or directly analyzing, and the offending vehicle determination device causes the V2X communication module to obtain a BSM (Basic Safety Message) of each of the other vehicles from each of the other vehicles. 
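The Kalman-filter-based fusion of camera/RADAR/LIDAR detections and the EKF-versus-UKF selection rule described above can be illustrated with a minimal sketch. The inverse-variance combination is only the static special case of a Kalman update, and the variances, the weight dictionary, and the tie-breaking direction are assumptions; the claims state only that EKF is used when the time factor dominates and UKF when the accuracy factor dominates.

```python
def fuse_detections(measurements):
    """Inverse-variance fusion of per-sensor position estimates (the static
    special case of the Kalman update the claims invoke for combining
    camera/RADAR/LIDAR detections). `measurements` is a list of
    (estimate, variance) pairs; values here are illustrative."""
    num = sum(z / var for z, var in measurements)
    den = sum(1.0 / var for z, var in measurements)
    return num / den, 1.0 / den  # fused estimate and its variance


def choose_filter(weights):
    """Claimed selection rule: EKF when the time (latency) factor dominates
    among the environmental factors, UKF when the accuracy factor dominates.
    The tie-breaking direction is an assumption."""
    return "EKF" if weights["time"] >= weights["accuracy"] else "UKF"
```

The rationale matches the usual trade-off: the EKF linearizes and is cheaper per update, while the UKF's sigma-point propagation is typically more accurate for strongly nonlinear motion models.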
It is characterized in that the driving information of each of the other vehicles is additionally acquired by receiving and analyzing as the 2-2 data, or directly analyzing it, and in the step (b), the offending vehicle determining apparatus By additionally referring to each of the driving information of each of the other vehicles, it is determined whether there is a specific collision-anticipated other vehicle expected to collide on the driving path of the autonomous vehicle among the other vehicles, and the specific collision-anticipated other vehicle is determined to be the vehicle violating traffic laws when there is, and in step (c), the traffic law violation evidence information corresponds to the traffic law violating vehicle among the driving information of each of the other vehicles. When it is possible to receive each of the BSMs of each of the other vehicles using the V2X communication module, the offending vehicle determination device may further include specific driving information that is By predicting the trajectory of each of the other vehicles using a constant turn rate and acceleration (CTRA) model assuming that the (yaw rate) value is constant, It is characterized in that it is determined whether each of the other vehicles has a risk of colliding with the autonomous vehicle, and after step (c), (d) the offending vehicle determination device causes the traffic law violation evidence collection module to The traffic law violation evidence information corresponding to the traffic law violation vehicle is transmitted to a report module, so that the report module sends the report module to a specific enforcement agency corresponding to the traffic law violation type of the traffic law violation vehicle among a plurality of enforcement agencies. 
Supporting to report at least a part of the traffic law violation evidence information; \nIt characterized in that it further comprises, the report module, characterized in that the information on each traffic law violation report interface provided from each of the enforcement agencies is stored in advance, the violation vehicle judging device, the report Support the module to automatically input and report specific traffic law violation evidence information that can be entered into the specific traffic law violation report interface among the traffic law violation evidence information through the specific traffic law violation report interface provided by the specific enforcement agency A method characterized in that \n | 2. delete\n | 3. The method according to claim 1, wherein when image data photographing the surrounding conditions of the autonomous vehicle is obtained from the camera module, the offending vehicle determination device uses a deep learning module interlocked with the offending vehicle determination device for the image data. 
to detect the other vehicle using a predetermined image object detection algorithm, wherein, before step (a), the offending vehicle determining device performs learning of the deep learning module using predetermined learning data; the learning of the deep learning module calculates a loss value using a focal loss that increases the weight for at least one specific first class for which the correct rate of the class prediction value is derived to be less than a second threshold, and reduces the weight for at least one specific second class for which a correct rate above the second threshold is derived; the process of optimizing a plurality of parameters included in the deep learning module is repeated by performing backpropagation so that the loss value is minimized; and the class prediction value is a probability value predicting whether an object included in the image data is an object of the corresponding class.\n | 4. delete\n | 5. delete\n | 6. The method of claim 1, wherein, when the offending vehicle determination device combines each of the first LDM and the second LDM, if there is a specific other vehicle having different position coordinates among the other vehicles, they are corrected into one absolute position coordinate through a Maximum Likelihood Estimation (MLE) technique.\n | 7. delete\n | 8. delete\n | 9. delete\n | 10. delete\n | 11. A vehicle judging device for judging whether a vehicle violating traffic laws exists on the driving route of an autonomous vehicle in motion, comprising: at least one memory for storing instructions; and at least one processor configured to execute the instructions. 
including, wherein the processor includes (I) a first information acquisition module including (i) a camera module, a RADAR module, and a LIDAR module in the autonomous vehicle, (ii) a V2X (Vehicle to Everything) communication module. In a state in which a third information acquisition module including a second information acquisition module and (iii) a global positioning system (GPS) module is mounted, (i) analyzes the first data acquired by the first information acquisition module or a sub-process of obtaining detection information for at least one other vehicle driving in the vicinity of the autonomous vehicle by causing the first information acquisition module to analyze; (ii) the second information acquisition module acquired by the second information acquisition module. 2 A sub-process of analyzing data or causing the second information acquisition module to analyze to acquire signal information of at least one traffic light existing in the vicinity of the autonomous vehicle; and (iii) analyzing the third data acquired by the third information acquisition module or having the third information acquisition module analyze to perform a sub-process of acquiring the first location information of the autonomous vehicle; (II) a process of determining whether the traffic law violation vehicle exists on the driving path of the autonomous vehicle by referring to at least a part of the detection information, the signal information, and the first location information; and (III) when the traffic law violation vehicle exists, specific detection information related to the traffic law violation vehicle among the detection information, the signal information, and the first location information, the specific signal information, and the specific first location information a process for allowing the traffic law violation evidence collection module to store and manage traffic law violation evidence information including at least; characterized in that, when the image data 
photographing the surrounding situation of the autonomous vehicle is obtained from the camera module, the processor causes the deep learning module interlocked with the offending vehicle determination device with respect to the image data The other vehicle is detected using a predetermined image object detection algorithm, and the processor is configured to: (i) the other vehicle corresponding to the number equal to or greater than the first threshold within a specific unit area area among the entire area of the image data. In this case, a Single Shot Multibox Detector (SSD) can be used as the image object detection algorithm, and (ii) the specific unit area area in which the other vehicle exists above the first threshold exists in the entire area of the image data. If not, it is characterized in that it supports YOLO (You Only Look Once) to be used as the image object detection algorithm, and in the (I) process, The processor fuses each of the detection information of the other vehicle obtained from each of the first information acquisition module using a Kalman filter-based sensor fusion algorithm, (i) a time factor among environmental factors to which the sensor fusion algorithm is applied In a situation where the weight of is determined to be high, the EKF (Extended Kalman Filter) algorithm is supported to be used, and (ii) in a situation where the weight of the accuracy factor is determined to be high among the environmental factors to which the sensor fusion algorithm is applied, the UKF ( Unscented Kalman Filter) algorithm is supported to be used, and in the process (II), the processor, (i) referring to at least a part of the detection information, the signal information, and the first location information , a process of generating a first LDM (Local Dynamic Map) in which dynamic driving information of each of the autonomous vehicle and the other vehicle is linked with predetermined map data; (ii) a process of obtaining each of the second LDMs generated by 
each of the other vehicles using the V2X communication module; and (iii) combining each of the first LDM and the second LDM to generate one expanded map; characterized in that to perform, characterized in that the processor, with further reference to the extension map, characterized in that the determination of whether the traffic violation vehicle exists, in the (I) process, the processor, the V2X communication Let the module receive and analyze each of the Signal Phase and Timing (SPAT) messages as 2-1 data from each of the RSUs (Rode Side Units) interlocked with each of the at least one traffic light located in the vicinity of the autonomous vehicle, or Direct analysis, characterized in that the signal information is obtained, and the processor causes the V2X communication module to receive the BSM (Basic Safety Message) of each of the other vehicles from each of the other vehicles as 2-2 data. It is characterized in that each of the driving information of each of the other vehicles is additionally obtained by analyzing it or directly analyzing it, and in the process (II), the processor, By additionally referring to each of the driving information of each of the other vehicles, it is determined whether there is a specific collision-anticipated other vehicle expected to collide on the driving path of the autonomous vehicle among the other vehicles, and the specific collision-anticipated other vehicle is determined as the traffic law-violating vehicle when there is, and in the process (III), the traffic law violation evidence information corresponds to the traffic law-violating vehicle among the driving information of each of the other vehicles. 
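The CTRA (constant turn rate and acceleration) rollout the claims use to extrapolate another vehicle's path from the yaw-rate value in its BSM can be sketched with simple Euler integration. The step size, horizon, and parameter names are assumptions; a production implementation would use the closed-form CTRA equations rather than this discretization.

```python
import math

def predict_ctra(x, y, heading, speed, yaw_rate, accel, horizon_s, dt=0.1):
    """Roll a CTRA motion model forward: heading changes at the constant
    yaw rate, speed at the constant acceleration. Returns the predicted
    (x, y) positions at each step. Euler integration, illustrative only."""
    trajectory = []
    for _ in range(int(round(horizon_s / dt))):
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        heading += yaw_rate * dt  # constant turn rate (from the BSM)
        speed += accel * dt       # constant acceleration
        trajectory.append((x, y))
    return trajectory
```

A collision check against the ego vehicle would then reduce to thresholding the distance between the two predicted trajectories at matching time steps.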
It is characterized in that it further includes specific driving information to be used, and when each of the BSMs of each of the other vehicles can be received using the V2X communication module, the processor, the yaw rate included in the BSM ) by predicting the trajectory of each of the other vehicles using a constant turn rate and acceleration (CTRA) model assuming that the value is constant, It is characterized in that it is determined whether each of the other vehicles has a risk of colliding with the autonomous vehicle, and after the (III) process, (IV) the processor causes the traffic law violation evidence collection module to cause the traffic law The traffic law violation evidence information corresponding to the violating vehicle is transmitted to the report module, and the report module causes the report module to transmit the traffic law violation to a specific enforcement agency corresponding to the traffic law violation type of the traffic law violating vehicle among a plurality of enforcement agencies. a process of supporting to report at least some of the evidence information; characterized in that it further performs, the report module, characterized in that the information on each traffic law violation report interface provided from each of the enforcement agencies is stored in advance, and the processor causes the report module to Through the specific traffic law violation report interface provided by the specific enforcement agency, the specific traffic law violation evidence information that can be inputted into the specific traffic law violation report interface among the traffic law violation evidence information is automatically input to support reporting Violation vehicle judgment system.\n | 12. delete\n | 12. 
The method of claim 11, wherein, when image data photographing the surrounding conditions of the autonomous vehicle is obtained from the camera module, the processor causes the deep learning module interlocked with the offending vehicle determination device to detect the other vehicle in the image data using a predetermined image object detection algorithm; before process (I), the processor performs learning of the deep learning module using predetermined learning data; the learning of the deep learning module calculates a loss value using a focal loss that increases the weight for at least one specific first class for which the correct rate of the class prediction value is derived to be less than a second threshold, and reduces the weight for at least one specific second class for which a correct rate above the second threshold is derived; the process of optimizing a plurality of parameters included in the deep learning module is repeated by performing backpropagation so that the loss value is minimized; and the class prediction value is a probability value predicting whether an object included in the image data is an object of the corresponding class.\n | 14. delete\n | 15. delete\n | 16. The device of claim 11, wherein, when the processor combines each of the first LDM and the second LDM, if there is a specific other vehicle having different position coordinates among the other vehicles, they are corrected into one absolute position coordinate through a Maximum Likelihood Estimation (MLE) technique.\n | 17. delete\n | 18. delete\n | 19. delete\n | 20. 
delete", "output": "The method involves analyzing data acquired by an information acquisition module by an offending vehicle judging device (100) corresponding to a vehicle. Signal information of an existing traffic light is acquired, and location information of the autonomous vehicle is acquired. A determination is made whether a traffic law violating vehicle exists on a driving path of an autonomous vehicle by referring to a part of the detection information, signal information and the location information. A traffic law violation evidence collection module (150) is utilized to store and manage traffic law violation evidence information including specific first location information. An INDEPENDENT CLAIM is also included for vehicle judging device for judging whether a vehicle violating traffic laws exists on a driving route of an autonomous vehicle in motion. The method is useful for determining whether a vehicle violating traffic laws exists on a driving path of an autonomous vehicle. The information on a vehicle violating traffic laws existing on a driving path of an autonomous vehicle is automatically reported to an enforcement agency. 
The drawing shows a diagram schematically illustrating the overall system in which an offending vehicle determination device determines whether a vehicle violating traffic laws exists on the driving path of an autonomous vehicle (Drawing includes non-English language text). 100 Offending vehicle judging device; 150 Traffic law violation evidence collection module", "instruction": "Please summarize the input" }, { "input": "Method and monitoring server for verifying the operation of autonomous vehicles using the Quality Control verification app. In the method of verifying the operation of an autonomous vehicle using a QC (Quality Control) verification app, a specific verification scenario for a specific road section is divided into a plurality of verification sections, and, in a state in which at least one reference PVD (Probe Vehicle Data) for each operation event that the autonomous vehicle is to perform in each verification section is stored in the database, and the driving PVD (Probe Vehicle Data) of the autonomous vehicle is being transmitted to the control server, (a) the control server acquires a specific driving PVD transmitted from the autonomous driving vehicle, the specific driving PVD being the driving PVD of the autonomous vehicle corresponding to a specific operation event included in a specific verification section, among the plurality of verification sections, that the autonomous driving vehicle is performing; (b) the control server performs (i) a process of obtaining first verification result information transmitted by the user riding in the autonomous vehicle, the first verification result information being information, input by the user through the QC verification app, on whether the operation performed by the autonomous driving vehicle for the specific operation event is successful, and (ii) a process of obtaining second verification result information, which is the result of determining whether the specific
reference PVD is matched; and (c) the control server determines whether the first verification result information and the second verification result information match, and a third information that is final verification result information of the operation performed by the autonomous vehicle with respect to the specific operation event A method and a control server for verifying the operation of an autonomous vehicle, comprising: obtaining verification result information; are disclosed.|1. In the method of verifying the operation of an autonomous vehicle using a QC (Quality Control) verification app, a specific verification scenario for a specific road section is divided into a plurality of verification sections, and the autonomous vehicle for each verification section In a state that at least one reference PVD (Probe Vehicle Data) for each operation event to be operated is stored in the database, and the driving PVD (Prove Vehicle Data) of the autonomous vehicle is being transmitted to the control server, (a) the Specific driving PVD (Prove Vehicle Data) transmitted from the autonomous driving vehicle by the control server - The specific driving PVD performs a specific operation event included in a specific verification section in which the autonomous driving vehicle is one of the plurality of verification sections acquiring - which is the driving PVD of the autonomous vehicle corresponding to one;\n(b) the control server, (i) the first verification result information transmitted by the user riding in the autonomous vehicle - The first verification result information is the specific operation input by the user through the QC verification app A process of obtaining - which is information on whether the operation performed by the autonomous driving vehicle for the event is successful performing a process of obtaining second verification result information, which is a result of determining whether the specific reference PVD is matched; and (c) the control server 
determines whether the first verification result information and the second verification result information match, and a third information that is final verification result information of the operation performed by the autonomous vehicle with respect to the specific operation event obtaining verification result information; \nA method of verifying the operation of an autonomous vehicle comprising a. \n | 2. The method according to claim 1, wherein each of the reference PVD and the driving PVD is used for verification of (i) a standard field indicating a data item of the autonomous vehicle standardized for V2X communication and (ii) each of the plurality of operation events A method for verifying the operation of an autonomous vehicle, characterized in that it includes a non-standard field indicating a data item of the autonomous vehicle.\n | 3. The method according to claim 1, wherein the control server additionally acquires traffic signal information transmitted from a plurality of traffic signal controllers installed in the specific road section, (i) the autonomous vehicle communicates with the plurality of traffic signal controllers V2X an indirect acquisition method of acquiring the traffic signal information through A method of verifying the operation of an autonomous vehicle, characterized in that the traffic signal information is acquired by at least one of direct acquisition methods for acquiring the traffic signal information through direct communication with a traffic signal controller of\n | 4. 
According to claim 3, wherein the control server, with reference to the specific driving PVD, displays or supports displaying, on the control display of the control system, a screen on which the location of the autonomous vehicle is displayed on a map including the specific road section, and displays or supports displaying, on the control display, at least one of (i) first connection information allowing the manager of the control system to access the vehicle status information of the autonomous driving vehicle and (ii) second connection information allowing the manager to access the verification status information of the autonomous driving vehicle, A method of verifying the operation of an autonomous vehicle.\n | 5. According to claim 4, when the first connection information is selected by the manager, the control server displays or supports displaying, on the control display, at least one of (i) information related to the state of the internal device of the autonomous vehicle included in the vehicle state information and (ii) information related to the driving state of the autonomous vehicle; The method of verifying the operation of an autonomous vehicle, characterized in that the control server displays or supports displaying, on the control display, the traffic signal information including at least one of specific speed limit information and specific traffic sign information.\n | 6. The method of claim 4, wherein when the second connection information is selected by the manager, the control server displays or supports displaying, on the control display, verification state information including at least one of the information on the specific verification section, the information on the specific operation event, the first verification result information, the second verification result information, and the third verification result information.\n | 7. 
According to claim 1, wherein the first verification result information is specific information input by the user - The specific information is detailed information about the failure (fail) with respect to the specific operation event of the autonomous vehicle Information corresponding to - A method of verifying the operation of an autonomous vehicle comprising:\n | 8. According to claim 1, wherein the control server, (i) When obtaining the third verification result information corresponding to the case where the first verification result information and the second verification result information match, the control server responds to the specific operation event the verification is completed, and (ii) when the third verification result information corresponding to the case where the first verification result information and the second verification result information do not match is obtained, the first verification result information, the A method of verifying the operation of the autonomous vehicle, characterized in that the second verification result information and the third verification result information are reported and stored.\n | 9. The method of claim 1, wherein the control server repeats steps (a) to (c) for each of the plurality of verification sections.\n | 10. 
The method of claim 9, wherein the verification of the operation of the autonomous vehicle is repeated for each of the plurality of verification sections to count the cycle order for the specific road section, and the control server sets as a mission for the specific road section With reference to the mission order data, it is determined whether the circulating order satisfies the mission order data, and if the order does not satisfy the mission order data, verification of the operation of the autonomous vehicle is performed on the plurality of A method of verifying the operation of an autonomous vehicle, characterized in that the number of cycles is increased for the specific road section by repeating for each verification section.\n | 11. In the control server that verifies the operation of an autonomous vehicle using a QC (Quality Control) verification app, in a state in which the driving PVD (Prove Vehicle Data) of the autonomous vehicle is being transmitted to the control server, in a specific road section a database in which a specific verification scenario is divided into a plurality of verification sections, and a reference PVD (Probe Vehicle Data) for each operation event in which the autonomous vehicle must operate for each verification section is stored; at least one memory storing instructions; and at least one processor configured to execute the instructions.\n, wherein the processor includes: (1) specific driving PVD (Prove Vehicle Data) transmitted from the autonomous driving vehicle - The specific driving PVD is included in a specific verification section in which the autonomous driving vehicle is one of the plurality of verification sections The process of acquiring - which is the driving PVD of the autonomous vehicle corresponding to the execution of a specific motion event; (2) (i) First verification result information transmitted by the user riding in the autonomous vehicle - The first verification result information is the autonomy for the 
specific operation event input by the user through the QC verification app A process of obtaining - which is information on the success or failure of the operation performed by the driving vehicle; A process of obtaining second verification result information, which is a result of determining whether or not matching, and (3) determining whether the first verification result information and the second verification result information match are performed by the autonomous vehicle for the specific operation event A control server for verifying the operation of an autonomous vehicle, characterized in that it performs a process of obtaining third verification result information, which is the final verification result information of one operation. \n | 12. The method of claim 11 , wherein each of the reference PVD and the driving PVD is used for (i) a standard field indicating a data item of the autonomous vehicle standardized for V2X communication and (ii) verifying each of the plurality of operation events A control server for verifying the operation of the autonomous vehicle, characterized in that it includes a non-standard field indicating the data item of the autonomous vehicle.\n | 13. The method of claim 11, wherein the processor additionally acquires traffic signal information transmitted from a plurality of traffic signal controllers installed in the specific road section, (i) the autonomous vehicle performs V2X communication with the plurality of traffic signal controllers. an indirect acquisition method of acquiring the traffic signal information through the A control server for verifying the operation of an autonomous vehicle, characterized in that the traffic signal information is acquired by at least one of direct acquisition methods for acquiring the traffic signal information through direct communication with a traffic signal controller.\n | 14. 
The method according to claim 13, wherein the processor displays or supports displaying, on the control display of the control system, a screen on which the location of the autonomous vehicle is displayed on a map including the specific road section with reference to the specific driving PVD, and displays or supports displaying, on the control display, at least one of (i) first connection information allowing the manager of the control system to access vehicle status information of the autonomous driving vehicle and (ii) second connection information allowing access to verification status information of the autonomous driving vehicle, A control server for verifying the operation of an autonomous vehicle.\n | 15. The method of claim 14, wherein the processor, when the first connection information is selected by the manager, displays or supports displaying, on the control display, at least one of (i) information related to the state of the internal device of the autonomous vehicle included in the vehicle state information and (ii) information related to the driving state of the autonomous vehicle; The control server for verifying the operation of the autonomous vehicle, characterized in that the processor displays or supports displaying, on the control display, the traffic signal information including at least one of speed limit information and specific traffic sign information.\n | 16. The method of claim 14, wherein the processor, when the second connection information is selected by the manager, displays or supports displaying, on the control display, verification state information including at least one of information on the specific verification section, information on the specific operation event, the first verification result information, the second verification result information, and the third verification result information.\n | 17. 
The method of claim 11, wherein the first verification result information is specific information input by the user - the specific information is detailed information about the failure of the specific operation event of the autonomous vehicle It is information corresponding to - A control server that verifies the operation of the autonomous vehicle, characterized in that it includes.\n | 18. The method of claim 11 , wherein the processor (i) obtains the third verification result information corresponding to a case in which the first verification result information and the second verification result information are identical to each other, When the verification is processed as completed, and (ii) the third verification result information corresponding to the case where the first verification result information and the second verification result information do not match is obtained, the first verification result information, the second verification result information A control server that verifies the operation of the autonomous vehicle, characterized in that the second verification result information and the third verification result information are reported and stored.\n | 19. The control server of claim 11 , wherein the processor repeats steps (1) to (3) for each of the plurality of verification sections.\n | 20. The method of claim 19, wherein the processor repeats the verification of the operation of the autonomous vehicle for each of the plurality of verification sections, counts the circulation order for the specific road section, and is set as a mission for the specific road section With reference to the existing mission order data, it is determined whether the circulating order satisfies the mission order data, and if the order does not satisfy the mission order data, the verification of the operation of the autonomous vehicle is performed by the plurality of verifications. 
A control server for verifying the operation of an autonomous driving vehicle, characterized in that, by repeating for each verification section, the circulation order for the specific road section is increased.", "output": "The method involves a control server (100) obtaining specific driving probe vehicle data (PVD) transmitted from an autonomous driving vehicle (600). First verification result information, which indicates whether the operation performed by the autonomous driving vehicle for the event was successful, is obtained, and second verification result information, which is the result of determining whether the specific reference PVD is matched, is obtained. A determination is made as to whether the first verification result information and the second verification result information match, and third verification result information, which is the final verification result of the operation performed by the autonomous vehicle with respect to the specific operation event, is obtained. An INDEPENDENT CLAIM is included for a control server for verifying operation of autonomous vehicle using quality control verification app. Method for verifying operation of autonomous vehicle using quality control verification app. The method enables verifying the operation of the autonomous driving vehicle by using data directly confirmed by a user riding in the autonomous vehicle, so that monitoring of information related to verifying operation of an autonomous vehicle through a control display of a control system can be enabled. The drawing shows a schematic view of a control server that verifies the operation of an autonomous vehicle. 
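The (a)-(c) matching logic in this record, in which a rider's pass/fail report (first verification result) is compared against the reference-PVD match (second verification result) to yield the final third result, can be sketched as follows; the function and field names are illustrative assumptions, not the patent's API.

```python
def final_verification(first_result: bool, second_result: bool) -> dict:
    # first_result: pass/fail entered by the riding user via the QC verification app
    # second_result: whether the driving PVD matched the stored reference PVD
    matched = first_result == second_result
    return {
        'match': matched,
        # Claim 8: on a match the operation event is treated as verified;
        # on a mismatch all three verification results are reported and stored.
        'action': 'complete' if matched else 'report_and_store',
    }
```

Per claim 9, this check would be repeated for every verification section of the scenario.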
100Control server110Memory600Autonomous driving vehicle700Terminal900Database", "instruction": "Please summarize the input" }, { "input": "METHOD FOR PREVENTING POSSIBLE MALFUNCTIONS OF DCU OCCURING DURING AUTONOMOUS DRIVING BY REFERRING TO ADS USING OUTPUTS OF HETEROGENEOUS DCUS AND METHOD USING THE SAMEDisclosed is a method of supporting the analysis of the DCUs in order to prevent a misjudgment situation of the DCUs that may occur in autonomous driving by using an Anomaly Detection System (ADS) for a heterogeneous Domain Control Unit (DCU). That is, (a) the computing device operating in conjunction with the target vehicle that performs autonomous driving, a predetermined first to Nth time point corresponding to the autonomous driving state-N is an integer greater than or equal to 1-to the target vehicle. Allowing the mounted sensor module to acquire first situation information to Nth situation information about a situation around the target vehicle; (b) the computing device allows at least some of the first DCU to M-th DCU operating in conjunction with the computing device-M is an integer of 2 or more-which is one of the first context information to the N-th context information. Generating at least a part of K_1th determination information to K_Mth determination information with reference to K context information-K is an integer of 1 or more and N or less; And (c) the computing device causes the ADS to operate in conjunction with the K_1-th determination information to at least part of the first DCU to the M-th DCU with reference to at least a part of the K_1th determination information to the K_Mth determination information. At least a part of the first DCU to the M-th DCU can be analyzed by calculating the K-th determination match degree for and by causing an edge logger to tag and store the K-th situation information with reference to the K-th decision match degree. Disclosed is a method comprising the step of assisting to perform.|1. 
(a) A computing device that operates in conjunction with a target vehicle performing autonomous driving, a predetermined first point to an Nth point in time corresponding to the autonomous driving state-N is an integer greater than or equal to 1-a sensor mounted on the target vehicle for each Allowing the module to acquire first situation information to Nth situation information about a situation around the target vehicle;\n(b) the computing device allows at least some of the first DCU to M-th DCU operating in conjunction with the computing device-M is an integer of 2 or more-which is one of the first context information to the N-th context information. Generating at least a part of K_1th determination information to K_Mth determination information with reference to K context information-K is an integer of 1 or more and N or less; And (c) the computing device causes the ADS operating in conjunction therewith to refer to at least a portion of the K_1th determination information to the K_Mth determination information to determine at least a portion of the first DCU to the Mth DCU. Calculate the K-th decision match degree, and cause the edge logger to tag and store the K-th situation information with reference to the K-th decision match degree so that at least a portion of the first DCU to the M-th DCU can be analyzed. A method comprising the step of supporting. \n | 2. 
The method of claim 1, wherein in step (c), the computing device causes the ADS to apply a Dynamic Time Warping algorithm to at least a part of the K_1th determination information to the K_Mth determination information to calculate the degree of agreement of the K-th determination for at least some of the first DCU to the M-th DCU, wherein the algorithm operates on a first specific time series vector, which includes information on at least one of the K_1th determination information to the K_Mth determination information, and a second specific time series vector, which includes information on the other one of the K_1th determination information to the K_Mth determination information.\n | 3. The method of claim 1, further comprising the step of: (d) the computing device causing the edge logger, when tag information of the K-th context information indicates that at least a part of the matching degree of the K-th determination is less than a threshold value, to transmit the K-th situation information and the K_1th determination information to the K_Mth determination information to a predetermined log analysis system, and the log analysis system transmitting the K-th situation information and the K_1th determination information to the K_Mth determination information to the manager through a predetermined display device, thereby supporting the manager to analyze at least some of the processes of the first DCU to the M-th DCU at the K-th point in time corresponding to the K-th situation information.\n | 4. The method of claim 3, further comprising the step of: (e) when the computing device obtains analysis information about a problem in a specific DCU among the first DCU to the Mth DCU from the log analysis system, modifying the algorithm of the specific DCU with reference to the analysis information.\n | 5. 
The method of claim 1, wherein in the step (a), the computing device causes the sensor module including at least some of a camera, a radar, a lidar, a GPS, and a V2X communication module to cause the first situation information to the Nth situation. A method, characterized in that to obtain information.\n | 6. The method of claim 1, wherein at least some of the first DCU to the M-th DCU are implemented in the form of a neural network consisting of a plurality of layers each including a plurality of virtual neurons, and the other part is in the form of a rule-based algorithm. Implemented, wherein each of the first DCU to the Mth DCU outputs results according to different logics.\n | 7. The method of claim 1, wherein the step (b) comprises: at least one of K_1th determination information to K_Mth determination information generated from at least one preset main DCU among the first DCU to the Mth DCU. And transmitting one main determination information to an actuator of the target vehicle to support the target vehicle to perform the autonomous driving according to the main determination information.\n | 8. In a computing device that supports analysis of the DCUs to prevent a misjudgment situation of the DCUs that may occur in autonomous driving by using an Anomaly Detection System (ADS) for heterogeneous Domain Control Units (DCUs), instructions are provided. 
One or more memories to store; And one or more processors configured to perform the instructions, wherein the processor includes: (I) a predetermined first to Nth time point corresponding to the autonomous driving state-N is an integer greater than or equal to 1-each mounted on the target vehicle A process of causing the sensor module to acquire first situation information to Nth situation information about a situation around the target vehicle; (II) The first DCU to the M-th DCU operating in conjunction with the computing device-M is an integer greater than or equal to 2-allowing at least some of the K-th context information, which is one of the first context information to the N-th context information-K Is an integer equal to or greater than 1 and equal to or less than N; a process of generating at least some of the K_1th determination information to the K_Mth determination information with reference to; And (III) the ADS operating in conjunction therewith, with reference to at least a part of the K_1-th determination information to the K_M-th determination information, and the K-th determination match degree with respect to at least a portion of the first DCU to the M-th DCU. A process for supporting at least some of the first DCU to the M DCU to be analyzed by calculating and storing the K-th context information by tagging and storing the K-th situation information with reference to the K-th determination matching degree by an edge logger. Device characterized in that performing.\n | 9. 
The method of claim 8, wherein in the (III) process, the processor causes the ADS to apply a Dynamic Time Warping algorithm to at least a part of the K_1th determination information to the K_Mth determination information to calculate the K-th determination agreement for at least a portion of the first DCU to the M-th DCU, wherein the algorithm operates on a first specific time series vector, which includes information on at least one of the K_1th determination information to the K_Mth determination information, and a second specific time series vector, which includes information on the other one of the K_1-th determination information to the K_M-th determination information.\n | 10. The device of claim 8, wherein the processor further performs a (IV) process of, when the tag information of the K-th context information indicates that at least a part of the matching degree of the K-th determination is less than or equal to a threshold value, causing the edge logger to transmit the K-th situation information and the K_1-th determination information to the K_M-th determination information to the log analysis system, and the log analysis system transmitting the K-th situation information and the K_1-th determination information to the K_M-th determination information to the manager through a predetermined display device, thereby supporting the manager to analyze at least some of the processes of the first DCU to the M-th DCU at the K-th time point corresponding to the K-th situation information.\n | 11. The device of claim 10, wherein, when the processor, in a (V) process, obtains analysis information about a problem in a specific DCU among the first DCU to the Mth DCU from the log analysis system, the processor further performs a process of modifying the algorithm of the specific DCU with reference to the analysis information.\n | 12. 
The method of claim 8, wherein in the (I) process, the processor causes the sensor module including at least some of a camera, a radar, a lidar, a GPS, and a V2X communication module to cause the first situation information to the Nth situation information. Device, characterized in that to obtain.\n | 13. The method of claim 8, wherein at least some of the first DCU to the M-th DCU are implemented in the form of a neural network consisting of a plurality of layers each including a plurality of virtual neurons, and the other part is in the form of a rule-based algorithm. Embodied, wherein each of the first DCU to the Mth DCU outputs results according to different logics.\n | 14. The method of claim 8, wherein in the (II) process, the processor comprises at least one of K_1th determination information to K_Mth determination information generated from at least one preset main DCU among the first DCU to the Mth DCU. And transmitting the main determination information of the target vehicle to an actuator of the target vehicle to support the target vehicle to perform the autonomous driving according to the main determination information.", "output": "The method involves allowing a sensor module (210) mounted on the target vehicle to obtain first situation information to N-th situation information about a situation around the target vehicle where N is integer number. The portion of first domain control unit (DCU) to M DCU-M (220-1-220-M) is caused to operate in conjunction with the computing device to generate portion of the K-1 determination information to the K-M determination information where K is an integer of 1 or more and M is integer of 2 or more. The anomaly detection system (ADS) (130) is allowed to calculate a K decision match for portion of the first DCU to the M DCU with reference to portion of the K-1 determination information to the K-M determination information. 
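The K decision match that the ADS computes is described in the claims as a Dynamic Time Warping comparison between two determination-information time series from heterogeneous DCUs; a minimal textbook DTW distance, standing in for the equation that is not reproduced in this text, might look like:

```python
def dtw_distance(a, b):
    # Classic dynamic-programming DTW between two time series vectors,
    # e.g. determination information output by two heterogeneous DCUs.
    n, m = len(a), len(b)
    inf = float('inf')
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

A low distance indicates agreeing DCUs; a distance above a threshold would trigger the edge logger's tagging described in the surrounding claims.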
The edge logger (140) is supported to tag the K state information with reference to the K determination match so that portion of the first DCU to the M DCU is analyzed. An INDEPENDENT CLAIM is included for a device for preventing misjudgment situation of DCU occurring during autonomous driving of vehicle. Method for preventing misjudgment situation of DCU occurring during autonomous driving of vehicle on road. The misjudgment situation of DCU occurring during autonomous driving of vehicle is prevented effectively by using ADS for heterogeneous DCU. The problem of a specific DCU is checked to correct the algorithm of a specific DCU. The drawing shows a schematic diagram of the computing device. (Drawing includes non-English language text) 130ADS140Edge logger210Sensor module220-1-220-MDCU230Actuator", "instruction": "Please summarize the input" }, { "input": "SYSTEMS AND METHODOLOGY FOR VOICE AND/OR GESTURE COMMUNICATION WITH DEVICE HAVING V2X CAPABILITYA system includes a first communication module for receiving a user message, a processing unit for converting the user message to a vehicle-to-everything (V2X) message, and a second communication module. The first communication module, the processing unit, and the second communication modules are implemented in a first vehicle. The second communication module is configured to transmit the V2X message from the first vehicle via a wireless communication link. The first vehicle may be a drone configured to communicate with a user device positioned on or near a user, and the user message may be an audible message or user gestures. Alternatively, the first vehicle may be inhabited by the user, with the user message being an audible message. The system may enable communication with an autonomous vehicle or another device equipped with V2X capability.|1. 
A system comprising: \n * a first communication module configured to receive a user message;\n * a processing unit configured to convert the user message to a vehicle-to-everything (V2X) message; and\n * a second communication module, wherein the first communication module, the processing unit, and the second communication module are implemented in a first vehicle, and the second communication module is configured to transmit the V2X message from the first vehicle via a wireless communication link.\n | 2. The system of claim 1 wherein the wireless communication link is a first wireless communication link, and the system further comprises an electronic device configured to be positioned proximate a user, the electronic device including a third communication module, wherein the first and third communication modules are configured to enable a second wireless communication link between first vehicle and the electronic device for communication of the user message from the user to the first vehicle.\n | 3. 
The system of claim 2 wherein the first vehicle is an unmanned vehicle, and: \n * the electronic device comprises: \n * a first wearable structure configured to be positioned on the user, the first wearable structure including the third communication module, wherein the first and third communication modules are configured to enable the second wireless communication link between the unmanned vehicle and the first wearable structure; and\n * a second wearable structure configured to be positioned on the user, the second wearable structure being physically displaced away from the first wearable structure, the second wearable structure including a fourth communication module, wherein the first and fourth communication modules are configured to enable a third wireless communication link between the unmanned vehicle and the second wearable structure;\n * the processing unit implemented in the unmanned vehicle is further configured to determine a current location of the unmanned vehicle relative to the user in response to the second and third wireless communication links; and\n * the system further comprises a drive control unit in communication with the processing unit and configured to adjust a speed and a position of the unmanned vehicle to move the unmanned vehicle from the current location to a predefined location relative to the user.\n | 4. The system of claim 3 wherein the predefined location is included in the user message, the user message is an audible message from the user, and at least one of the first and second wearable structures comprises a microphone configured to capture the audible message from the user and at least one of the third and fourth communication modules is configured to communicate the audible message with the predefined location via at least one of the second and third communication links.\n | 5. 
The system of any preceding claim further comprising a camera implemented in the first vehicle and configured to capture motion of a user and provide visual information of the user to the processing unit, wherein the processing unit is further configured to determine the user message from the visual information.\n | 6. The system of any preceding claim wherein the first vehicle is an unmanned vehicle, and the system further comprises a camera implemented in the unmanned vehicle and configured to capture an ambient environment visible from the camera and provide visual information of the ambient environment to the user, and the user message is an audible message from the user responsive to the visual information.\n | 7. The system of any preceding claim wherein the user message is a first user message, the V2X message is a first V2X message, and: \n * the second communication module is further configured to receive a second V2X message via the first wireless communication link;\n * the processing unit is further configured to convert the second V2X message to a second user message for communication of the second user message from the first vehicle to the electronic device; and\n * the electronic device further comprises a speaker configured to output the second user message as an audible message to the user.\n | 8. The system of any preceding claim wherein a user is positioned in the first vehicle and the first communication module comprises a microphone for capturing the user message as an audible message from the user, wherein the processing unit is configured to convert the user message to the V2X message for transmission via the wireless communication link.\n | 9. 
The system of claim 8 wherein the V2X message is configured for transmission to a second vehicle having at least semi-autonomous motion capability, the user message includes a voice command from the user configured to influence navigation of the second vehicle, and the V2X message includes the voice command for commanding navigation of the second vehicle.\n | 10. The system of claim 8 or 9 wherein the user message is a first user message, the V2X message is a first V2X message, and: \n * the second communication module is further configured to receive a second V2X message via the wireless communication link;\n * the processing unit is further configured to convert the second V2X message to a second user message; and\n * the system further comprises a speaker implemented in the first vehicle configured to output the second user message as an audible message to the user.\n | 11. The system of any preceding claim wherein: \n * the first communication module is configured to implement a first wireless communication technology to enable receipt of the user message; and\n * the second communication module is configured to implement a second wireless communication technology to enable transmission of the V2X message, the second wireless communication technology differing from the first wireless communication technology.\n | 12. A method comprising: \n * receiving a user message at a first vehicle;\n * converting the user message to a vehicle-to-everything (V2X) message at the first vehicle; and\n * transmitting the V2X message from the first vehicle via a wireless communication link.\n | 13. The method of claim 12 wherein the wireless communication link is a first wireless communication link, and the method further comprises: \n * enabling a second wireless communication link between the first vehicle and an electronic device positioned proximate a user for communication of the user message from the user to the first vehicle.\n | 14. 
The method of claim 13 wherein the first vehicle is an unmanned vehicle, and the method further comprises: \n * positioning first and second wearable structures of the electronic device on the user, the first and second wearable structures being physically displaced away from one another;\n * enabling a second wireless communication link between the first wearable structure and the unmanned vehicle;\n * enabling a third wireless communication link between the second wearable structure and the unmanned vehicle;\n * determining a current location of the unmanned vehicle relative to the target in response to the second and third wireless communication links; and\n * adjusting a speed and a position of the unmanned vehicle to move the unmanned vehicle from the current location to a predefined location relative to the user.\n | 15. The method of claim 14 further comprising: \n * capturing an audible message from the user at the electronic device, the predefined location being included in the audible message; and\n * communicating the audible message with the predefined location via at least one of the second and third communication links.", "output": "The system comprises a first communication module to receive a user message. A processing unit is provided to convert the user message to a vehicle-to-everything message. The first communication module, the processing unit, and the second communication module are implemented in a first vehicle. The second communication module is provided for transmitting the V2X message from the first vehicle through a wireless communication link. A camera implemented in the vehicle is provided to capture motion of the user and provide visual information of the user to the processing unit. An INDEPENDENT CLAIM is included for a method for enabling communication between human users and vehicles. System for enabling communication between human users and vehicles. 
System for enabling communication between human users and vehicles having semi-autonomous motion capability and other devices equipped with V2X capability by conversion of user messages, e.g. voice and gesture, to vehicle-to-everything messages and vice versa. The drawing shows a schematic view of a system for enabling communication between human users and vehicles.22Electronic device 24Appropriate user 26On-board drone 30First wearable structure 41First location 
The method enables providing the passenger with the option of switching the autonomous vehicle from an autonomous mode to a partially or fully manual mode, so that the passenger can locate an acceptable feasible stopping place. The method enables providing autonomous driving capability to safely and reliably drive through a road environment to the goal position while avoiding vehicles, pedestrians, cyclists, and other obstacles and obeying rules of the road. The drawing shows a schematic view of a map. 10Vehicle100Selected stopping point102Defined goal position104Passenger132Obstacle
A computer-implemented method comprising: \nreceiving, by one or more processors, static map data for a region, wherein the static map data identifies one or more potential stopping places for a vehicle within the region; \nmaintaining, by the one or more processors, stored data indicative of one or more currently feasible stopping places for the vehicle within the region, wherein the one or more currently feasible stopping places are a subset of the one or more potential stopping places, and wherein at least one potential stopping place of the one or more potential stopping places is determined to be a currently feasible stopping place based on: \nan amount of time elapsed since the potential stopping place was determined to be infeasible for stopping exceeding a first threshold value, \na reason for the determination that the potential stopping place is infeasible for stopping, and \nat least one of a historical level of demand for parking in a vicinity of the potential stopping place being less than a second threshold value or traffic volume in the vicinity of the potential stopping place being less than a third threshold value; \nreceiving from one or more sensors or one or more other sources current signals representing perceptions of actual conditions at the one or more currently feasible stopping places; \nupdating, by the one or more processors, the stored data based on the perceptions of actual conditions to include one or more updated currently feasible stopping places; and \nexposing, by the one or more processors, the updated stored data to a process that selects a stopping place for the vehicle from among the one or more updated currently feasible stopping places. \n | 2. The method of claim 1 comprising: \ndiscretizing, by the one or more processors, the one or more potential stopping places as a finite number of points within the region. \n | 3. 
The method of claim 2 comprising: defining, by the one or more processors, the potential stopping place as a shape containing at least one of the points, the potential stopping place configured to accommodate a footprint of the vehicle.\n | 4. The method of claim 3 comprising: attributing, by the one or more processors, an orientation to the shape, the orientation corresponding to a direction of traffic flow at the potential stopping place.\n | 5. The method of claim 2 comprising: \ninitializing, by the one or more processors, the one or more potential stopping places as one or more stopping places expected to be feasible based on prior signals from the one or more sensors, the prior signals representing past perceptions of past actual conditions at some of the one or more potential stopping places. \n | 6. The method of claim 1 in which the one or more sensors comprise at least one sensor that is physically located on the vehicle.\n | 7. The method of claim 1 in which the one or more sensors comprise at least one sensor that is physically remote from the vehicle, \nwherein the at least one sensor is located inside a parking garage. \n | 8. The method of claim 1 in which the current signals received from the one or more sensors are received through vehicle-to-vehicle or vehicle-to-infrastructure communication.\n | 9. The method of claim 1 in which the one or more other sources comprise crowd-sourced data sources.\n | 10. The method of claim 1 in which the vehicle is part of a fleet of vehicles managed from a central server and the method comprises the server distributing information received from sensors at one of the vehicles to other vehicles of the fleet.\n | 11. 
An autonomous vehicle, comprising: \none or more processors; \none or more sensors; and \none or more data storage devices including instructions that when executed by the one or more processors, cause the autonomous vehicle to perform functions comprising: \nreceiving static map data for a region, wherein the static map data identifies one or more potential stopping places for a vehicle within the region; \nmaintaining stored data indicative of one or more currently feasible stopping places for the vehicle within the region, wherein the one or more currently feasible stopping places are a subset of the one or more potential stopping places, and wherein at least one potential stopping place of the one or more potential stopping places is determined to be a currently feasible stopping place based on: \nan amount of time elapsed since the potential stopping place was determined to be infeasible for stopping exceeding a first threshold value, \na reason for the determination that the potential stopping place is infeasible for stopping, and \nat least one of a historical level of demand for parking in a vicinity of the potential stopping place being less than a second threshold value or traffic volume in the vicinity of the potential stopping place being less than a third threshold value; \nreceiving from the one or more sensors or one or more other sources current signals representing perceptions of actual conditions at the one or more currently feasible stopping places; \nupdating the stored data based on the perceptions of actual conditions to include one or more updated currently feasible stopping places; and exposing the updated stored data to a process that selects a stopping place for the vehicle from among the one or more updated currently feasible stopping places. \n | 12. 
The autonomous vehicle of claim 11, wherein the functions comprise: \ninitializing the potential stopping places as all of the potential stopping places identified as part of the static map data for the region. \n | 13. The autonomous vehicle of claim 11, wherein the functions comprise: \ndiscretizing the potential stopping places as a finite number of points within the region corresponding to potential stopping places. \n | 14. The autonomous vehicle of claim 13, wherein the functions comprise: \ndefining a potential stopping place as a shape containing one of the points, the shape corresponding to a footprint of the vehicle. \n | 15. The autonomous vehicle of claim 14, wherein the functions comprise: \nattributing an orientation to the shape, the orientation corresponding to a direction of traffic flow. \n | 16. The autonomous vehicle of claim 13, wherein the functions comprise: \ninitializing the potential stopping places as potential stopping places expected to be feasible based on prior signals from the one or more sensors representing perceptions of actual conditions at one or more of the potential stopping places. \n | 17. The autonomous vehicle of claim 11 in which the current signals received from the one or more sensors are received through vehicle-to-vehicle or vehicle-to-infrastructure communication.\n | 18. 
A non-transitory computer readable medium storing instructions thereon that, when executed by one or more processors, cause the one or more processors to perform functions comprising: \nreceiving static map data for a region, wherein the static map data identifies one or more potential stopping places for a vehicle within the region; \nmaintaining stored data indicative of one or more currently feasible stopping places for the vehicle within the region, wherein the one or more currently feasible stopping places are a subset of the one or more potential stopping places, and wherein at least one potential stopping place of the one or more potential stopping places is determined to be a currently feasible stopping place based on: \nan amount of time elapsed since the potential stopping place was determined to be infeasible for stopping exceeding a first threshold value, \na reason for the determination that the potential stopping place is infeasible for stopping, and \nat least one of a historical level of demand for parking in a vicinity of the potential stopping place being less than a second threshold value or traffic volume in the vicinity of the potential stopping place being less than a third threshold value; \nreceiving from one or more sensors or one or more other sources current signals representing perceptions of actual conditions at the one or more currently feasible stopping places; \nupdating the stored data based on the perceptions of actual conditions to include one or more updated currently feasible stopping places; and \nexposing the updated stored data to a process that selects a stopping place for the vehicle from among the one or more updated currently feasible stopping places.", "output": "The computer-based method involves maintaining stored data indicative of potential stopping places that are currently feasible stopping places for a vehicle within a region. 
Current signals representing perceptions of actual conditions at several potential stopping places are received from the sensors (24) or several other sources. The stored data is updated based on changes in the perceptions of actual conditions. The updated stored data is exposed to a process that selects a stopping place for the vehicle from among the currently feasible stopping places. Method for identifying stopping places for autonomous vehicle. The information received from the device of the passenger includes an indication that the time spent searching for an acceptable stopping place is acceptable to the passenger. A set of elements or components located on an autonomous vehicle or at other locations enables the autonomous vehicle to operate. Stopping places that are closer to curbs are generally preferred as they allow the passenger to access the AV more easily. The drawing shows a block diagram of the method of identifying stopping places for autonomous vehicle. 10Autonomous vehicle12Road environment14Global position24Sensors34Data base", "instruction": "Please summarize the input" }, { "input": "INTERVENTION IN OPERATION OF A VEHICLE HAVING AUTONOMOUS DRIVING CAPABILITIESAmong other things, a determination is made that intervention in an operation of one or more autonomous driving capabilities of a vehicle is appropriate. Based on the determination, a person is enabled to provide information for an intervention. The intervention is caused in the operation of the one or more autonomous driving capabilities of the vehicle.|1. 
A vehicle comprising: \nat least one processor; and \na non-transitory computer-readable storage medium storing instructions which when executed by the at least one processor cause the at least one processor to: \noperate the vehicle in an autonomous mode; \nreceive a command using a vehicle-to-infrastructure (V2I) communication device of the vehicle, the command instructing the vehicle to maneuver to a goal location; \ndetermine that the vehicle is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location; and \nresponsive to determining that the vehicle is unable to convert the command into machine instructions, transmit a teleoperation request to a teleoperation server. \n | 2. The vehicle of claim 1, wherein the teleoperation request comprises a current location of the vehicle.\n | 3. The vehicle of claim 1, wherein the teleoperation request comprises one or more trajectory sampling points for the vehicle.\n | 4. The vehicle of claim 1, wherein maneuvering to the goal location comprises: \ntreating a current location of the vehicle as prior knowledge; and \nusing an inference algorithm to update the current location of the vehicle based on the command. \n | 5. The vehicle of claim 1, wherein maneuvering to the goal location comprises inferring a speed profile from the command.\n | 6. The vehicle of claim 1, wherein maneuvering to the goal location comprises inferring a steering angle from the command using a learning algorithm.\n | 7. The vehicle of claim 1, wherein converting the command into machine instructions comprises enabling, editing or disabling a hardware component or a software process.\n | 8. The vehicle of claim 1, wherein converting the command into machine instructions comprises overwriting a travel preference or a travel rule.\n | 9. 
The vehicle of claim 1, wherein converting the command into machine instructions comprises editing data comprising one or more of a map, sensor data in the vehicle or a related AV system, trajectory data in the vehicle or a related AV system, vision data in the vehicle or a related AV system, or any past data in the vehicle or a related AV system.\n | 10. A non-transitory computer-readable storage medium storing instructions, which when executed by one or more processors cause the one or more processors to: \noperate a vehicle in an autonomous mode; \nreceive a command using a vehicle-to-infrastructure (V2I) communication device of the vehicle, the command instructing the vehicle to maneuver to a goal location; \ndetermine that the vehicle is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location; and \nresponsive to determining that the vehicle is unable to convert the command into machine instructions, transmit a teleoperation request to a teleoperation server. \n | 11. The non-transitory computer-readable storage medium of claim 10, wherein the teleoperation request comprises a current location of the vehicle.\n | 12. The non-transitory computer-readable storage medium of claim 10, wherein the teleoperation request comprises one or more trajectory sampling points for the vehicle.\n | 13. The non-transitory computer-readable storage medium of claim 10, wherein maneuvering to the goal location comprises: \ntreating a current location of the vehicle as prior knowledge; and \nusing an inference algorithm to update the current location of the vehicle based on the command. \n | 14. The non-transitory computer-readable storage medium of claim 10, wherein maneuvering to the goal location comprises inferring a speed profile from the command.\n | 15. 
The non-transitory computer-readable storage medium of claim 10, wherein maneuvering to the goal location comprises inferring a steering angle from the command using a learning algorithm.\n | 16. The non-transitory computer-readable storage medium of claim 10, wherein converting the command into machine instructions comprises enabling, editing or disabling a hardware component or a software process.\n | 17. The non-transitory computer-readable storage medium of claim 10, wherein converting the command into machine instructions comprises overwriting a travel preference or a travel rule.\n | 18. The non-transitory computer-readable storage medium of claim 10, wherein converting the command into machine instructions comprises editing data comprising one or more of a map, sensor data in the vehicle or a related AV system, trajectory data in the vehicle or a related AV system, vision data in the vehicle or a related AV system, or any past data in the vehicle or a related AV system.\n | 19. A method comprising: \noperating, by one or more processors, a vehicle in an autonomous mode; \nreceiving, by the one or more processors, a command using a vehicle-to-infrastructure (V2I) communication device of the vehicle, the command instructing the vehicle to maneuver to a goal location; \ndetermining, by the one or more processors, that the vehicle is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location; and \nresponsive to determining that the vehicle is unable to convert the command into machine instructions, transmitting, by the one or more processors, a teleoperation request to a teleoperation server. \n | 20. The method of claim 19, wherein the teleoperation request comprises a current location of the vehicle.", "output": "The vehicle (10) comprises one processor. A non-transitory computer-readable storage medium stores instructions which are executed by the one processor. 
A command is received using a vehicle-to-infrastructure (V2I) communication device of the vehicle. The command instructs the vehicle to maneuver to a goal location. Determines that the vehicle is unable to convert the command into machine instructions to operate the vehicle to maneuver to the goal location. A teleoperation request is transmitted to a teleoperation server. The teleoperation request provides a current location of the vehicle and multiple trajectory sampling points for the vehicle. INDEPENDENT CLAIMS are included for the following:a non-transitory computer-readable storage medium storing instructions; anda method involves operating a vehicle in an autonomous mode. Vehicle. Vehicle ensures the resulting transition exhibits smooth and gradual changes in driving orientations. The drawing shows a block diagram of the AV system. 10Vehicle24Sensor28Communication devices40Computing device42Processor44Interface devices", "instruction": "Please summarize the input" }, { "input": "V2V latency measurement reporting to traffic server for optimizing the inter vehicle distance for self-driving carsMethods and apparatus, including computer program products, are provided for autonomous vehicles. In one aspect there is provided a method. The method may include detecting, at an autonomous vehicle, at least one vehicle within a certain range of the autonomous vehicle; measuring a latency representative of a time to communicate via a wireless link to the at least one detected vehicle; reporting the measured latency to the network; and receiving, by the autonomous vehicle, information to enable the autonomous vehicle to determine an intervehicle distance for configuration at the autonomous vehicle. Related apparatus, systems, methods, and articles are also described.What is claimed: \n | 1. 
A method, comprising: \ndetecting, at an autonomous vehicle, at least one vehicle within a certain range of the autonomous vehicle; \nmeasuring a latency representative of a time for the at least one detected vehicle to respond to a message sent by the autonomous vehicle via a wireless link; \nreporting, to a network, the measured latency; and \nreceiving, at the autonomous vehicle, information from the network, the information including an intervehicle distance for configuration at the autonomous vehicle, the intervehicle distance being determined at the network, and the intervehicle distance being determined based at least on the measured latency reported to the network. \n | 2. The method of claim 1, wherein the measuring of the latency includes sending, by the autonomous vehicle, the message to the at least one detected vehicle, and wherein the latency is determined based at least on a first time when the autonomous vehicle sent the message and a second time when the autonomous vehicle receives, from the at least one detected vehicle, a response to the message.\n | 3. The method of claim 2, further comprising: \nin response to the at least one detected vehicle failing to respond to the message within a threshold quantity of time, reporting, to the network, an indication that the at least one detected vehicle is non-autonomous, the intervehicle distance being determined further based on the reported indication. \n | 4. The method of claim 1, wherein the information includes a value representative of the intervehicle distance for configuration at the autonomous vehicle.\n | 5. The method of claim 1, further comprising: \nconfiguring, by the autonomous vehicle, operation based on the intervehicle distance. \n | 6. The method of claim 1, wherein the intervehicle distance represents a minimum and/or an optimum distance between the autonomous vehicle and the at least one vehicle.\n | 7. 
An apparatus, comprising: \nat least one processor; and \nat least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: \ndetect, at the apparatus, at least one vehicle within a certain range of the apparatus; \nmeasure a latency representative of a time for the at least one detected vehicle to respond to a message sent by the apparatus via a wireless link; \nreport, to a network, the measured latency; and \nreceive, at the apparatus, information from the network, the information including an intervehicle distance for configuration at the apparatus, the intervehicle distance being determined at the network, and the intervehicle distance being determined based at least on the measured latency reported to the network. \n | 8. The apparatus of claim 7, wherein the apparatus measures the latency by at least sending, to the at least one detected vehicle, the message.\n | 9. The apparatus of claim 8, wherein the latency is determined based at least on a first time when the apparatus sent the message and a second time when the apparatus receives, from the at least one detected vehicle, a response to the message.\n | 10. The apparatus of claim 9, wherein the apparatus is further configured to at least: \nin response to the at least one detected vehicle failing to respond to the message within a threshold quantity of time, report, to the network, an indication that the at least one detected vehicle is non-autonomous, the intervehicle distance being determined further based on the reported indication. \n | 11. The apparatus of claim 8, wherein the apparatus reports, to the network, the measured latency in response to receiving the response from the at least one detected vehicle.\n | 12. 
The apparatus of claim 7, wherein the received information includes a value representative of the intervehicle distance for configuration at the apparatus.\n | 13. The apparatus of claim 7, wherein the apparatus is further configured to at least configure, based on the intervehicle distance, an operation of the apparatus.\n | 14. The apparatus of claim 7, wherein the intervehicle distance represents a minimum and/or an optimum distance between the apparatus and the at least one detected vehicle.\n | 15. An apparatus, comprising: \nat least one processor; and \nat least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: \nreceive, at the apparatus, a latency measurement representative of a time for at least one vehicle to respond to a message sent by an autonomous vehicle, via a wireless link, the at least one vehicle detected at the autonomous vehicle to be within a certain range of the autonomous vehicle, the latency measurement being determined by the autonomous vehicle; \ndetermine, based at least on the received latency measurement, an intervehicle distance; and \nsend, to the autonomous vehicle, information including the intervehicle distance for configuration at the autonomous vehicle. \n | 16. The apparatus of claim 15, wherein the intervehicle distance is determined further based on an indication that the at least one detected vehicle is non-autonomous.\n | 17. The apparatus of claim 16, wherein the apparatus is further configured to at least receive, from the autonomous vehicle, the indication that the at least one detected vehicle is non-autonomous, and wherein the autonomous vehicle sends the indication in response to the at least one detected vehicle failing to respond to a message from the autonomous vehicle within a threshold quantity of time.\n | 18. 
The apparatus of claim 15, wherein the intervehicle distance is determined further based on a road condition, a weather condition, a characteristic of the autonomous vehicle, and/or a characteristic of the at least one detected vehicle.\n | 19. The apparatus of claim 15, wherein the information includes a value representative of the intervehicle distance for configuration at the autonomous vehicle.", "output": "The method (400) involves detecting (405) a vehicle within a certain range of an autonomous vehicle and measuring (415) a latency representative of a time to communicate through a wireless link to the detected vehicle. The measured latency is reported (420) to the network. The information is received (425) to enable the autonomous vehicle to determine an inter-vehicle distance for configuration at the autonomous vehicle. INDEPENDENT CLAIMS are included for the following:an apparatus for controlling autonomous vehicles; anda non-transitory computer-readable storage medium with program code for controlling autonomous vehicles. Method for controlling autonomous vehicles. The information is received to enable the autonomous vehicle to determine an inter-vehicle distance for configuration at the autonomous vehicle, thus traffic congestion is alleviated while improving road safety. The drawing shows a flowchart of a process for latency measurement reporting. 400Autonomous vehicle controlling method405Detecting a vehicle415Measuring a latency420Reporting the measured latency425Receiving the information", "instruction": "Please summarize the input" }, { "input": "Positioning system based on geofencing frameworkThis provides methods and systems for the global navigation satellite system (GNSS) combined with the dead-reckoning (DR) technique, which is expected to provide a vehicle positioning solution, but it may contain an unacceptable amount of error due to multiple causes, e.g., atmospheric effects, clock timing, and multipath effect. 
Particularly, the multipath effect is a major issue in the urban canyons. This invention overcomes these and other issues in the DR solution by a geofencing framework based on road geometry information and multiple supplemental kinematic filters. It guarantees road-level accuracy and enables certain V2X applications which do not require sub-meter accuracy, e.g., signal phase timing, intersection movement assist, curve speed warning, reduced speed zone warning, and red-light violation warning. Automated vehicle is another use case. This is used for autonomous cars and vehicle safety, shown with various examples/variations.The invention claimed is: \n | 1. A method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle, said method implemented by one or more processors, said method comprising: \nreceiving vehicle states for said vehicle, said vehicle states including at least data of a position, a speed, a heading angle and a yaw rate of said vehicle, said yaw rate having a yaw rate bias; \nremoving said yaw rate bias of said yaw rate; \nresponsive to removing said yaw rate bias of said yaw rate, determining whether a reference road exists, said reference road providing data of at least a road heading angle and a road curvature; \nin case said reference road exists, determining whether said existing reference road is valid; \nin case said reference road does not exist or said existing reference road is invalid, searching for said reference road; \nin case said reference road is found based on said search or said existing reference road is valid, determining whether a lane change is detected for said vehicle; \nin case said lane change is detected for said vehicle, performing retrospective integrations of said speed and yaw rate for said vehicle; \ndetermining a reference yaw rate based on said road curvature and said speed of said vehicle; \ndetermining whether a yaw rate error between said yaw rate and said reference yaw
rate is less than a yaw rate threshold; \nin case said yaw rate error is less than said yaw rate threshold, forcing said heading angle of said vehicle to said road heading angle; \nupdating said vehicle states; \ndetermining geofencing conditions of said position, speed, heading angle and yaw rate of said vehicle; \ndetermining whether said geofencing conditions are met; \nin case said geofencing conditions are met, applying geofencing to limit said position of said vehicle between road boundaries of said reference road; \nupdating said data of said vehicle's position; and \noutputting said data of said vehicle's position to an upper layer of said road vehicle navigation system for said vehicle. \n | 2. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said road vehicle navigation system works with or communicates with a global navigation satellite system.\n | 3. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said vehicle is interior to said reference road.\n | 4. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein a distance from said vehicle to a next intersection is greater than a first threshold.\n | 5. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said vehicle's speed is greater than a second threshold.\n | 6. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein applying geofencing comprises: timely geofencing.\n | 7. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein applying geofencing comprises: predicted geofencing.\n | 8. 
The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining an incorrect position of said vehicle and correcting the determined position for said vehicle based on reducing a lateral error.\n | 9. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein in case said reference road does not exist or said existing reference road is invalid, said search for said reference road comprises: \ndetermining candidate reference roads where said vehicle's position is interior to end points of said candidate reference roads; \ndetermining, from said candidate reference roads, a candidate reference road satisfying a heading error below a threshold; and \nadjusting an order of said end points to be consistent with a travel direction of said vehicle. \n | 10. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a lateral error for said vehicle.\n | 11. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: in case said reference road is not found based on said search, outputting said data of said vehicle's position to said upper layer of said road vehicle navigation system for said vehicle.\n | 12. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a longitudinal error for said vehicle.\n | 13. 
The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a predicted position for said vehicle based on at least one of lateral correction and longitudinal correction.\n | 14. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining an average value of said yaw rate bias within a moving time window, and correcting said yaw rate bias based on said determined average value.\n | 15. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining a sensor temperature, and determining said yaw rate bias that varies with said sensor temperature.\n | 16. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: determining vibration or noise for removing said yaw rate bias.\n | 17. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying a security layer for said road vehicle navigation system for said vehicle.\n | 18. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying an application layer for said road vehicle navigation system for said vehicle.\n | 19. The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying a network layer for said road vehicle navigation system for said vehicle.\n | 20. 
The method for positioning based on geofencing framework for a road vehicle navigation system for a vehicle as recited in claim 1, wherein said method comprises: applying a physical layer for said road vehicle navigation system for said vehicle.", "output": "The method involves determining, by a processor, whether a yaw rate error is small. The vehicle's heading angle is forced to the road heading angle when the yaw rate error is small. Vehicle states are updated by the processor. Geo-fencing conditions are evaluated by the processor. A determination is made whether the geo-fencing conditions are met. Geo-fencing for the vehicle is applied in case the geo-fencing conditions are met. The vehicle's position data is updated. The vehicle's position data is outputted to an upper layer of a road vehicle navigation system for the vehicle. Method for positioning a vehicle, i.e. an autonomous car, based on a geo-fencing framework for a road vehicle navigation system for safety. The method enables identifying vehicle movements accurately and generating better results in a shorter time period. The method guarantees road-level accuracy and supports vehicle-to-everything (V2X) applications that do not require sub-meter accuracy, such as signal phase timing, intersection movement assist, curve speed warning, reduced speed zone warning and red-light violation warning. The method provides a reference road, i.e. a line or curve connecting two adjacent intersections, to correct the vehicle position and heading, thus providing the necessary information for the coordinates of the end points of the reference road, the road heading angle, the curvature and the road width. 
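The heading-correction and geofencing steps summarized above can be sketched in code. This is a minimal illustration assuming a simple lateral-offset road geometry; every name, signature, and threshold below is a hypothetical assumption, not taken from the patent:

```python
def geofence_update(pos_lat, speed, heading, yaw_rate,
                    road_heading, road_curvature, half_width,
                    yaw_thresh=0.02):
    """One update step of the claimed correction (all values hypothetical).

    pos_lat        : lateral offset from the road centerline (m)
    road_curvature : 1/radius of the reference road (1/m)
    half_width     : half of the road width (m)
    """
    # Reference yaw rate from road curvature and vehicle speed (claim 1).
    ref_yaw = road_curvature * speed
    # If the yaw-rate error is below the threshold, force the vehicle
    # heading to the road heading angle.
    if abs(yaw_rate - ref_yaw) < yaw_thresh:
        heading = road_heading
    # Geofencing: limit the position between the road boundaries.
    pos_lat = max(-half_width, min(half_width, pos_lat))
    return pos_lat, heading

# A drifting estimate 2.1 m left of center on a straight road is
# clamped back between the road boundaries:
lat, hdg = geofence_update(2.1, 20.0, 0.05, 0.001,
                           road_heading=0.0, road_curvature=0.0,
                           half_width=1.8)
```

The clamp is the "timely geofencing" idea of claim 6 in its simplest form; a fuller implementation would also handle the reference-road search of claim 9.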
The method enables a weighted-averaging process based on redundancies between the coverage of different units: data are weighted-averaged for accurate results, with more weight given to more reliable units or sources, or higher weight to results that are closer to the center of the curve representing the distribution of values, thus eliminating or reducing fringe results or erroneous data. The drawing shows a flow diagram illustrating a development of fully automated vehicles.", "instruction": "Please summarize the input" }, { "input": "DRIVE CONTROL METHOD AND DRIVE CONTROL DEVICE A drive control method uses a drive control device to control the operation of a host vehicle using at least two autonomous driving modes that have different levels of driving assistance. The drive control method includes shifting the autonomous driving mode from a first mode to a second mode in which the driving assistance level of the second mode is higher than the driving assistance level of the first mode upon detecting a preceding vehicle in front of a host vehicle while traveling in the first mode. In this drive control method, a detectable distance to the preceding vehicle for shifting to the second mode is greater than a followable distance to the preceding vehicle when following travel is permitted in the first mode.|1. A drive control method having at least two autonomous driving modes having different driving assistance levels, the drive control method comprising: \nshifting the autonomous driving mode from a first mode to a second mode in which the driving assistance level of the second mode is higher than the driving assistance level of the first mode upon detecting a preceding vehicle traveling in front of the host vehicle while traveling in the first mode, wherein \na detectable distance to the preceding vehicle for shifting to the second mode is greater than a followable distance to the preceding vehicle when following travel is permitted in the first mode. \n | 2. 
The drive control method according to claim 1, further comprising \ncalculating a reliability of the preceding vehicle using the drive control device based on a behavior of the preceding vehicle upon detecting the preceding vehicle while the operation of the host vehicle is controlled using the first mode, and \nthe drive control device not shifting the autonomous driving mode to the second mode upon determining the reliability of the preceding vehicle is less than a predetermined defined value. \n | 3. The drive control method according to claim 2, wherein \nthe drive control device maintains the autonomous driving mode in the first mode upon determining the reliability of the preceding vehicle is less than the defined value. \n | 4. The drive control method according to claim 2, wherein \nthe calculating of the reliability of the preceding vehicle using the drive control device is based on at least one of a lateral displacement amount of the preceding vehicle, a frequency of acceleration or deceleration, and a frequency of an illumination of brake lights. \n | 5. (canceled)\n | 6. The drive control method according to claim 1, wherein \nan upper limit distance of the followable distance is a distance at which the host vehicle and the preceding vehicle can carry out vehicle-to-vehicle communication. \n | 7. The drive control method according to claim 1, wherein \nthe drive control device detects another vehicle as a preceding vehicle traveling in front of the host vehicle when travel history information about the other vehicle is received and the travel history information that is received includes information indicating that the other vehicle was traveling at a point in front of the host vehicle within a prescribed period of time. \n | 8. 
The drive control method according to claim 1, wherein \nthe drive control device does not shift the autonomous driving mode to the second mode when the host vehicle is traveling in the first mode and a vehicle speed of the host vehicle is greater than or equal to a prescribed speed. \n | 9. The drive control method according to claim 1, wherein \nthe first mode is an autonomous driving mode that requires a driver to visually monitor the surrounding conditions of the host vehicle, and \nthe second mode is an autonomous driving mode in which the drive control device executes monitoring of the surrounding conditions of the host vehicle. \n | 10. The drive control method according to claim 1, wherein \nthe first mode is a hands-on mode in which steering control by the control device does not operate when the driver is not holding the steering wheel, and \nthe second mode is a hands-off mode in which steering control by the drive control device operates even if the driver's hands leave the steering wheel. \n | 11. The drive control method according to claim 1, wherein \nanother vehicle is excluded as a preceding vehicle when a ride height of the other vehicle traveling in front of the host vehicle is greater than a ride height of the host vehicle. \n | 12. The drive control method according to claim 1, wherein \nanother vehicle is excluded as a preceding vehicle when the other vehicle traveling in front of the host vehicle is a two-wheeled vehicle. \n | 13. 
The drive control method according to claim 1, wherein \nthe drive control device is configured to \nshift the autonomous driving mode from the first mode to the second mode when a first preceding vehicle is present as the preceding vehicle in a first lane in which the host vehicle travels while the operation of the host vehicle is controlled using the first mode, \ndetermine whether or not the first preceding vehicle and a second preceding vehicle are traveling in the first lane upon detecting a second preceding vehicle traveling in front of the first preceding vehicle is also present in the first lane, \ncause the host vehicle to travel behind the second preceding vehicle upon determining that the first preceding vehicle has changed lanes to another lane that is different from the first lane and that the second preceding vehicle continues to travel in the first lane, and \ncause the vehicle to travel behind the first preceding vehicle and to change lanes to the other lane upon determining that the first preceding vehicle and the second preceding vehicle changed lanes to the other lane. \n | 14. 
A drive control device comprising: \na control unit configured to control an operation of a host vehicle using at least two autonomous driving modes including a first mode and a second mode that has a driving assistance level that is higher than that of the first mode; and \na preceding vehicle detection unit configured to detect a preceding vehicle traveling in front of the host vehicle, \nthe control unit being configured to shift the autonomous driving mode from the first mode to the second mode when the operation of the host vehicle is controlled using the first mode and the preceding vehicle detection unit detects the preceding vehicle, wherein \nthe control unit is configured to use a detectable distance to the preceding vehicle for shifting to the second mode that is greater than a followable distance to the preceding vehicle when following travel is permitted in the first mode.", "output": "The method involves controlling the driving of the host vehicle, using an operation-control apparatus, with automatic driving modes that differ in driving assistance level. The automatic driving modes comprise a first mode and a second mode whose driving assistance level is higher than that of the first mode. While the operation-control apparatus is controlling the driving of the host vehicle in the first mode, the automatic driving mode is changed from the first mode to the second mode when a preceding vehicle driving in front of the host vehicle is detected. An INDEPENDENT CLAIM is included for an operation-control apparatus. Operation-control method of a vehicle. The apparatus is effective in increasing the range of environments in which the host vehicle can be driven in an automatic driving mode with a relatively high driving assistance level. The operation-control apparatus can keep the automatic driving mode in the first mode instead of shifting to the second mode when the reliability of the preceding vehicle is less than a defined regulation value. 
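The mode-shift decision described above (detect a preceding vehicle within the detectable distance, check its behavior-based reliability against the regulation value, and respect the prescribed speed limit) can be sketched as follows; the function name and all threshold values are hypothetical assumptions, not from the patent:

```python
def should_shift_to_second_mode(preceding_dist, reliability, host_speed,
                                detectable_dist=200.0, defined_value=0.5,
                                prescribed_speed=33.3):
    """Sketch of the claimed mode-shift decision (thresholds hypothetical).

    Shifts from the first (hands-on) mode to the second (hands-off) mode
    only when a preceding vehicle is detected within the detectable
    distance, its reliability is high enough (claim 2), and the host
    speed is below the prescribed speed (claim 8). Distances in metres,
    speed in m/s, reliability in [0, 1].
    """
    if preceding_dist is None or preceding_dist > detectable_dist:
        return False   # no preceding vehicle detected
    if reliability < defined_value:
        return False   # claim 2: low reliability blocks the shift
    if host_speed >= prescribed_speed:
        return False   # claim 8: too fast to shift
    return True
```

Note the key relationship of claim 1: `detectable_dist` for triggering the shift is deliberately larger than the following distance used once follow travel is under way.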
The drawing shows a flowchart illustrating the operation-control process. (Drawing includes non-English language text) S1: Step for determining whether the automatic driving mode of the host vehicle is the first mode; S2: Step for determining whether the vehicle speed of the host vehicle is more than a prescribed speed; S3: Step for determining whether a preceding vehicle driving in front of the host vehicle is detected by the preceding vehicle detection unit; S4: Step for calculating the reliability of the preceding vehicle; S5: Step for determining the previously defined regulation value against which the reliability of the preceding vehicle is compared", "instruction": "Please summarize the input" }, { "input": "The automatic operating method and automatic controller of a vehicle According to the present invention, whether congestion is detected at a predetermined distance ahead of a host vehicle on a travel route of the host vehicle is determined during travel by automatic driving in which the vehicle speed reaches a target vehicle speed, and, upon congestion detection in which congestion is determined, a vehicle speed VSP is decreased to be lower than a target vehicle speed VSPt0 of automatic driving in a normal condition in which congestion is not detected.|1. An autonomous driving method of a vehicle that controls a vehicle equipped with an internal combustion engine as a drive source, comprising: \n when traveling by autonomous driving that brings the vehicle speed close to a target vehicle speed, \n determining whether traffic congestion present more than a predetermined distance ahead of the vehicle has been detected on the driving route of the vehicle; \n when the traffic congestion is detected, setting the target vehicle speed of the autonomous driving lower than at a normal time other than the time of the traffic congestion detection; \n controlling the vehicle speed of the vehicle to the target vehicle speed; and \n causing the vehicle to approach the traffic congestion. 
\n when the vehicle approaches the traffic congestion within a distance shorter than the predetermined distance, stopping the supply of fuel to the internal combustion engine. \n | 2. The autonomous driving method of the vehicle according to claim 1, wherein \n the predetermined distance is longer than the inter-vehicle distance to a preceding vehicle that is set during follow-up driving in which the vehicle follows a preceding vehicle ahead. \n | 3. The autonomous driving method of the vehicle according to claim 1, wherein \n the predetermined distance is longer than the detectable distance of an on-board sensor mounted so as to be able to recognize a preceding vehicle ahead of the vehicle. \n | 4. The autonomous driving method of the vehicle according to any one of claims 1 to 3, wherein \n the predetermined distance is extended as the present target vehicle speed becomes higher, lengthening the distance traveled at the target vehicle speed of the autonomous driving that is lower than at the normal time. \n | 5. The autonomous driving method of the vehicle according to any one of claims 1 to 4, wherein \n the predetermined distance is changed according to an attribute of the road on the driving route. \n | 6. The autonomous driving method of the vehicle according to any one of claims 1 to 5, wherein \n at the time of the traffic congestion detection, the target vehicle speed is gradually reduced from the present target vehicle speed. \n | 7. 
The autonomous driving method of the vehicle according to any one of claims 1 to 6, wherein \n at the time of a traffic congestion detection made while the present target vehicle speed is higher than the optimal fuel-consumption vehicle speed of the host vehicle, the target vehicle speed is reduced toward the optimal fuel-consumption vehicle speed. \n | 8. The autonomous driving method of the vehicle according to any one of claims 1 to 7, wherein \n the traffic congestion is detected based on VICS information, vehicle-to-vehicle communication information, road-to-vehicle communication information, or road traffic information from a portable terminal. \n | 9. The autonomous driving method of the vehicle according to any one of claims 1 to 8, wherein \n a changing state of the traffic congestion is estimated, and \n the deceleration used when reducing the vehicle speed is changed according to the prediction result of the changing state. \n | 10. 
An autonomous driving method of a vehicle that controls a vehicle equipped with an internal combustion engine as a drive source, comprising: \n determining whether traffic congestion has been detected on the driving route of the host vehicle during travel by autonomous driving that brings the vehicle speed close to a target vehicle speed; \n at the time of the traffic congestion detection in which the traffic congestion is detected, setting the target vehicle speed of the autonomous driving lower than at a normal time other than the time of the traffic congestion detection; \n controlling the vehicle speed of the host vehicle to the target vehicle speed from a point more than a predetermined distance before the tail end of the traffic congestion; \n causing the host vehicle to approach the traffic congestion; and \n stopping the supply of fuel to the internal combustion engine when the host vehicle approaches the traffic congestion within a distance shorter than the predetermined distance. \n | 11. 
An automatic driving device of a vehicle that controls a vehicle equipped with an internal combustion engine as a drive source, comprising: \n a driving state detection part that detects a driving state of the vehicle; \n an operation-control part that, based on the driving state of the vehicle, sets a control parameter for the autonomous driving that brings the vehicle speed close to a target vehicle speed; and \n a traffic congestion detection part that detects traffic congestion on the driving route of the host vehicle during travel by the autonomous driving, \n wherein the operation-control part sets the control parameter both at the time of a traffic congestion detection, in which the traffic congestion detection part has detected traffic congestion more than a predetermined distance ahead of the vehicle, and at a normal time otherwise, \n sets a first control parameter at the normal time, \n sets, at the time of the traffic congestion detection, a second control parameter that reduces the vehicle speed by making the target vehicle speed of the autonomous driving lower than at the normal time, \n controls the vehicle speed of the host vehicle to the target vehicle speed using the second control parameter, \n causes the host vehicle to approach the traffic congestion, and \n stops the supply of fuel to the internal combustion engine when the host vehicle approaches the traffic congestion within a distance shorter than the predetermined distance. \n | 12. 
An autonomous driving method of a vehicle in which it is determined whether traffic congestion has been detected more than a predetermined distance ahead of the vehicle on the driving route of the host vehicle during travel by autonomous driving that brings the vehicle speed close to a target vehicle speed,\n at the time of the traffic congestion detection in which the traffic congestion is detected, the target vehicle speed of the autonomous driving is always set lower than at a normal time other than the time of the traffic congestion detection, \n the vehicle speed of the host vehicle is controlled to the target vehicle speed, and \n the host vehicle is caused to approach the traffic congestion. \n | 13. An autonomous driving method of a vehicle in which it is determined whether a traffic congestion detection has occurred in which traffic congestion more than a predetermined distance ahead of the vehicle is detected on the driving route of the host vehicle during travel by autonomous driving that brings the vehicle speed close to a target vehicle speed, and \n at the time of a traffic congestion detection made while traveling at a vehicle speed higher than the optimal fuel-consumption vehicle speed of the host vehicle, the vehicle speed of the host vehicle is always reduced toward the optimal fuel-consumption vehicle speed in response to the determination that the traffic congestion was detected.", "output": "The method involves determining whether or not a traffic jam has been detected more than a predetermined distance ahead on the traveling route of the host vehicle during travel by automatic driving that brings the vehicle speed close to the target vehicle speed. When a traffic jam is detected, the vehicle speed is reduced below the target vehicle speed used for automatic driving at the normal time when no traffic jam is detected. 
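The claimed congestion-approach behavior (lower the target speed toward the optimal fuel-consumption speed once congestion is detected beyond the predetermined distance, then cut fuel once the vehicle is closer than that distance) can be sketched as below; the distances, speeds, and names are illustrative assumptions only:

```python
def congestion_speed_plan(dist_to_jam, target_speed, eco_speed=80.0,
                          detect_dist=2000.0, fuel_cut_dist=300.0):
    """Sketch of one planning step of the claimed method (values hypothetical).

    dist_to_jam : distance to the congestion tail end (m), None if no jam
    speeds in km/h; returns (new_target_speed, fuel_cut).
    """
    fuel_cut = False
    if dist_to_jam is not None and dist_to_jam <= detect_dist:
        # Congestion detected ahead: lower the target speed, aiming at
        # the optimal fuel-consumption speed when above it (claim 7).
        if target_speed > eco_speed:
            target_speed = eco_speed
        if dist_to_jam <= fuel_cut_dist:
            # Close to the jam tail: stop fuel supply and coast (claim 1).
            fuel_cut = True
    return target_speed, fuel_cut
```

Approaching the jam at the lowered target speed lengthens the coasting phase in which fuel supply is stopped, which is the fuel-economy mechanism the abstract describes.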
An INDEPENDENT CLAIM is included for an automatic controller of a vehicle. Automatic operating method of a vehicle. The vehicle speed is reduced before the tail end of the traffic jam is reached, and fuel consumption over the entire automatic drive is improved. The drawing shows a graphical view of the change of the vehicle speed. (Drawing includes non-English language text)", "instruction": "Please summarize the input" }, { "input": "VEHICLE CONTROL METHOD AND CONTROL DEVICE A method for controlling a vehicle in which, when a drive source stop condition is established while a vehicle is traveling, a sailing stop control is executed in which a drive source of the vehicle is stopped, an engaging element provided between the drive source and drive wheels is released, and the vehicle travels under inertia, wherein information on a status of a road on which a host vehicle is to travel is acquired; a determination is made based on the information as to whether there is a section on a route where the sailing stop control can be executed; when a section where the sailing stop control can be executed is present, a power shortage amount, which is a shortage in an amount of power during the sailing stop control, is estimated based on the information; and a battery is charged with power equivalent to the power shortage prior to starting the sailing stop control.|1. 
A vehicle control method in which, when a drive source stop condition is established while a vehicle is traveling, a sailing stop control is executed in which a drive source of the vehicle is stopped, an engaging element provided between the drive source and a drive wheel is released, and the vehicle travels under inertia, the vehicle control method comprising: \n * acquiring (S10) information on a status of a road on which the host vehicle will travel;\n * predicting (S20) whether there is a section on a route where the sailing stop control can be executed based on the information;\n * upon predicting that the section is present where the sailing stop control can be executed, estimating (S30) a power shortage amount, which is a shortage in an amount of power during the sailing stop control, based on the information; and\n * charging a battery with power required to cover the power shortage amount prior to starting the sailing stop control,\n * wherein\n * the power shortage amount is estimated as being larger when autonomous driving is performed in the section where the sailing stop control is executed as compared to when driver-enabled driving is performed in the section and\n * the information includes route information, map information acquired by a navigation system, a travel history of the vehicle, and other information acquired through road-to-vehicle communication and/or vehicle-to-vehicle communication.\n | 2. The vehicle control method according to claim 1, further comprising \n * estimating a frequency and an amount of operation of an operation system that includes steering or braking while in the section where the sailing stop control is executed based on the information; and\n * the power shortage amount is estimated to be larger correspondingly with respect to an increase in the frequency and the amount of the operation.\n | 3. 
The vehicle control method according to any one of claims 1 through 2, wherein \n * when the sailing stop control is terminated due to a power deficiency in the section in which the sailing stop control can be executed,\n * learning (S110) an actual electrical power consumption and an actual amount of decrease in a battery SOC for the section within which the sailing stop control can be executed, in association with the status of the road and travel history; and\n * increasing a subsequent charge amount in the battery prior to starting the sailing stop control, as compared to a present amount, based on results of the learning.\n | 4. The method for controlling a vehicle according to any one of claims 1 through 3, wherein\n when the section in which the sailing stop control can be executed is a downwardly sloping road that has a gradient equal to or greater than a prescribed gradient and is of a distance equal to or greater than a prescribed distance, the battery is charged via regeneration with electrical power necessary to cover the power shortage amount after the downwardly sloping road has been entered, and the sailing stop control is started thereafter. \n | 5. The vehicle control method according to any one of claims 1 through 4, wherein\n when a generator is used to generate power in order to cover the power shortage amount prior to entering the section in which the sailing stop control can be executed, and fuel economy performance has declined by at least a prescribed amount, generation of power for charging the battery with electrical power necessary to cover the power shortage amount is disallowed, and the sailing stop control in the section in which the sailing stop control can be executed is disallowed. \n | 6. 
The vehicle control method according to any one of claims 1 through 4, wherein\n when a generator is used to generate power in order to cover the power shortage amount prior to entering a section within which the sailing stop control can be executed, and fuel economy performance has declined by at least a prescribed amount, generation of power for charging the battery with electrical power necessary to cover the power shortage amount is disallowed, and, in the section within which the sailing stop control can be executed, sailing idle control in which the engaging element is released without stopping the drive source and the vehicle travels under inertia is executed. \n | 7. A control device for controlling a vehicle, the control device comprising: \n * a drive source (1) for use in travel;\n * an automatic transmission (2) connected to the drive source (1) and having a forward engaging element (3);\n * a control unit (9) that performs a sailing stop control in which the drive source (1) of the vehicle is stopped, the forward engaging element (3) is released, and the vehicle travels under inertia upon a drive source stop condition being established while the vehicle is traveling; and\n * an information-acquiring unit that acquires information on a status of a road on which the host vehicle will travel, wherein\n * the control unit (9) \n * predicts whether there is a section on a route where the sailing stop control can be executed based on the information;\n * upon determining the section is present where the sailing stop control can be executed, estimates a power shortage amount, which is a shortage in an amount of power during the sailing stop control, based on the information; and\n * charges a battery (5) with power required to cover the power shortage amount prior to starting the sailing stop control,\n * wherein\n * the power shortage amount is estimated as being larger when autonomous driving is performed in the section where the sailing stop control is executed 
as compared to when driver-enabled driving is performed in the section and\n * the information includes route information, map information acquired by a navigation system, a travel history of the vehicle, and other information acquired through road-to-vehicle communication and/or vehicle-to-vehicle communication.", "output": "The method involves performing sailing stop control by stopping drive source of vehicle and releasing fastening element between drive source and drive wheel to travel by inertia when drive source stop condition is satisfied while vehicle is traveling. An insufficient power amount, which is amount of power lacking during execution of sailing stop control, is estimated based on acquired information on road condition when stop control is performed. A battery is charged with power necessary to cover the power amount before start of sailing stop control. An INDEPENDENT CLAIM is included for a vehicle control apparatus. Vehicle control method. The fuel-consumption improvement effect is enlarged by sailing stop control. The drawing shows a graphical view showing the vehicle control process. (Drawing includes non-English language text)", "instruction": "Please summarize the input" }, { "input": "Vehicle-to-pedestrian communication systemsVehicle-to-pedestrian information systems that use directional sound transmission on autonomous vehicles are disclosed. A cloud computing system manages messages for transmission to pedestrians via autonomous vehicles having directional speakers. The cloud computing system identifies pedestrians and identifies messages for the pedestrians. Pedestrians may be known and authenticated to the cloud computing system or may be unknown. The cloud computing system maintains profiles for known pedestrians and transmits messages to vehicles based on the profiles. 
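The profile-driven message handling just described (the cloud system maintains a profile per known pedestrian and chooses messages for vehicles to deliver based on that profile) can be sketched minimally as follows. The dictionary layout and the function name `select_messages` are illustrative assumptions, not taken from the patent text.

```python
# Hypothetical sketch: filter candidate messages against a known
# pedestrian's stored profile before handing them to a vehicle.
# Profile and message shapes are invented for illustration.

def select_messages(profile, candidate_messages):
    """Return only the messages whose type the pedestrian's profile allows."""
    allowed = set(profile.get("message_types", []))
    return [m for m in candidate_messages if m["type"] in allowed]

profile = {"pedestrian_id": "p1", "message_types": ["safety", "navigation"]}
messages = [
    {"type": "safety", "text": "Vehicle approaching from your left"},
    {"type": "advertising", "text": "Sale nearby"},
]
print(select_messages(profile, messages))
```

Unknown pedestrians, which the abstract also contemplates, would simply have no profile and could fall back to a default message set.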
The cloud computing system keeps track of the location of vehicles and causes the vehicles to use directional speakers to transmit messages to the pedestrians based on the relative positions of the vehicles and the pedestrians.What is claimed is: \n | 1. A vehicle-to-pedestrian information system comprising: \na cloud computing system configured to communicate with a vehicle configured for autonomous piloting, the vehicle including a directional speaker, \nwherein the cloud computing system is configured to: \nidentify a message for a pedestrian based on a location of the pedestrian; \ntransmit the message to the vehicle; and \ncause the vehicle to play the message for the pedestrian via the directional speaker. \n | 2. The vehicle-to-pedestrian information system of claim 1, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein the cloud computing system is further configured to: \nauthenticate the pedestrian by communicating with a personal device associated with the pedestrian. 
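The parametric interaction recited in claim 2 produces an audible tone at the difference of the two ultrasonic carrier frequencies: each carrier alone is inaudible, but where the beams overlap at the pedestrian they mix to yield a sound wave in the audible band. A minimal numeric sketch (the carrier values are illustrative assumptions, not figures from the patent):

```python
# Sketch of difference-frequency generation by parametric mixing:
# two inaudible ultrasonic carriers produce an audible tone at |f1 - f2|.

def audible_difference_hz(f1_hz, f2_hz):
    """Audible tone produced by parametric mixing of two ultrasonic carriers."""
    return abs(f1_hz - f2_hz)

f1 = 200_000   # first modulated ultrasonic carrier (inaudible)
f2 = 201_000   # second carrier, offset by the desired audio frequency
print(audible_difference_hz(f1, f2))  # 1000 Hz: within the audible band
```

A second pedestrian outside both beam paths never receives overlapping carriers, so no audible difference tone is produced at their location.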
\n | 3. The vehicle-to-pedestrian information system of claim 2, wherein the cloud computing system is further configured to: \nidentify a user profile based on the authentication with the pedestrian, \nwherein the message is identified based on the user profile, the profile storing data comprising user preferences for the pedestrian indicating message types and/or contents to be provided to the pedestrian. \n | 4. The vehicle-to-pedestrian information system of claim 2, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein the cloud computing system is further configured to: \nidentify the location of the pedestrian based on location data reported by the personal device associated with the pedestrian. \n | 5. 
The vehicle-to-pedestrian information system of claim 1, wherein: \nthe message comprises a first portion of a composite message, and \nthe cloud computing system is further configured to: \ntransmit a second portion of the composite message to a different second vehicle for playback to the pedestrian, wherein the cloud computing system selects the vehicle and the second vehicle from among multiple vehicles proximate to the pedestrian based on the different driving paths of the multiple vehicles. \n | 6. The vehicle-to-pedestrian information system of claim 5, wherein: \nthe cloud computing system is configured to instruct the first vehicle to play the first portion of the composite message and the second vehicle to play the second portion of the composite message in a manner that minimizes Doppler shift observed by the pedestrian. \n | 7. The vehicle-to-pedestrian information system of claim 1, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein: \nthe message comprises a safety 
message. \n | 8. The vehicle-to-pedestrian information system of claim 1, wherein the vehicle is configured to: \ndisplay a first visual indicator communicating that the vehicle is operating autonomously when the vehicle is operating autonomously, the vehicle operating autonomously when the driver does not have control of steering of the vehicle; and \ndisplay a different second visual indicator communicating that the vehicle is operating non-autonomously when the vehicle is operating non-autonomously, the vehicle operating non-autonomously when the driver has control of steering of the vehicle, wherein the first and second visual indicators are positioned on the vehicle to be visible to the pedestrian. \n | 9. An autonomous vehicle capable of communicating information to a pedestrian, the autonomous vehicle comprising: \na steering system and a speed control system; \na directional speaker; and \nan on-board computer configured to: \nautonomously control the steering system and the speed control system based on environmental conditions and navigation conditions; \nreceive a message for a pedestrian from a cloud computing system; \ndetermine a location of the pedestrian; and \ncause the directional speaker to play the message for the pedestrian based on the location of the pedestrian. \n | 10. 
The autonomous vehicle of claim 9, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein: \nthe pedestrian is authenticated to the cloud computing system via a personal device associated with the pedestrian. \n | 11. The autonomous vehicle of claim 10, wherein: \nthe message is based on a user profile that is associated with the authenticated pedestrian, the profile storing data comprising user preferences for the pedestrian indicating message types and/or contents to be provided to the pedestrian. \n | 12. 
The autonomous vehicle of claim 10, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein determining the location comprises: \nreceiving the location from the cloud computing system, which previously received the location from the personal device associated with the pedestrian. \n | 13. The autonomous vehicle of claim 9, wherein: \nthe message comprises a first portion of a composite message; and \nthe composite message also includes a second portion that is sent to a different autonomous vehicle for playback to the pedestrian, wherein the cloud computing system selects the vehicle and the second vehicle from among multiple vehicles proximate to the pedestrian based on the different driving paths of the multiple vehicles. \n | 14. 
The autonomous vehicle of claim 9, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and wherein: \nthe message comprises a safety message. \n | 15. The autonomous vehicle of claim 9, further comprising: \na visual indicator display, wherein the on-board computer is configured to: \ndisplay a first visual indicator on the visual indicator display communicating that the vehicle is operating autonomously when the vehicle is operating autonomously, the vehicle operating autonomously when the driver does not have control of steering of the vehicle; and \ndisplay a different second visual indicator on the visual indicator display communicating that the vehicle is operating non-autonomously when the vehicle is operating non-autonomously, the vehicle operating non-autonomously when the driver has control of steering of the vehicle, wherein the first and second visual indicators are positioned on the vehicle to be visible to the pedestrian. \n | 16. 
A method for facilitating vehicle-to-pedestrian communication, the method comprising: \nidentifying a message for a pedestrian based on a location of the pedestrian; \ntransmitting the message to an autonomous vehicle that includes a directional speaker; and \ncausing the vehicle to play the message for the pedestrian via the directional speaker. \n | 17. The method of claim 16, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and further comprising: \nauthenticating the pedestrian by communicating with a personal device associated with the pedestrian. \n | 18. The method of claim 17, further comprising: \nidentifying a user profile based on the authentication with the pedestrian, wherein the message is identified based on the user profile, the profile storing data comprising user preferences for the pedestrian indicating message types and/or contents to be provided to the pedestrian. \n | 19. 
The method of claim 17, wherein the cloud computing system selects the vehicle to transmit the message to the pedestrian based on relative locations of the pedestrian and vehicle, wherein the vehicle, during play of the message, performs at least some driving functions autonomously, the at least some functions comprising steering and braking, wherein the directional speaker comprises an array of a plurality of ultrasonic transducers that generate first and second modulated ultrasonic waves, the first and second modulated waves being inaudible to the pedestrian, wherein the first and second ultrasonic waves are directed towards the pedestrian but not towards a second pedestrian, and wherein, when the first and second ultrasonic waves contact the pedestrian, the first and second ultrasonic waves mix together, via a parametric interaction, to produce a sound wave for the message that is audible to the pedestrian, wherein the second pedestrian is located outside the paths of travel of the first and second ultrasonic waves and is unable to hear the message, and further comprising: \nidentifying the location of the pedestrian based on location data reported by the personal device associated with the pedestrian. \n | 20. The method of claim 16, wherein: \nthe message comprises a first portion of a composite message, and the method further comprises: \ntransmitting a second portion of the composite message to a different second vehicle for playback to the pedestrian, wherein the cloud computing system selects the vehicle and the second vehicle from among multiple vehicles proximate to the pedestrian based on the different driving paths of the multiple vehicles.", "output": "The system (100) has a cloud computing system (106) which is configured to communicate with a vehicle (102) including a directional speaker, and for autonomous piloting. The cloud computing system is configured to identify a message for a pedestrian (204) based on a location of the pedestrian. 
The message is transmitted to the vehicle. The vehicle is caused to play the message for the pedestrian through the directional speaker. The user profile is identified based on the authentication with the pedestrian. An INDEPENDENT CLAIM is included for a method for facilitating vehicle-to-pedestrian communication. Vehicle-to-pedestrian information system used in autonomous vehicle (claimed) e.g. car. The car can make decisions regarding which messages to play through the directional speakers without control of the cloud-based system. The directional speakers allow the vehicles to deliver sound messages to specific pedestrians and to avoid delivering the messages to other pedestrians in different locations. The cloud computing system can select different vehicles based on the proximity to the pedestrian, based on a desire to avoid a Doppler shift, or based on other sound quality considerations. The drawing shows a schematic view illustrating the interactions between the vehicles of the car-to-pedestrian system and pedestrians. 100Vehicle-to-pedestrian information system102Vehicle106Cloud computing system112Communication link204Pedestrian", "instruction": "Please summarize the input" }, { "input": "FUSION AND CALIBRATION OF SENSOR SIGNALS IN A MOVING VEHICLEA sensor processing device located within a vehicle that is being driven, the processing device communicating with plural sensors, each sensor generating signal data, the processing device including a transceiver transmitting data derived by the processing device from the sensor signal data, to one or more remote servers, and receiving sensor-related information from the one or more remote servers, a synchronizer evaluating latencies of the sensors, an error estimator estimating accuracies of the sensor signal data, a sensor validator determining if one or more of the sensors are failed, and a calibrator transforming the sensor signal data to a common vehicle reference system.CLAIMS \n | 1. 
A sensor processing device located within a vehicle that is being driven, the processing device communicating with plural sensors, each sensor generating signal data, the processing device comprising: \n a transceiver transmitting data derived by the processing device from the sensor signal data, to one or more servers, and receiving sensor-related information from the one or more servers; \n a synchronizer evaluating latencies of the sensors; \n an error estimator estimating accuracies of the sensor signal data; \n a sensor validator determining if one or more of the sensors are failed; and a calibrator transforming the sensor signal data to a common vehicle reference system. \n | 2. The sensor processing device of claim 1 wherein the one or more servers comprise one or more devices within the vehicle or within the processing device.\n | 3. The sensor processing device of claim 1 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a beacon, a gyroscope, a magnetometer, a camera, Lidar, radar, ultrasonic radar, a microphone, a global positioning system, and on-board diagnostic sensors.\n | 4. The sensor processing device of claim 1 wherein the processing device derives driver-related information from the transformed signal data, the driving-related information comprising vehicle position and orientation, autonomous vehicle feedback, driver feedback, or driver scores, and exposes the driving-related information to other devices. \n | 5. The sensor processing device of claim 1 wherein the processing device exposes the synchronized and calibrated signal data through an application programming interface (API) or through a software development kit (SDK).\n | 6. The sensor processing device of claim 1 wherein the processing device responds to a trigger event by logging sensor data on the one or more servers for analysis, wherein rules for trigger events are provided by the processing device or by the one or more servers.\n | 7. 
The sensor processing device of claim 6 wherein the one or more servers conducts collision analysis or system failure analysis based on the logged sensor data.\n | 8. A vehicle network data processor that receives times series data from transmitters in one or more vehicles that are being driven, the time series data based on plural sensors located in the one or more vehicles, the vehicle data processor deriving, from the received time series data, sensor- related information and driver-related information, the vehicle data processor comprising: \n one or more cellular or Wi-Fi transceivers receiving time series data from the vehicles; a synchronizer evaluating latencies of the time series data; \n an error estimator for estimating accuracies of the time series data; and \n one or more database managers storing the sensor -related information and the driver- related information derived by the processor in one or more databases, \n wherein said one or more cellular or Wi-Fi transceivers transmit the sensor-related information in the databases to the vehicles. \n | 9. The vehicle network data processor of claim 8 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors, and a temperature sensor. \n | 10. The vehicle network data processor of claim 8 wherein the driving-related information derived by said processor comprises vehicle-to-vehicle network information, advanced driver assistance system information, autonomous driving training information, map information and fleet driver scores.\n | 11. 
The vehicle network data processor of claim 8 wherein the sensor-related information stored by said one or more database managers includes initial sensor calibration models obtained from at least some of the plural sensors, and wherein other sensors access the initial sensor calibration models from said one or more database managers for use as their calibration models.\n | 12. The vehicle network data processor of claim 11 wherein the initial calibration models are obtained from local calibrations performed by the individual sensors and uploaded to said one or more database managers.\n | 13. A non-transitory computer readable medium storing instructions, which, when executed by a processing device located in a vehicle that is being driven, cause the processing device to process signal data from sensors in the vehicle, comprising causing the processing device to: \n receive signal data from the sensors; \n receive sensor-related information from one or more remote servers; \n evaluate latencies of the sensors so as to synchronize the sensor signal data; estimate accuracies of the sensor signal data; \n determine if one or more of the sensors are failed; and \n transform the sensor signal data to a common vehicle reference system. \n | 14. The computer readable medium of claim 13 wherein the processor evaluates latencies of the sensor signal data based on domain matching whereby the same physical quantity is derived from the sensor signal data in more than one way. \n | 15. The computer readable medium of claim 13 wherein the processing device transforms the sensor signal data to the vehicle reference system by use of a rotation matrix that transforms an orthogonal set of device axes to an orthogonal set of vehicle axes, the device axes comprising two perpendicular axes in a plane of the device and a third axis normal to the plane, and the vehicle axes comprising a roof axis, a forward axis and a side axis.\n | 16. 
The computer readable medium of claim 13 wherein the sensor-related information comprises calibration data for the sensors in the vehicle.\n | 17. A non-transitory computer readable medium storing instructions, which, when executed by a vehicle network data processor, cause the data processor to receive time series data from transmitters in one or more vehicles that are being driven, the time series data being based on plural sensors located in the one or more vehicles, and to derive sensor-related information and driver- related information from the received time series data, comprising causing the data processor to: \n receive time series data from the vehicles; \n evaluate latencies of the sensors; \n estimate accuracies of the time series data; \n store the sensor-related information and the driver-related information derived by the processor in one or more databases; and \n transmit the sensor-related information in the databases to the vehicles. \n | 18. A vehicle sensor system, comprising a plurality of computer processing units within a vehicle that is being driven, the vehicle including plural sensors that generate signal data, the plurality of computer processing units jointly comprising circuitry for sensor calibration and fusion, the circuitry comprising: \n one or more local area data connections receiving signal data from the sensors; one or more transceivers transmitting data derived from the sensor signal data to one or more remote servers, and receiving sensor-related information from the one or more remote servers; \n a synchronizer evaluating latencies of the sensors; \n an error estimator for estimating accuracies of the received sensor signal data; \n a sensor validator for determining if one or more of the sensors are failed; and a calibrator for transforming the received sensor signal data to a common vehicle reference system. \n | 19. 
The vehicle sensor system of claim 18 wherein said computer processing units are members of the group consisting of smartphones, Internet of things (IoT) devices, wearable devices, and a vehicle system.\n | 20. The vehicle sensor system of claim 18 wherein the sensors are members of the group consisting of an accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors and temperature sensors.\n | 21. The vehicle sensor system of claim 18 wherein the plurality of computer processing units jointly derive driving-related information, the driver-related information comprising autonomous vehicle feedback, driver feedback, one or more driver scores, or vehicle orientation and positioning information.\n | 22. A method for a plurality of computer processing units within a vehicle that is being driven, the vehicle comprising plural sensors that generate signal data, to jointly perform sensor data processing, the method comprising dynamically allocating among the computer processing units the following real-time tasks, the allocation being based on currently available bandwidth and computing power of each computer processing unit: evaluate latencies of the sensors so as to synchronize the sensor signal data; \n estimate accuracies of the sensor signal data; \n determine if one or more of the sensors are failed; and \n transform the sensor signal data to a common vehicle reference system. \n | 23. The method of claim 22 wherein said dynamically allocating comprises dynamically selecting one of the computer processing units to be a master over the other computer processing units.\n | 24. The method of claim 22 wherein system data is shared among the processing units, and when one of the processing units is removed one or more others of the processing units perform the removed processing unit's allocated tasks.\n | 25. 
The method of claim 22 wherein the task to evaluate latencies is performed by domain matching whereby the same physical quantity is derived from the sensor signal data in more than one way.\n | 26. The method of claim 22 wherein the task to transform the sensor signal data to the vehicle reference system is performed using a rotation matrix that transforms an orthogonal set of sensor axes to an orthogonal set of vehicle axes, the sensor axes comprising two perpendicular axes in a plane of the sensor and a third axis normal to the plane, and the vehicle axes comprising a roof axis, a forward axis and a side axis.", "output": "The sensor processing device has a transceiver which transmits data derived by the processing device from the sensor signal data to one or more servers and receives sensor-related information from one or more servers. A synchronizer (152) evaluates latencies of the sensors. An error estimator (154) estimates accuracies of the sensor signal data. A sensor validator (156) determines if one or more of the sensors are failed. A calibrator (158) transforms the sensor signal data to a common vehicle reference system. The processing device derives driver-related information from the transformed signal data and exposes the driving-related information to other devices. The driving-related information comprises vehicle position and orientation, autonomous vehicle feedback, driver feedback, or driver scores. INDEPENDENT CLAIMS are included for the following:a vehicle network data processor;a non-transitory computer readable medium storing program for processing signal data from sensors in the vehicle;a vehicle sensor system; anda method for processing sensor data using several computer processing units within a vehicle. Sensor processing device used for processing data of sensor e.g. 
accelerometer, a barometer, a gyroscope, a magnetometer, a camera, a microphone, a global positioning system, on-board diagnostic sensors and temperature sensors of vehicle sensor system (all claimed) used in moving vehicle e.g. land vehicle such as a car or a motorcycle, a water vehicle such as a boat or a ship, or an air vehicle such as an airplane or a drone. Can also be used in time series of system measurements of inter alia fuel systems, emission systems, transmission systems, speed control systems and idle control systems of vehicle. The device provides proper unified gateway for data collection, and system to obtain, align, synchronize and calibrate data sources, and assess data validity and analyze it for an Artificial Intelligence-aware driving experience. The error, the timestamp error and the timestamp latency in the data are analyzed accurately. The new sensor immediately provides accurate calibrated results, avoiding the need to wait for the online calibration process to converge. The speed updates are propagated back to correct the gravity vector which improves accuracy and reduces error estimation. The vehicle data processor includes one or more cellular or wireless-fidelity (WiFi) transceivers which receive time series data from the vehicles. The drawing shows a simplified block diagram of a sensor processor for fusing and calibrating data received from a moving vehicle. 150Sensor processor152Synchronizer154Error estimator156Sensor validator158Calibrator", "instruction": "Please summarize the input" }, { "input": "Vehicle Localization and Identification by Map Merging in 5G and 6GAutonomous vehicles, and user-driven vehicles with an emergency intervention capability, can communicate to avoid collisions using 5G/6G technology, but this level of cooperation is possible only if the threatened vehicles have already determined the relative location and wireless address of the other vehicle. 
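The sensor-fusion claims above (claims 15 and 26) recite transforming an orthogonal set of device axes onto the vehicle's roof, forward, and side axes via a rotation matrix. A minimal sketch of such a transform for a single rotation about the roof axis; the 90-degree mounting angle is an assumed example, not a value from the patent:

```python
import math

# Sketch: map a sensor reading from device axes into vehicle axes using
# a rotation about the vertical (roof) axis. A full calibration would
# compose rotations about all three axes.

def rotate_z(vec, yaw_rad):
    """Rotate a 3-vector about the roof axis by yaw_rad (counterclockwise)."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x, y, z = vec
    return (c * x - s * y, s * x + c * y, z)

# A 1 g reading along the device's own x-axis, with the device mounted
# rotated 90 degrees about the roof axis, lands on the vehicle's side axis.
vx, vy, vz = rotate_z((9.81, 0.0, 0.0), math.pi / 2)
print(round(vx, 6), round(vy, 6), round(vz, 6))
```

Once every sensor's readings are expressed in this common vehicle frame, the fusion, error-estimation, and validation steps in the claims can compare them directly.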
Disclosed is a method for wireless vehicles in traffic to exchange distance and angular information of the other vehicles in view, from which a position map can be prepared indicating the relative locations of each participating and non-participating vehicle. In addition, the traffic map can be annotated with the wireless addresses of the participating vehicles, thereby enabling them to communicate instantly in an emergency. The traffic map may be prepared or updated by one of the vehicles in traffic, or by a roadside access point. Satellite data is not necessary for the relative localization, but may be included if available.|1. A method for a first wireless device to determine locations of vehicles in traffic, the method comprising: \na) attempting and failing to determine a location of the first wireless device using a global navigation satellite system (GNSS); \nb) then broadcasting a request message comprising a wireless address of the first wireless device, a time delay, and a request for each wireless entity within radio range to: \ni) wait the time delay; \nii) then measure an angle and a distance of each vehicle in view of the wireless entity; and then \niii) transmit a reply message to the first wireless device, the reply message listing the angles and the distances, and further including a wireless address of the wireless entity; and \nc) determining, according to the angles and distances, a location of at least one vehicle relative to the first wireless device. \n | 2. The method of claim 1, wherein each request message is configured according to 5G or 6G technology.\n | 3. The method of claim 1, wherein: \na) the first wireless entity is a vehicle or a roadside access point; and \nb) each wireless entity is a vehicle or a wireless camera or a roadside access point. \n | 4. 
The method of claim 1, further comprising: \na) after transmitting the request message, waiting the time delay; and \nb) then measuring an angle and a distance of each vehicle in view of the first wireless device. \n | 5. The method of claim 4, further comprising: \na) receiving, from one or more of the wireless entities, one or more reply messages, wherein each reply message indicates: \ni) one or more angles and one or more distances of one or more vehicles in view of the wireless entity; and \nii) a wireless address of the wireless entity. \n | 6. The method of claim 5, further comprising: \na) combining the angles and distances from the reply messages, with the angles and distances measured by the first wireless device; and \nb) determining, according to the combining, a two-dimensional position of each vehicle that is viewed by at least one of the wireless entities or by the first wireless device. \n | 7. The method of claim 5, further comprising: \na) determining, according to the angle and distance measurements, a two-dimensional position of each vehicle in traffic. \n | 8. The method of claim 5, further comprising determining: \na) which angle and distance measurements correspond to a vehicle that is viewed by exactly one viewer; and \nb) which angle and distance measurements correspond to a vehicle that is viewed by more than one viewer; \nc) wherein a viewer comprises the first wireless device or one of the wireless entities. \n | 9. The method of claim 8, further comprising: \na) performing a fitting analysis according to the angle and distance measurements, wherein the fitting analysis comprises determining a calculated position of each vehicle that is viewed by more than one viewer. \n | 10. 
The method of claim 5, further comprising: \na) broadcasting a mapping message indicating two-dimensional position coordinates of each vehicle in traffic relative to the first wireless device; and \nb) wherein the mapping message further indicates, for each wireless entity, which coordinates are associated with the wireless entity, and which wireless address is associated with the wireless entity. \n | 11. Non-transitory computer-readable media in a wireless entity, the media containing instructions that when implemented in a computing environment cause a method to be performed, the method comprising: \na) receiving a request message from a first vehicle, the request message requesting the wireless entity to: \ni) determine, at a particular time, an angle measurement and a distance measurement of each vehicle in view of the wireless entity; and \nii) transmit, to a particular wireless address of the first vehicle, a reply message comprising the angle and distance measurements; \nb) at the particular time, measuring an angle and a distance of each vehicle in view of the wireless entity: \nc) transmitting a reply message to the first vehicle, the reply message comprising the angles and distances measured by the wireless entity; and \nd) receiving, from the first vehicle, a traffic map message comprising two-dimensional coordinates of vehicles in traffic. \n | 12. The non-transitory computer-readable media of claim 11, wherein the traffic map message further comprises a wireless address of each vehicle that transmitted a reply message.\n | 13. The non-transitory computer-readable media of claim 11, wherein the traffic map message further comprises one or more visible characteristics of each vehicle in traffic.\n | 14. The non-transitory computer-readable media of claim 13, wherein the visible characteristics comprises a vehicle type encoded in a predetermined code comprising at most six bits.\n | 15. 
The non-transitory computer-readable media of claim 11, wherein the request message, the reply messages, and the traffic map message are transmitted on a sidelink channel allocated for vehicle-to-vehicle or vehicle-to-anything communications.\n | 16. The non-transitory computer-readable media of claim 11, the method further comprising: \na) determining, according to an electronic compass, a direction of geographical north; and \nb) for each vehicle in view of the wireless entity, measuring an angle of a centroid of the vehicle in view, relative to geographical north. \n | 17. The non-transitory computer-readable media of claim 11, the method further comprising: \na) using a radar or lidar or sonar distance-measuring sensor, measuring a distance between the wireless entity and a closest part of each vehicle in view of the wireless entity; \nb) for each vehicle in view, calculating a centroid correction distance comprising, wherein the centroid correction distance comprises an angle subtended by the vehicle in view times the distance to the closest part of the vehicle in view; and \nc) adding the centroid correction distance to the measured distance between the first vehicle and the closest part of the particular vehicle. \n | 18. A processor comprising an AI (artificial intelligence) model, the processor configured to: \na) take, as input, a plurality of sets of distances and angles; and \nb) provide, as output, a traffic map; \nc) wherein each set of distances and angles is measured by a participating vehicle, of a plurality of participating vehicles in traffic; \nd) wherein each distance and angle corresponds to a measured vehicle viewed by one of the participating vehicles; and \ne) wherein the traffic map comprises a list of current position coordinates of the measured vehicles in traffic. \n | 19. The processor of claim 18, wherein the AI model is further configured to be implemented in a processor of one of the participating vehicles.\n | 20. 
The processor of claim 18, wherein the AI model is further configured to: \na) for each measured vehicle, determine a difference between the current position coordinates and previously determined position coordinates; \nb) for each measured vehicle, calculate a velocity according to the difference and a time difference between the current position coordinates and the previously determined position coordinates; \nc) for each pair of measured vehicles, calculate a distance between the measured vehicles of the pair and a relative velocity between the measured vehicles of the pair; and \nd) predict, according to the distance and the relative velocity, when an imminent collision is expected to occur.", "output": "The method involves attempting and failing to determine a location of a wireless device using a global navigation satellite system (GNSS). A request message is broadcast comprising a wireless address of the wireless device, a time delay, and a request for each wireless entity within radio range to measure an angle and a distance of each vehicle (201) in view of the entity and to transmit a reply message to the wireless device, where the reply message includes the angles, the distances, and a wireless address of the wireless entity. The location of the vehicle relative to the wireless device is determined according to the angles and distances. The request message is configured according to 5G or 6G technology. The first wireless entity is a vehicle, a wireless camera or a roadside access point. INDEPENDENT CLAIMS are included for: (1) non-transitory computer-readable media containing instructions for short-range locating and identification of vehicles; (2) a processor comprising an artificial intelligence model for short-range locating and identification of vehicles. 
Method for short-range locating and identification of vehicles such as autonomous and semi-autonomous vehicles relative to a first wireless device in traffic by map merging based on fifth-generation (5G) or sixth-generation (6G) technology to avoid collisions and facilitate flow of traffic. Uses include but are not limited to sedan, delivery van, pickup truck, sports car, motorcycle, semi-trailer, etc. The method enables determining the wireless address of each proximate vehicle in traffic, so that the vehicles can communicate for traffic management and collision avoidance. The vehicles may cooperate to manage the flow of traffic, avoid hazards, and minimize energy consumption. The drawing shows a top view of vehicles. 200 Freeway, 201, 202, 203 Vehicles, 204 Truck", "instruction": "Please summarize the input" }, { "input": "V2V and V2X Communications in 5G and 6G Based on Displayed Matrix. Disclosed is a “connectivity matrix” that wireless entities (vehicles, fixed assets, etc.) can display indicating the 5G/6G wireless address of the entity. Other wireless devices can then image the connectivity matrix, determine the wireless address, and then communicate in sidelink, on frequencies allocated for ad-hoc networking. Alternatively, the two entities can communicate through a local base station, on managed channels, using the displayed wireless address. The matrix can provide additional information, such as the frequency, bandwidth, and modulation scheme favored by the entity. Alternatively, the matrix can provide a key code maintained by a central authority, so that a second wireless entity can read the code and request the associated wireless address (and frequency, bandwidth, etc.) from the central authority. By either method, the two wireless entities can then communicate explicitly thereafter.|1. 
Non-transitory computer-readable media containing instructions that, when executed by a computing environment, cause a method to be performed, the method comprising: \na) maintaining, in further non-transitory computer-readable media, a tabulation of entries, each entry comprising an index value and a wireless address of a vehicle or a fixed asset; \nb) receiving a request message specifying a code, wherein the code is indicated by a matrix comprising black and white rectangular fields visibly displayed by a particular vehicle or fixed asset; \nc) determining a particular index value according to the code; \nd) selecting a particular entry of the tabulation according to the particular index value; \ne) determining a particular wireless address comprising the particular entry; and \nf) transmitting a reply message indicating the particular wireless address. \n | 2. The non-transitory computer-readable media of claim 1, wherein the reply message is transmitted according to 5G or 6G technologies.\n | 3. The non-transitory computer-readable media of claim 1, the method further comprising: \na) determining that the request message indicates that the particular entry should be transmitted in entirety; and \nb) transmitting, in the reply message, the particular entry in entirety. \n | 4. The non-transitory computer-readable media of claim 1, the method further comprising: \na) determining that the request message indicates that only the particular wireless address should be transmitted; and \nb) transmitting, in the reply message, the particular wireless address without transmitting other information, if any, comprising the particular entry. \n | 5. 
The non-transitory computer-readable media of claim 1, the method further comprising: \na) receiving, from the particular vehicle or fixed asset, a change message, the change message specifying the code and providing additional or changed information; \nb) determining a particular index according to the code; \nc) determining a particular entry according to the particular index; and \nd) revising the particular entry according to the additional or changed information. \n | 6. The non-transitory computer-readable media of claim 1, the method further comprising: \na) receiving a joining message from a new vehicle or fixed asset, wherein the new vehicle or fixed asset is not associated with any entry in the tabulation; \nb) determining that the joining message specifies a wireless address of the new vehicle or fixed asset, and requests that a new entry be added to the tabulation; \nc) generating the new entry in the tabulation, the new entry comprising the wireless address of the new vehicle or fixed asset; \nd) determining a new index value, and associating the new entry with the new index value; \ne) generating a new code according to the new index value; and \nf) transmitting a welcome message to the new vehicle or fixed asset, the welcome message indicating the new code. \n | 7. The non-transitory computer-readable media of claim 1, wherein: \na) the joining message further indicates a frequency and a bandwidth; and \nb) the new entry further comprises the frequency and the bandwidth. \n | 8. The non-transitory computer-readable media of claim 1, wherein: \na) the joining message further indicates an MCS (modulation and coding scheme); and \nb) the new entry further comprises the MCS. \n | 9. 
The non-transitory computer-readable media of claim 1, wherein: \na) the joining message further indicates one or more capabilities or limitations of the new vehicle or fixed asset; and \nb) the new entry further comprises the one or more capabilities or limitations of the new vehicle or fixed asset. \n | 10. A method for a first vehicle, in traffic comprising a second vehicle, the method comprising: \na) observing, using a camera or sensor in or on the first vehicle, a matrix displayed by the second vehicle, the matrix comprising a plurality of fields colored black or white according to a binary code; \nb) determining the binary code of the matrix; \nc) determining an entry in a tabulation, the entry associated with the code; \nd) determining, according to the entry, a wireless address of the second vehicle; and \ne) transmitting a wireless message, according to the wireless address, to the second vehicle. \n | 11. The method of claim 10, wherein: \na) the code comprises a predetermined number of code bits; \nb) the matrix comprises a plurality of border fields surrounding a plurality of data fields; and \nc) the number of code bits equals a number of data fields. \n | 12. The method of claim 10, further comprising: \na) determining, according to the entry, a frequency and a bandwidth; and \nb) transmitting the wireless message according to the frequency and the bandwidth. \n | 13. The method of claim 10, further comprising: \na) determining, according to the entry, an MCS (modulation and coding scheme); and \nb) transmitting the wireless message according to the MCS. \n | 14. The method of claim 10, further comprising: \na) determining, according to the matrix, whether the second vehicle is autonomously driven or human-driven. \n | 15. 
The method of claim 10, further comprising: \na) displaying, on the first vehicle, a further matrix comprising a plurality of fields colored black or white according to a further binary code; \nb) wherein the further binary code is associated with a further entry in the tabulation, and the further entry comprises a further wireless address of the first vehicle. \n | 16. The method of claim 15, further comprising: \na) turning off the further matrix, by depowering illuminators in the further matrix, while the first vehicle is human-operated; and \nb) turning on the further matrix, by repowering the illuminators in the further matrix, while the first vehicle is computer-operated. \n | 17. A system comprising a blockchain comprising information about wireless addresses, wherein: \na) a first wireless entity comprises a camera, a processor, and a first matrix of black and white rectangular fields; \nb) the black and white rectangular fields are configured to display a first code associated with the first wireless entity; \nc) the camera is configured to image a second matrix displayed by a second wireless entity; \nd) the processor is configured to determine, according to the second matrix, a second code associated with the second wireless entity; and \ne) the processor is further configured to determine, according to the blockchain, a second wireless address associated with the second code. \n | 18. The system of claim 17, wherein: \na) the first code comprises a first wireless address of the first wireless entity; and \nb) the second code comprises a second wireless address, a frequency, and a bandwidth. \n | 19. The system of claim 18, wherein the processor is further configured to determine, according to the blockchain, the second wireless address, the frequency, and the bandwidth associated with the second code.\n | 20. 
The system of claim 19, wherein the processor is further configured to transmit a message to the second wireless entity, the second message transmitted according to the second wireless address, the frequency, and the bandwidth.", "output": "The non-transitory computer-readable medium comprises a set of instructions for maintaining a tabulation of entries in a non-transitory computer-readable media, where each entry comprises an index value and a wireless address of a vehicle or a fixed asset. A request message specifying a code is received, where the code is indicated by a matrix comprising black and white rectangular fields visibly displayed by a particular vehicle or fixed asset, and a particular index value is determined according to the code. A reply message indicating the particular wireless address is transmitted according to fifth generation or sixth generation technologies. A determination may be made that the request message indicates that the particular entry should be transmitted in entirety, in which case the entire entry is transmitted in the reply message. Alternatively, it is determined that the request message indicates that only the particular wireless address should be transmitted. INDEPENDENT CLAIMS are included for: (1) a method for a first vehicle; (2) a system comprising a blockchain comprising information about wireless addresses. Non-transitory computer-readable media for localizing, identifying, and communicating with vehicles in traffic and fixed assets. The non-transitory computer-readable medium ensures that collision avoidance and traffic efficiency are improved. 
The drawing shows a schematic sketch of a wireless address tabulation according to the non-transitory computer-readable media for localizing, identifying, and communicating with vehicles in traffic and fixed assets. 600 Variable connectivity matrix, 601 Illuminator, 602 Diffuser, 603 Opaque separator, 610 Connectivity matrix, 611 Variable-transmissive window, 612 Illuminators, 613 Optional diffuser", "instruction": "Please summarize the input" }, { "input": "V2X and vehicle localization by local map exchange in 5G or 6G. Autonomous vehicles may communicate with each other in 5G or 6G to avoid hazards, mitigate collisions, and facilitate the flow of traffic. However, for cooperative action, each vehicle must determine the wireless address of other vehicles in proximity, so that they can communicate directly with each other. It is not sufficient to know the wireless address alone; the wireless address must be associated with an actual vehicle in view. Methods disclosed herein enable vehicles to exchange messages that specify the distances and angles of other vehicles in view. Then, each vehicle compares the other vehicle's measurements with its own, along with each vehicle's wireless address. Using an AI-based map-merging algorithm, one or more vehicles can produce a full traffic map from the fragmentary local maps of each vehicle's viewpoint. The invention claimed is: \n | 1. A method for a first vehicle to communicate with a second vehicle, the second vehicle proximate to a third vehicle, the method comprising: \na. measuring a first plurality of angles, the first plurality of angles comprising an angle of the second vehicle and an angle of the third vehicle, as viewed by the first vehicle; \nb. transmitting a request message to the second vehicle, the request message requesting measurement data from the second vehicle; \nc. 
receiving, from the second vehicle, a reply message comprising a second plurality of angles, the second plurality of angles comprising an angle of the first vehicle and an angle of the third vehicle, as viewed by the second vehicle; and \nd. determining, according to the first plurality of angles and the second plurality of angles, a merged map, the merged map comprising a position of the first vehicle, a position of the second vehicle, and a position of the third vehicle. \n | 2. The method of claim 1, wherein the request message and the reply messages are transmitted according to 5G or 6G technology.\n | 3. The method of claim 1, wherein the angles are measured relative to a direction of a road occupied by the first and second vehicles.\n | 4. The method of claim 1, further comprising: \na. measuring a third plurality of distances, the third plurality of distances comprising a distance from the first vehicle to the second vehicle and a distance from the first vehicle to the third vehicle; \nb. receiving, from the second vehicle, a fourth plurality of distances, the fourth plurality of distances comprising a distance from the second vehicle to the first vehicle and a distance from the second vehicle to the third vehicle; and \nc. determining the merged map according to the first plurality of angles, the second plurality of angles, the third plurality of distances, and the fourth plurality of distances. \n | 5. The method of claim 4, wherein the request message further indicates the first plurality of angles and the third plurality of distances, and the reply message further indicates the second plurality of angles and the fourth plurality of distances.\n | 6. The method of claim 1, further comprising: \na. determining, according to the reply message, a color or a vehicle type, or both, of the second vehicle; \nb. comparing the color or vehicle type, or both, of the second vehicle to each of the vehicles visible to the first vehicle; and \nc. 
determining the merged map according to the color or vehicle type, or both, of the second vehicle. \n | 7. The method of claim 1, wherein: \na. the request message further indicates a wireless address of the first vehicle and at least one of a GPS location, a vehicle type, a color, or a lane position of the first vehicle; and \nb. the reply message further indicates a wireless address of the second vehicle and at least one of a GPS location, a vehicle type, a color, or a lane position of the second vehicle. \n | 8. The method of claim 1, wherein the merged map further comprises a wireless address of the first vehicle and a wireless address of the second vehicle.\n | 9. The method of claim 1, further comprising transmitting the merged map to the second vehicle and the third vehicle.\n | 10. The method of claim 1, further comprising: \na. determining that a traffic collision with the second vehicle is imminent; \nb. determining, according to the merged map, which wireless address corresponds to the second vehicle; and \nc. transmitting, to the second vehicle, an emergency message. \n | 11. The method of claim 1, wherein the merged map includes a fourth vehicle which is not visible to the first vehicle.\n | 12. The method of claim 1, further comprising: \na. measuring data comprising angles and distances of vehicles in traffic, relative to the first vehicle, and angles and distances of further vehicles in the traffic, relative to the second vehicle; \nb. providing the data to a computer containing an artificial intelligence model; and \nc. determining, according to the artificial intelligence model, a merged map comprising predicted positions of the vehicles. \n | 13. The method of claim 11, further comprising: \na. measuring further data comprising angles and distances of further vehicles in traffic; \nb. 
receiving at least one message from at least one proximate vehicle, the at least one message comprising additional data comprising angles and distances of vehicles visible to the proximate vehicle or vehicles; \nc. providing the further data and the additional data as input to the algorithm; and \nd. determining, as output from the algorithm, the merged map. \n | 14. Non-transitory computer-readable media in a second vehicle in traffic comprising a first vehicle and at least one other vehicle, the media containing instructions that when implemented by a computing environment cause a method to be performed, the method comprising: \na. receiving, from the first vehicle, a request for geometric traffic data; \nb. determining one or more “visible” vehicles, the visible vehicles being visible to the second vehicle; \nc. measuring, for each of the visible vehicles, an angle of the visible vehicle and a distance of the visible vehicle from the second vehicle; \nd. transmitting, to the first vehicle, a message comprising the measured angles and the measured distances; and \ne. receiving, from the first vehicle, a merged map comprising positions of the first vehicle, the second vehicle, and the at least one other vehicle. \n | 15. The media of claim 14, the method further comprising: \na. determining, for each of the visible vehicles, a vehicle type or a vehicle color; and \nb. transmitting, to the first vehicle, a message comprising the determined vehicle types or vehicle colors. \n | 16. The media of claim 14, the method further comprising transmitting, to the first vehicle, a wireless address of the second vehicle.\n | 17. The media of claim 16, wherein: \na. the merged map further indicates, in association with the position of the second vehicle, the wireless address of the second vehicle; and \nb. 
the merged map further indicates, in association with the position of the first vehicle, a wireless address of the first vehicle.", "output": "The method involves measuring a first set of angles, where the first set of angles comprises an angle of a second vehicle (202) and an angle of a third vehicle (203) as viewed by a first vehicle (201). A request message is transmitted to the second vehicle, and the request message requests measurement data from the second vehicle. A reply message is received from the second vehicle, where the reply message comprises the second set of angles. A merged map is determined according to the first set of angles and the second set of angles, where the merged map comprises a position of the first vehicle, a position of the second vehicle, and a position of the third vehicle. The request message and the reply messages are transmitted according to fifth generation (5G) or sixth generation (6G) technology. The angles are measured relative to a direction of a road. INDEPENDENT CLAIMS are included for: (1) a non-transitory computer-readable medium comprising a set of instructions for performing a method for short-range locating and identification of vehicles; and (2) a computer. Method for short-range locating and identification of vehicles i.e. autonomous and semi-autonomous vehicles, in traffic. Uses include but are not limited to a sedan, a delivery van, a pickup truck, a sports car, a motorcycle and a semi-trailer. The method enables determining the wireless address of each proximate vehicle in traffic so that the vehicles can communicate for traffic management and collision avoidance. The drawing shows a schematic view of structure for vehicles to avoid collisions. 
200 Freeway, 201, 202, 203 Vehicles, 204 Truck", "instruction": "Please summarize the input" }, { "input": "Vehicle connectivity, V2X communication, and 5G/6G sidelink messaging. Communication between autonomous vehicles, in 5G or 6G, is necessary for cooperative hazard avoidance and to coordinate the flow of traffic. However, before cooperative action, each vehicle must determine the wireless address of other vehicles in proximity, so that they can communicate directly with each other. Methods and systems disclosed herein include a computer-readable wireless “connectivity matrix”, an array of black and white squares showing a connectivity code. The connectivity code may be the vehicle's wireless address, an index code, or other information about the vehicle. The connectivity code may be an index in a tabulation of information that provides the wireless address, among other data. Other vehicles, or their cameras, may read the connectivity matrix, determine the code therein, and find the vehicle's wireless address. After determining the wireless address of the other vehicles, the vehicles can then communicate and cooperate to avoid accidents and facilitate the flow of traffic. The invention claimed is: \n | 1. A wireless entity comprising: \na) a matrix comprising a plurality of square or rectangular fields, the matrix displayed visually in or on the wireless entity, the fields arranged in a rectangular array, each field colored either black or white according to a binary code, the binary code comprising data related to a wireless address of the wireless entity; \nb) wherein the binary code indicates an index of a particular entry of a tabulation, the tabulation comprising a plurality of entries, each entry related to a wireless vehicle or a wireless fixed asset, respectively, and each entry indicating the wireless address of the related wireless vehicle or wireless fixed asset. \n | 2. 
The wireless entity of claim 1, wherein the wireless entity is configured to communicate according to 5G or 6G technology.\n | 3. The wireless entity of claim 1, the matrix further comprising a border comprising further fields arranged peripherally around the matrix and colored black or white according to a predetermined pattern.\n | 4. The wireless entity of claim 1, wherein the binary code further indicates the wireless address of the wireless entity.\n | 5. The wireless entity of claim 1, wherein the wireless entity comprises a first vehicle, and the matrix is further configured to indicate, to a second vehicle, the wireless address of the first vehicle.\n | 6. The wireless entity of claim 5, wherein the matrix is configured to be readable by a camera on the second vehicle from a predetermined distance, the predetermined distance in the range of 20 to 100 meters.\n | 7. The wireless entity of claim 1, wherein each field has a predetermined size in the range of 5 to 20 millimeters.\n | 8. The wireless entity of claim 1, wherein the wireless entity is a base station of a wireless network, and the code comprises a frequency of a broadcast channel of the base station.\n | 9. The wireless entity of claim 1, wherein the entity is a fixed asset comprising a traffic signal or a highway sign or a roadside building and the code further indicates a wireless address of a receiver associated with the wireless entity.\n | 10. The wireless entity of claim 1, wherein: \na) each field comprises an illuminator, respectively; \nb) each illuminator is powered individually; and \nc) the code is determined by which of the illuminators are powered and which of the illuminators are unpowered. \n | 11. 
The wireless entity of claim 1, wherein: \na) each field comprises a filter, respectively, each filter having an individually controllable opacity; and \nb) the code is determined by which of the filters are controlled to have a high opacity and which filters are controlled to have a low opacity. \n | 12. The wireless entity of claim 1, wherein the matrix further comprises 48 fields colored black or white according to a MAC (medium access code) address, surrounded by a border comprising 32 fields colored black or white according to a predetermined pattern.\n | 13. Non-transitory computer-readable media in a first vehicle, the media including instructions that when executed by a computing environment cause a method to be performed, the method comprising: \na) detecting, in or on a second vehicle, a connectivity matrix comprising a plurality of fields colored black or white according to a code; and \nb) determining, from the code, a wireless address of the second vehicle; \nc) wherein the determining of the wireless address comprises: \nd) retrieving, from a tabulation of entries, a particular entry according to the code; and \ne) determining, from the particular entry, the wireless address of the second vehicle. \n | 14. The media of claim 13, the method further comprising transmitting, according to the wireless address, a message to the second vehicle.\n | 15. The media of claim 13, wherein the code is configured to indicate whether the second vehicle is autonomous or semi-autonomous or human-driven.\n | 16. 
A base station of a wireless network, the base station comprising: \na) a visibly displayed connectivity matrix comprising a plurality of fields arranged in a rectangular array, each field colored black or white according to a code, the code configured to indicate a particular entry, in a tabulation of entries, according to the code, the particular entry comprising a particular frequency; and \nb) a transmitter configured to transmit system information messages on the particular frequency. \n | 17. The base station of claim 16, wherein the system information messages indicate how user devices can become registered with the base station.\n | 18. The base station of claim 16, further comprising a receiver configured to receive messages on a second frequency, different from the particular frequency, the second frequency indicated in the system information messages.", "output": "The wireless entity has a matrix including multiple square or rectangular fields, where the matrix is displayed visually in or on the wireless entity. The fields are arranged in a rectangular array, where each field is colored either black or white according to a binary code. The binary code comprises data related to a wireless address of the entity. The wireless entity is configured to communicate according to 5G or 6G technology. The matrix has a border having fields arranged peripherally around the matrix and colored black or white according to a predetermined pattern, and indicates the wireless address. INDEPENDENT CLAIMS are included for: (1) non-transitory computer-readable media including instructions for localizing, identifying, and communicating with vehicles in traffic and fixed assets; (2) a base station for a wireless network. Wireless entity for localizing, identifying, and communicating with autonomous or semi-autonomous or human-driven vehicles, in traffic and fixed assets for cooperative hazard avoidance and to coordinate the flow of traffic. 
The wireless entity enables the autonomous and semi-autonomous vehicles to communicate and cooperate to prevent or mitigate collisions, saving countless lives and managing the flow of traffic in an efficient manner, after determining the wireless address of the other vehicles. The drawing shows a schematic view of a computer-readable wireless identification matrix. 100Connectivity matrix ", "instruction": "Please summarize the input" }, { "input": "V2X with 5G/6G Image Exchange and AI-Based Viewpoint FusionAutonomous vehicles are required to communicate with each other in 5G or 6G, to avoid hazards, mitigate collisions, and facilitate the flow of traffic. However, for cooperative action, each vehicle must determine the wireless address and position of other vehicles in proximity, so that they can communicate directly with each other. It is not sufficient to know the wireless address alone; the wireless address must be associated with an actual vehicle in view. Methods disclosed herein enable vehicles to simultaneously acquire 360-degree images of other vehicles in traffic, and transmit those images wirelessly along with their wireless addresses. The various images are then “fused” by identifying objects that are viewed from at least two directions, and calculating their positions by triangulation. The resulting traffic map, or a listing of the vehicle positions, is then broadcast along with the wireless addresses of the vehicles. The vehicles can then determine which wireless address belongs to which of the vehicles in proximity, and can thereby cooperate with each other to avoid accidents and facilitate the flow of traffic.|1. A method for a first vehicle to communicate with a second vehicle, the second vehicle proximate to a third vehicle, the method comprising: \na. broadcasting a planning message specifying a particular time; \nb. at the particular time, acquiring a first image depicting the second vehicle and the third vehicle; \nc. 
receiving, from the second vehicle, an imaging message comprising a second image, the second image acquired by the second vehicle at the particular time, the second image depicting the first vehicle and the third vehicle; and \nd. determining, according to the first image and the second image, a coordinate listing comprising a position of the first vehicle, a position of the second vehicle, and a position of the third vehicle. \n | 2. The method of claim 1, wherein the planning message and the imaging message are transmitted according to 5G or 6G technology.\n | 3. The method of claim 1, wherein the second image further includes an indication of a direction of travel of the second vehicle.\n | 4. The method of claim 1, further comprising: \na. determining, from the imaging message, a wireless address of the second vehicle; and \nb. adding, to the coordinate listing, the wireless address of the second vehicle and a wireless address of the first vehicle. \n | 5. The method of claim 1, further comprising: \na. measuring a distance from the first vehicle to either the second vehicle or the third vehicle; and \nb. determining the coordinate listing according to the distance. \n | 6. The method of claim 1, further comprising: \na. providing, according to the coordinate listing, a traffic map comprising a two-dimensional image indicating the position of the first vehicle, the position of the second vehicle, and the position of the third vehicle; and \nb. indicating, on the traffic map, a wireless address of the first vehicle. \n | 7. The method of claim 1, wherein the imaging message further indicates at least one of a vehicle type, a color, or a lane position of the second vehicle.\n | 8. The method of claim 1, wherein the coordinate listing further indicates at least one of a vehicle type, a color, or a lane position of the first vehicle.\n | 9. The method of claim 1, further comprising broadcasting the coordinate listing.\n | 10. 
The method of claim 1, further comprising: \na. determining that a traffic collision with the second vehicle is imminent; \nb. determining, according to the coordinate listing, which wireless address corresponds to the second vehicle; and \nc. transmitting, to the second vehicle, an emergency message. \n | 11. The method of claim 1, wherein the coordinate listing includes a fourth vehicle which is not depicted in the first image.\n | 12. The method of claim 1, further comprising: \na. acquiring a plurality of images of vehicles in traffic; \nb. providing the plurality of images to a computer containing an artificial intelligence model; and \nc. determining, according to the artificial intelligence model, a predicted coordinate listing comprising predicted positions of the vehicles. \n | 13. The method of claim 12, further comprising: \na. acquiring a further image of further vehicles in traffic; \nb. receiving at least one message from at least one proximate vehicle, the at least one message comprising an additional image of the vehicles in traffic; \nc. providing the further image and the additional image as input to an algorithm based at least in part on the artificial intelligence model; and \nd. determining, as output from the algorithm, an updated coordinate listing comprising predicted positions of the further vehicles. \n | 14. Non-transitory computer-readable media in a second vehicle, the second vehicle in traffic, the traffic comprising a first vehicle and at least one other vehicle, the media containing instructions that when implemented by a computing environment cause a method to be performed, the method comprising: \na. receiving, from the first vehicle, a planning message specifying a time; \nb. acquiring, at the specified time, an image comprising the first vehicle and the at least one other vehicle; \nc. transmitting, to the first vehicle, an imaging message comprising the image; and \nd. 
receiving, from the first vehicle, a coordinate listing or a traffic map comprising positions of the first vehicle, the second vehicle, and the at least one other vehicle. \n | 15. The media of claim 14, the method further comprising: \na. determining, for each of the first, second, and third vehicles, a vehicle type or a vehicle color; and \nb. transmitting, to the first vehicle, a message comprising the determined vehicle types or vehicle colors. \n | 16. The media of claim 14, the method further comprising transmitting, to the first vehicle, a wireless address of the second vehicle.\n | 17. The media of claim 16, wherein: \na. the coordinate listing or the traffic map further indicates, in association with the position of the second vehicle, the wireless address of the second vehicle; and \nb. the coordinate listing or the traffic map further indicates, in association with the position of the first vehicle, a wireless address of the first vehicle. \n | 18. A computer containing an artificial intelligence structure comprising: \na. one or more inputs, each input comprising an image of traffic, the traffic comprising a plurality of vehicles; \nb. one or more internal functions, each internal function operably linked to one or more of the inputs; and \nc. an output operably linked to the one or more of the internal functions, the output comprising a prediction of a two-dimensional position of each vehicle of the plurality. \n | 19. The computer of claim 18, the artificial intelligence structure further comprising one or more adjustable variables associated with the one or more internal functions, the one or more adjustable variables adjusted by supervised learning according to a plurality of individually recorded inputs.\n | 20. 
The computer of claim 18, further comprising an algorithm, based at least in part on the artificial intelligence structure, the algorithm configured to take, as input, one or more images of further vehicles in traffic, and to provide, as output, a two-dimensional position of each of the further vehicles.", "output": "The method involves broadcasting (301) a planning message specifying a particular time. A first image depicting a second vehicle and a third vehicle is acquired (302) at the particular time. An imaging message comprising a second image is received from the second vehicle, where the second image depicts the first vehicle and the third vehicle. A coordinate listing comprising a position of the first vehicle, a position of the second vehicle, and a position of the third vehicle is determined according to the first image and the second image. The planning message and the imaging message are transmitted according to fifth-generation (5G) or sixth-generation (6G) technology. The imaging message indicates at least one of a vehicle type, a color, or a lane position of the second vehicle. The coordinate listing indicates at least one of a vehicle type, a color, or a lane position of the first vehicle. INDEPENDENT CLAIMS are included for: (1) non-transitory computer-readable media for performing short-range locating and wireless address identification of vehicles; and (2) a computer containing an artificial intelligence structure for performing short-range locating and wireless address identification of vehicles. Method for performing short-range locating and wireless address identification of vehicles, i.e. autonomous and semi-autonomous vehicles, in traffic by a computing device over a 5G or 6G network. Uses include but are not limited to a personal computer, a laptop computer, a notebook computer, a net book computer, a handheld computer, a personal digital assistant, a mobile phone, a smart phone and a tablet computer. 
The method enables determining the locations and the wireless addresses of other proximate vehicles in the traffic in an efficient manner. The drawing shows a flow diagram of a procedure for determining a traffic map derived by viewpoint fusion. 301Broadcasting a planning message specifying a particular time 302Acquiring a first image depicting the second vehicle and the third vehicle at the particular time 303Broadcasting an imaging message including wireless address after randomly-selected delay 304Receiving the imaging messages from the participating vehicles by the first vehicle 305Calculating locations of objects in a two-dimensional coordinate system by the first vehicle ", "instruction": "Please summarize the input" }, { "input": "System and method for vulnerable road user detection using wireless signalsA method for detecting vulnerable road users (VRUs) using wireless signals includes receiving, by a wireless receiver, wireless signals from mobile devices and determining received signal strength indication (RSSI) levels of the wireless signals. The wireless signals and the RSSI levels of the wireless signals received by the wireless receiver are analyzed so as to determine at least one location of the VRUs. A notification is issued to the vehicle or a driver of the vehicle based on the at least one determined location of the VRUs.What is claimed is: \n | 1. 
A method for detecting a vulnerable road user (VRU) using wireless signals, the method comprising: \nreceiving, by a wireless receiver, wireless signals from a mobile device at a plurality of time intervals and determining received signal strength indication (RSSI) levels of the wireless signals; \nanalyzing the wireless signals and the RSSI levels of the wireless signals received by the wireless receiver so as to determine a location of the VRU, wherein an estimation area for the VRU is determined at each of the time intervals and a calibrated estimation area comprising an overlap of the estimation areas is determined as a measurement of the location of the VRU; and \nissuing a notification to a vehicle or a driver of the vehicle based on the determined location of the VRU. \n | 2. The method according to claim 1, wherein the wireless receiver is disposed at a first static location, and wherein a second wireless receiver is disposed at a second static location, the first and second locations being known with respect to each other, and wherein wireless signals received by the second wireless receiver and associated RSSI levels are analyzed together with the wireless signals received at the first static location to determine the location of the VRU.\n | 3. The method according to claim 1, wherein the wireless receiver is attached to or embedded in the vehicle.\n | 4. The method according to claim 3, wherein a second wireless receiver is disposed at a static location, and wherein wireless signals received by the second wireless receiver and associated RSSI levels are analyzed together with the wireless signals received at the vehicle to determine the location of the VRU.\n | 5. The method according to claim 3, wherein the plurality of time intervals are less than one second apart for determining the estimation areas and the location of the VRU from the calibrated estimation area.\n | 6. 
The method according to claim 5, further comprising: \ncomparing a distance from the vehicle to the determined location of the VRU to an estimated stopping distance of the vehicle; \ndetermining a behavior of the VRU based on further wireless signals that are received by the wireless receiver at later time intervals; and \ndetermining whether the behavior of the VRU is expected at the determined location of the VRU, \nwherein the notification to the vehicle or the driver includes a description of the behavior where it is determined that the behavior is not expected for the VRU at the determined location of the VRU. \n | 7. The method according to claim 6, wherein the vehicle is an autonomous vehicle, the method further comprising issuing a control action for stopping the vehicle or diverting a path of the vehicle based on a determination that the behavior is not expected for the VRU at the determined location of the VRU.\n | 8. The method according to claim 6, further comprising storing the behavior and the determined location of the VRU in a database, wherein the determining whether the behavior of the VRU is expected at the determined location of the VRU is performed by checking the database.\n | 9. The method according to claim 3, wherein the time intervals are less than 0.5 seconds apart for determining the estimation areas and the location of the VRU from the calibrated estimation area.\n | 10. The method according to claim 3, wherein each of the estimation areas has a circular area comprising an estimated location at the center and a radius representing an expected error range, the estimated locations being based on the RSSI levels received at the respective time intervals.\n | 11. 
The method according to claim 1, further comprising receiving, by a wireless transceiver, wireless signals sent by the wireless transceiver and reflected back to the wireless transceiver from objects in the vicinity of the vehicle, wherein the wireless signals reflected back to the wireless transceiver are used to determine at least one location of at least one additional VRU which does not have a mobile device.\n | 12. The method according to claim 1, further comprising identifying the mobile device from the wireless signals received by the wireless receiver and determining that the VRU carries at least one additional mobile device based on the wireless signals from the VRU being received by the wireless receiver indicating a single entity carrying the mobile devices.\n | 13. The method according to claim 1, wherein the wireless receiver includes a plurality of antennas which change directions during the receiving of the wireless signals from the mobile device, the method further comprising using trilateration on the received wireless signals to determine the location of the VRU.\n | 14. The method according to claim 1, further comprising the vehicle self-enforcing a dynamic speed limit which was changed in the vehicle based on the VRU detection and broadcasting the changed speed limit to other vehicles in the vicinity using vehicle-to-vehicle communications.\n | 15. 
A system for detecting a vulnerable road user (VRU), the system being configured to communicate with a wireless receiver configured to receive wireless signals from mobile devices, the system comprising: \na processing server configured to analyze the wireless signals received at a plurality of time intervals from one of the mobile devices and received signal strength indication (RSSI) levels of the wireless signals received by the wireless receiver so as to determine a location of the VRU, wherein an estimation area for the VRU is determined at each of the time intervals and a calibrated estimation area comprising an overlap of the estimation areas is determined as a measurement of the location of the VRU; and \nan alert system configured to issue a notification to a vehicle or a driver of the vehicle based on the determined location of the VRU. \n | 16. The system according to claim 15, wherein the wireless receiver is attached to or embedded in the vehicle.\n | 17. The system according to claim 16, wherein the processing server is configured to analyze wireless signals received by a second wireless receiver disposed at a static location and associated RSSI levels together with the wireless signals received at the vehicle to determine the location of the VRU.", "output": "The method involves receiving wireless signals from mobile devices by a first wireless receiver (14). Received signal strength indication (RSSI) levels of the wireless signals are determined. The wireless signals and the RSSI levels of the wireless signals received by the first wireless receiver are analyzed to determine a location of vulnerable road users (VRUs). A notification is issued to a vehicle (12) or a driver of the vehicle based on the determined location of the VRUs. The first wireless receiver is arranged at a first static location and a second wireless receiver at a second static location. 
The wireless signals received by a second wireless receiver and associated RSSI levels are analyzed together with the wireless signals received at the first static location to determine the location of the VRUs. An INDEPENDENT CLAIM is also included for a system for detecting VRUs. Method for detecting VRUs, e.g. cyclists or pedestrians, around an autonomous or non-autonomous vehicle, i.e. a car, based on wireless signals. Can also be used for bus, lorry, motorbike and bicycle. The method enables reducing computational costs and constraints for associated hardware, facilitating faster and/or reliable detection of VRUs and choosing a short time interval between measurements to allow for quick and accurate location prediction by leveraging the high speed of the vehicle. The drawing shows a schematic view of a system for detecting VRUs based on wireless signals. 12Vehicle14Wireless receiver15Transceiver16Processing server22Cloud server", "instruction": "Please summarize the input" }, { "input": "Global navigation satellite system, navigation terminal, navigation method and programIn a satellite navigation system, a navigation terminal continuously receives navigation signals from navigation satellites and continuously performs navigation calculations, thereby obtaining navigation calculation results, and executes in parallel: using the clock offset values determined through the navigation calculations, calculates, in real time, the change in the most recent difference between time differences, each difference between time differences being the difference between a clock offset value and the immediately preceding clock offset value, as well as a standard deviation value, which is the standard deviation of the fluctuation amounts of the clock offset values; determines, in real time, two navigation precision indices of the calculated navigation calculation results on the basis of the calculated change in the difference between time differences and the standard deviation value; associates, in real time, the determined two navigation 
precision indices with the calculated navigation calculation results; and outputs, in real time, the navigation calculation results associated with the at least two navigation precision indices.What is claimed is: \n | 1. A positioning terminal, comprising a Global Navigation Satellite System (GNSS) receiver, \nwherein the GNSS receiver is configured to execute processing in parallel while continuously acquiring respective navigation signals from navigation satellites, each navigation satellite configured to broadcast a navigation signal for GNSS and continuously performing positioning computation in real time to obtain a positioning computation result, the processing including: \ni) calculating a standard deviation value of a clock offset value exhibited at a current epoch based on each clock offset value obtained by the positioning computation of each epoch, which is a value of a jitter amount of the each clock offset value; \nii) calculating a change amount of a most recent difference between time differences based on each value of difference between time differences of each epoch, which is the each value of difference between time differences that is a difference between each current clock offset value and a clock offset value immediately preceding the current clock offset value; \ndetermining two positioning accuracy indices of the current epoch based on respective values of the standard deviation value of the clock offset value exhibited at the current epoch and the change amount of the most recent difference between the time differences; and \nassociating the two positioning accuracy indices with the positioning computation result of the current epoch. \n | 2. 
The positioning terminal according to claim 1, wherein the GNSS receiver is configured to: \ncalculate, from clock offset values exhibited at respective epochs, the standard deviation value of difference between time differences at the current epoch and a predetermined number of past epochs; \ncalculate, from the clock offset values exhibited at the respective epochs, an average value of difference between time differences at a predetermined number of past epochs that do not include the clock offset value exhibited at the current epoch; and \nassociate, as the two positioning accuracy indices, i) the calculated standard deviation value and ii) a value of the difference between time differences at the current epoch and the average as the change amount of the positioning computation result at the current epoch in real time. \n | 3. The positioning terminal according to claim 1, \nwherein the GNSS receiver positioning module includes at least: \na broadcast wave signal processor configured to continuously acquire the respective navigation signals from the navigation satellites each configured to broadcast the navigation signal for GNSS; and \na processor configured to associate the determined two positioning accuracy indices with the positioning computation result. \n | 4. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to: \ncalculate, in a process of performing the positioning computation at each epoch, a value of a magnitude of jitter of the clock offset value exhibited at the current epoch with respect to clock offset values exhibited at respective epochs; \nfurther determine, based on the calculated value of the magnitude of the jitter of the clock offset value exhibited at the current epoch, iii) a positioning accuracy index of the positioning computation result at the current epoch; and \nassociate the positioning accuracy index with the positioning computation result at the current epoch. \n | 5. 
The positioning terminal according to claim 1, wherein the GNSS receiver is configured to calculate the standard deviation value and the change amount by excluding a clock offset value that fails to fall within a threshold value range of a jitter change amount from clock offset values exhibited at a predetermined number of last epochs when the two positioning accuracy indices for the current epoch is determined through calculation.\n | 6. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to process a unit of the two positioning accuracy indices, which are to be assigned to the positioning computation result, into a unit of a distance through use of the speed of light.\n | 7. The positioning terminal according to claim 1, wherein the GNSS receiver is configured to perform the positioning computation based on a Precise Point Positioning scheme, and assign the two positioning accuracy indices to the positioning computation result based on the Precise Point Positioning scheme.\n | 8. The positioning terminal according to claim 1, \nwherein the positioning terminal is mounted to a vehicle including a communication unit, and \nwherein the positioning terminal is configured to use the communication unit to notify a communication counterpart device of, together with positional information on the positioning terminal being the positioning computation result, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information. \n | 9. 
The positioning terminal according to claim 1, \nwherein the positioning terminal is mounted to a vehicle including a communication unit, and \nwherein the positioning terminal is configured to use the communication unit to receive, together with positional information on a communication counterpart device being the positioning computation result obtained by the communication counterpart device, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the communication counterpart device, and use the at least one of the two positioning accuracy indices as a discrimination index for automatic driving. \n | 10. The positioning terminal according to claim 1, \nwherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and \nwherein the positioning terminal is configured to notify another vehicle of, together with positional information on an own vehicle being the positioning computation result, the two positioning accuracy indices each being an index indicating a degree of reliability of the positional information via the vehicle-to-vehicle communication. \n | 11. The positioning terminal according to claim 1, \nwherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and \nwherein the positioning terminal is configured to receive, together with positional information on another vehicle being the positioning computation result of another vehicle, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the another vehicle via the vehicle-to-vehicle communication, and use the at least one of the two positioning accuracy indices as a discrimination index for automatic driving of an own vehicle. \n | 12. 
The positioning terminal according to claim 1, wherein the GNSS receiver is configured to: \ndetermine, in a process of performing the positioning computation at each epoch, whether to advance to processing for deriving a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor based on one or both of a magnitude of jitter in continuity of the clock offset value and a stability thereof; \nperform, when a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor is derived, re-positioning-computation based on a navigation signal group that has been transmitted from the navigation satellites excluding the one or plurality of navigation satellites being a jitter factor; and \nperform processing in parallel while obtaining the positioning computation result of the re-positioning-computation, the processing including: \ncalculating, from the navigation signal group for the re-positioning-computation, the standard deviation value; \ndetermining two positioning accuracy indices of the positioning computation result of re-positioning-computation based on the respective values of the standard deviation value and the change amount; and \nassociating the determined two positioning accuracy indices with the positioning computation result of the re-positioning-computation. \n | 13. 
A positioning method, performed by a positioning terminal of a global navigation satellite system, \nthe positioning method comprising executing, by the positioning terminal, processing in parallel while continuously acquiring respective navigation signals from navigation satellites, each navigation satellite configured to broadcast a navigation signal for GNSS and continuously performing positioning computation in real time to obtain a positioning computation result, the processing including: \ni) calculating a standard deviation value of a clock offset value exhibited at a current epoch based on each clock offset value obtained by the positioning computation of each epoch, which is a value of a jitter amount of the each clock offset value; \nii) calculating a change amount of a most recent difference between time differences based on each value of difference between time differences of each epoch, which is the each value of difference between time differences that is a difference between each current clock offset value and a clock offset value immediately preceding the current clock offset value; \ndetermining two positioning accuracy indices of the current epoch based on respective values of the standard deviation value of the clock offset value exhibited at the current epoch and the change amount of the most recent difference between the time differences; \nassociating the two positioning accuracy indices with the positioning computation result of the current epoch; and \noutputting in real time the positioning computation result associated with at least the two positioning accuracy indices. \n | 14. 
The positioning method according to claim 13, further comprising: \ndetermining, by the positioning terminal, in a process of performing the positioning computation at each epoch, whether to advance to processing for deriving a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor based on one or both of a magnitude of jitter in continuity of the clock offset value and a stability thereof; \nperforming, by the positioning terminal, when a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor is derived, re-positioning-computation based on a navigation signal group that has been transmitted from the navigation satellites excluding the one or plurality of navigation satellites being a jitter factor; and \nperforming, by the positioning terminal, processing in parallel while obtaining the positioning computation result of the re-positioning-computation, the processing including: \ncalculating, from the navigation signal group for the re-positioning-computation, the standard deviation value; \ndetermining two positioning accuracy indices of the positioning computation result of re-positioning-computation based on the respective values of the standard deviation value and the change amount; and \nassociating the determined two positioning accuracy indices with the positioning computation result of the re-positioning-computation. \n | 15. 
The positioning method according to claim 13, \nwherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and \nwherein the positioning method further comprises notifying, by the positioning terminal, another vehicle of, together with positional information on an own vehicle being the positioning computation result, the two positioning accuracy indices each being an index indicating a degree of reliability of the positional information. \n | 16. The positioning method according to claim 13, \nwherein the positioning terminal is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, and \nwherein the positioning method further comprises receiving, by the positioning terminal, together with positional information on another vehicle being the positioning computation result of another vehicle, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the another vehicle via the vehicle-to-vehicle communication, and using the at least one of the two positioning accuracy indices as a discrimination index for automatic driving of an own vehicle. \n | 17. 
A non-transitory computer-readable recording medium having a program recorded thereon, the program for positioning, for causing a processor of a positioning terminal to be operated to execute processing in parallel while continuously acquiring respective navigation signals from navigation satellites, each navigation satellite configured to broadcast a navigation signal for GNSS and continuously performing positioning computation in real time to obtain a positioning computation result, the processing including: \ni) calculating a standard deviation value of a clock offset value exhibited at a current epoch based on each clock offset value obtained by the positioning computation of each epoch, which is a value of a jitter amount of the each clock offset value; \nii) calculating a change amount of a most recent difference between time differences based on each value of difference between time differences of each epoch, which is the each value of difference between time differences that is a difference between each current clock offset value and a clock offset value immediately preceding the current clock offset value; \ndetermining two positioning accuracy indices of the current epoch based on respective values of the standard deviation value of the clock offset value exhibited at the current epoch and the change amount of the most recent difference between the time differences; \nassociating the two positioning accuracy indices with the positioning computation result of the current epoch; and \noutputting in real time the positioning computation result associated with at least the two positioning accuracy indices. \n | 18. 
The non-transitory computer-readable recording medium according to claim 17, wherein the program is configured to cause the processor of the positioning terminal to be operated to: \ndetermine, in a process of performing the positioning computation at each epoch, whether to advance to processing for deriving a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor based on one or both of a magnitude of jitter in continuity of the clock offset value and a stability thereof; \nperform, when a combination of navigation satellites excluding one or a plurality of navigation satellites being a jitter factor is derived, re-positioning-computation based on a navigation signal group that has been transmitted from the navigation satellites excluding the one or plurality of navigation satellites being a jitter factor; and \nperform processing in parallel while obtaining the positioning computation result of the re-positioning-computation, the processing including: \ncalculating, from the navigation signal group for the re-positioning-computation, the standard deviation value; \ndetermining two positioning accuracy indices of the positioning computation result of re-positioning-computation based on the respective values of the standard deviation value and the change amount; and \nassociating the determined two positioning accuracy indices with the positioning computation result of the re-positioning-computation. \n | 19. 
The non-transitory computer-readable recording medium according to claim 17, wherein the program is configured to cause the positioning terminal, which is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, to be operated to notify another vehicle of, together with positional information on an own vehicle being the positioning computation result, the two positioning accuracy indices each being an index indicating a degree of reliability of the positional information via the vehicle-to-vehicle communication.\n | 20. The non-transitory computer-readable recording medium according to claim 17, wherein the program is configured to cause the positioning terminal, which is mounted to an automobile or an autonomous driving vehicle capable of performing vehicle-to-vehicle communication, to be operated to receive, together with positional information on another vehicle being the positioning computation result of another vehicle, at least one of the two positioning accuracy indices each indicating a degree of reliability of the positional information from the another vehicle via the vehicle-to-vehicle communication, and use the at least one of the two positioning accuracy indices as a discrimination index for automatic driving of an own vehicle.", "output": "The system determines two positioning precision parameters of the calculated positioning calculation result in real time, respectively based on the variation of the calculated standard deviation value and a time difference. The positioning precision parameter is matched with the positioning calculation result which calculated two determined positioning precision parameters in real time. The positioning calculation result which matched the positioning precision parameters is output in real time. INDEPENDENT CLAIMS are included for the following:positioning terminal;positioning method; andpositioning program. 
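As a non-authoritative illustration only, the two jitter-based accuracy indices described in the claims above might be computed per epoch along these lines; the function name, the use of a population standard deviation, and the handling of the offset history are assumptions for this sketch, not details taken from the patent:

```python
import statistics

def jitter_indices(clock_offsets):
    """Sketch of the two positioning accuracy indices from the claims.

    `clock_offsets` is the per-epoch clock offset history, newest last.
    Window sizes, units, and the choice of population standard deviation
    are assumptions, not specified by the source text.
    """
    # Index 1: standard deviation of the clock offset values obtained by
    # the positioning computation of each epoch (the "jitter amount").
    std_index = statistics.pstdev(clock_offsets)

    # "Difference between time differences": first-order differences of
    # consecutive clock offset values.
    diffs = [b - a for a, b in zip(clock_offsets, clock_offsets[1:])]

    # Index 2: change amount of the most recent difference relative to
    # the one immediately preceding it.
    change_index = abs(diffs[-1] - diffs[-2])
    return std_index, change_index
```

Both indices would then be associated with the positioning computation result of the current epoch and output in real time alongside it.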
Satellite positioning system for vehicle, mobile telephone, global positioning system (GPS) apparatus, ship, farming machine, mining machinery and drone. The satellite positioning system which relate the probability of the positioning calculation result with a positioning calculation result in real time can be provided about the positioning calculation result based on a navigation signal. The drawing shows a block diagram of the satellite positioning system. (Drawing includes non-English language text) 10Positioning terminal20Navigation satellite", "instruction": "Please summarize the input" }, { "input": "Decision-making method of lane change for self-driving vehicles using reinforcement learning in a motorway environment, recording medium thereofThe present invention relates to a method for determining a lane change of an autonomous vehicle using reinforcement learning in an automobile-only road environment. Selecting an important vehicle, which is a nearby vehicle that has the greatest influence on a lane change determination of an autonomous vehicle, calculating a lane change probability of the important vehicle, adding the lane change probability to vehicle information, Performing pre-processing necessary for reinforcement learning through a pre-processing network on information obtained by adding the lane change probability to vehicle information, and performing reinforcement learning by adding autonomous vehicle information to information pre-processed in the pre-processing network, and outputting a change determination result. According to the present invention, there is an effect of ensuring real-time performance and flexibly coping with motion changes of other vehicles by using the lane change probability of the main vehicle.|1. 
In a lane change determination method of an autonomous vehicle using reinforcement learning in an automobile-only road environment, the method comprising: receiving vehicle information of surrounding vehicles through V2X communication (Vehicle to Everything communication); using the received vehicle information to select an important vehicle, which is a neighboring vehicle that has the greatest influence in determining a lane change of the autonomous vehicle;\ncalculating a lane change probability of the important vehicle; adding the lane change probability to vehicle information; performing preprocessing necessary for reinforcement learning through a preprocessing network on information obtained by adding the lane change probability to the vehicle information; performing reinforcement learning by adding autonomous vehicle information to information preprocessed in the preprocessing network, and outputting a lane change determination result; and performing a safety check on the determination result through the reinforcement learning and outputting a determination result confirmed to be safe, wherein the preprocessing network and the reinforcement learning are composed of a fully connected layer, wherein the Vehicle information received through V2X communication (Vehicle to Everything communication) (only,= 1, 2,..., n-1, n),= longitudinal relative distance between the ego vehicle and the i-th surrounding vehicle,= relative speed,= relative acceleration,= Relative Lane,= When the left roadway, the current roadway, and the right roadway exist, respectively,(Equation 1)(Equation 2), where R denotes a real number domain and N denotes an integer domain, and in the step of calculating the lane change probability of the important vehicle, the lane change probability of the important vehicle is calculated, but the time t (initial 0), the longitudinal position of the vehicle, the vehicle speed, the vehicle acceleration, the vehicle heading angle, the lateral error 
derivative, the effective distance to both sides of the lane, the relative distance to the vehicle in front, Relative speed, relative acceleration, relative distance to rear vehicle, relative speed, relative acceleration, left lane ahead, relative distance to rear vehicle, relative speed, relative acceleration, right lane front, relative distance to rear vehicle, relative speed, A lane change determination method for an autonomous vehicle, characterized in that feature information including relative acceleration is input to the LSTM network.\n | 2. delete\n | 3. delete\n | 4. A computer-readable recording medium in which a program capable of executing the method of claim 1 by a computer is recorded.", "output": "The method involves receiving (S101) vehicle information of surrounding vehicles through vehicle to everything (V2X) communication. The received vehicle information is used (S103) to select an important vehicle, which is a neighboring vehicle that has the greatest influence in determining a lane change of the autonomous vehicle. A lane change probability of the important vehicle is calculated (S105). The lane change probability is added (S107) to vehicle information. The pre-processing necessary for reinforcement learning is performed (S109) through a pre-processing network on the information obtained by adding the lane probability to the vehicle information. The reinforcement learning is performed (S111) by adding information about the autonomous driving vehicle to the information preprocessed in the preprocessing network, and a lane change determination result is outputted. An INDEPENDENT CLAIM is included for a computer-readable recording medium storing program for determining lane change of autonomous vehicle. Method for determining lane change of autonomous vehicle using reinforcement learning in vehicle-only road environment. 
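As a minimal, non-authoritative sketch of the selection and feature-augmentation steps (S103 to S107): the claims do not fix the criterion for choosing the "important vehicle", so the nearest-vehicle heuristic and all field names below are assumptions for illustration only.

```python
def select_important_vehicle(surrounding):
    """Pick the surrounding vehicle treated as most influential for the
    lane change decision. `surrounding` is a list of dicts built from
    received V2X messages; choosing the vehicle with the smallest
    longitudinal relative distance is an assumption of this sketch, not
    the patent's stated criterion."""
    return min(surrounding, key=lambda v: abs(v["rel_distance"]))

def build_state(surrounding, lane_change_prob):
    """Append the important vehicle's lane-change probability to its
    feature vector before feeding the pre-processing network."""
    v = select_important_vehicle(surrounding)
    # Feature ordering mirrors the claim's listing: relative distance,
    # relative speed, relative acceleration, relative lane, plus the
    # added lane-change probability.
    return [v["rel_distance"], v["rel_speed"], v["rel_accel"],
            v["rel_lane"], lane_change_prob]
```

The resulting vector would then pass through the fully connected pre-processing network before reinforcement learning outputs the lane change determination.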
The method enables determining the lane change of the autonomous vehicle using reinforcement learning in an automobile-only road environment so as to ensure real-time performance and respond flexibly to movement changes of other vehicles. The method exhibits better performance even in a road environment in which lane changes are performed, by adding direct characteristic information on lane changes. The drawing shows a flowchart of a method for determining a lane change of an autonomous vehicle using reinforcement learning in a vehicle-only road environment. (Drawing includes non-English language text) S101Step for receiving vehicle information of surrounding vehiclesS103Step for using received vehicle information to select important vehicleS105Step for calculating lane change probability of important vehicleS107Step for adding lane change probability to vehicle informationS109Step for performing pre-processing necessary for reinforcement learningS111Step for performing reinforcement learning by adding information about autonomous driving vehicle to information preprocessed in preprocessing network", "instruction": "Please summarize the input" }, { "input": "Audio-visual cooperative recognition for a vehiclePROBLEM TO BE SOLVED: To provide cooperative audio-visual inference means to accurately recognize emergency vehicles in diverse geographic locations. \n\nSOLUTION: A vehicle recognition system 100A includes a sound analysis circuit 110 to analyze captured sounds using an audio machine learning technique to identify a sound event. The system includes an image analysis circuit 107 to analyze captured images using an image machine learning technique to identify an image event, and a vehicle identification circuit 105 to identify a type of vehicle based on the image event and the sound event. 
The vehicle identification circuit 105 may further use V2V or V2I alerts to identify the type of vehicle and communicate a V2X or V2I alert message based on the vehicle type. In some aspects, the type of vehicle is further identified based on a light event associated with light signals detected by the vehicle recognition system. \n\nSELECTED DRAWING: Figure 1A|1. In a system for emergency vehicle recognition in a vehicle; the system analyzes audio data by using a machine learning technique; and is a voice detection circuit for determining a voice event; the audio data are generated by an sauce outside the vehicle. The image detection circuit is detected by a microphone array installed in the vehicle; a sound detection circuit; and an image detection circuit for analyzing image data using the machine learning technique to determine an image event; and the image data are provided. An image detection circuit and a classification circuit are acquired by a camera array installed in the vehicle; and the classification circuit generates an audio-image association; and the audio-image association is used for a plurality of time instances. The audio sample of the voice event is collated with the image frame of the image event; and the audio sample is based on the audio-image association; and an emergency vehicle recognition is performed; and a classification circuit and the classification circuit are provided. In the vehicle interface for transmitting a message to a vehicle control system; the message is based on the emergency vehicle recognition; and a system provided with a vehicle interface. provided.\n | 2. The image event is to detect the visual representation of the emergency vehicle in at least one of the image frames; and the voice event is to detect a voice associated with the emergency vehicle in at least one of the audio samples. System. Included in Claim 1\n | 3. 
In order to generate the audio-image associations, the classification circuit further uses a sampling rate of the audio samples to normalize the frame rate of the image frame; and for each time instance of the plurality of time instances; An audio sample (ASPIF) parameter per image frame is determined; The system. in the claim 1 or 2\n | 4. The audio-image association is a data structure; the classification circuit further relates to each image frame of the image frame; an identifier of a time instance of the plurality of time instances corresponding to the image frame; and an identifier of the image frame. An identifier of a subset of the audio sample corresponding to the image frame based on the ASPIF parameter; and a detection result associated with the image frame; and a detection result based on the image event, and; A detection result associated with each audio sample of a subset of the audio samples is detected; and a detection result based on the voice event is stored in the data structure; and a system. described in claim 3 is stored.\n | 5. The detection result associated with the image frame is the type of the emergency vehicle detected in the image frame; and the system. described in claim 4 is provided.\n | 6. The detection result associated with each audio sample of the subset of the audio samples is the type of the emergency vehicle detected based on the audio sample; and the system. described in claim 5 is provided.\n | 7. The classification circuit further applies a clustering function to the detection result associated with the subset of the audio samples, and generates a combined detection result associated with a subset of the audio samples, as well as: Data fusion between the detection result associated with the image frame and the combined detection result associated with the subset of the audio samples is executed to execute the emergency vehicle recognition, and the system. described in claim 6 is executed.\n | 8. 
A prediction of the type of the emergency vehicle detected during the emergency vehicle recognition is generated; and the message is generated for transmission to the vehicle control system; and the message includes the type of the emergency vehicle. Further provided with a prediction generating circuit configured to do so; the system. described in any one of the claims 1 to 7\n | 9. The vehicle control system executes a response action on the basis of the message; and the system. described in claim 8 is provided.\n | 10. The response action includes an autonomous vehicle steering based on the type of the emergency vehicle detected during the emergency vehicle recognition, and the system. described in claim 9\n | 11. The machine learning technique includes the artificial neural network; the system. described in any one of the claims 1 to 10\n | 12. The machine is a procedure for analyzing audio data by using a machine learning technique to determine a voice event; the audio data are sensed by a microphone array installed in a vehicle; and the image data are analyzed by using the machine learning technique. In the procedure for determining an image event, the image data is obtained by a camera array installed in the vehicle; a procedure and a procedure for generating an audio-image association; and the audio-image association is provided. For a plurality of time instances, the audio sample of the voice event is collated with the image frame of the image event; a procedure; and a procedure for performing emergency vehicle recognition based on the audio-image association. In the procedure for outputting a message to a vehicle control system of the vehicle, the message is based on the emergency vehicle recognition; and the vehicle control system executes a response action based on the message; and a program. for performing the procedure is performed.\n | 13. 
In addition to the above machine, the frame rate of the image frame is normalized by using the sampling rate of the audio sample, and for each time instance of the plurality of time instances; The program. described in Claim 12 for performing the procedure which determines the audio sample (ASPIF) parameter per image frame\n | 14. The audio-image association is a data structure; the program further includes: an identifier of a time instance of the plurality of time instances corresponding to the image frame for each image frame of the image frame; and a method for controlling the image frame. The identifier of the image frame; an identifier of a subset of the audio sample corresponding to the image frame based on the ASPIF parameter; and a detection result associated with the image frame; and a detection result based on the image event, and; In a detection result associated with each audio sample of a subset of the audio samples, a detection result based on the voice event is performed to execute a procedure for storing the detection result in the data structure, and the program. described in claim 13 is executed.\n | 15. The detection result associated with the image frame is of the type of the emergency vehicle detected in the image frame; and the detection result associated with each audio sample of the subset of the audio samples is provided. A type of the emergency vehicle detected based on the audio sample; the program further applies a clustering function to the detection result associated with a subset of the audio sample; and to provide a method for detecting the detection result of the audio sample. A procedure for generating a combined detection result associated with a subset of the audio sample, and a result of detection associated with the image frame and a combined detection result associated with a subset of the audio sample are performed to perform data fusion. The program. 
described in claim 14 for carrying out the procedure which performs the said emergency vehicle recognition\n | 16. In a means for analyzing audio data using a machine learning technique and determining a voice event, the audio data are sensed by a microphone array installed in a vehicle; and the image data are analyzed by using the machine learning technique. In the means for determining an image event, the image data are acquired by a camera array installed in the vehicle; a means; and an audio-image association; and the audio-image association is provided. For a plurality of time instances, the audio sample of the voice event is collated with the image frame of the image event; a means; and a means for performing emergency vehicle recognition based on the audio-image association. In a means for outputting a message to a vehicle control system, the message is based on the emergency vehicle recognition; and the vehicle control system performs a response action based on the message; and a device. provided with the means.\n | 17. The image event detects the visual representation of the emergency vehicle in at least one of the image frames; the voice event is at least one of the audio samples; and the voice associated with the emergency vehicle is detected; and the device. described in the claim 16 is detected.\n | 18. The means for generating the audio-image association uses a sampling rate of the audio sample; normalizes the frame rate of the image frame; and for each time instance of the plurality of time instances; The device. described in the claim 16 or 17 includes a means for determining the audio sample (ASPIF) parameter per image frame.\n | 19. The audio-image association is a data structure; the device further relates to each image frame of the image frame; the identifier of the time instance of the plurality of time instances corresponding to the image frame; and the identifier of the image frame. 
An identifier of a subset of the audio sample corresponding to the image frame based on the ASPIF parameter; and a detection result associated with the image frame; and a detection result based on the image event, and; A detection result associated with each audio sample of a subset of the audio samples is provided with a means for storing a detection result based on the voice event in the data structure; and the device. described in the claim 18 is provided.\n | 20. The detection result associated with the image frame is the type of the emergency vehicle detected in the image frame; and the device. described in claim 19 is provided.\n | 21. The detection result associated with each audio sample of the subset of the audio samples is of the type of the emergency vehicle detected based on the audio sample; and the device. described in claim 20 is provided.\n | 22. A clustering function is applied to the detection result associated with the subset of the audio samples; a means for generating a combined detection result associated with a subset of the audio samples; and the detection result associated with the image frame are disclosed. Data fusion with the combined detection result associated with the subset of the audio samples is carried out and further provided with a means for performing the emergency vehicle recognition, and the device. described in claim 21 is provided.\n | 23. A means for generating the prediction of the type of the emergency vehicle detected during the recognition of the emergency vehicle, and a means for generating the message for transmission to the vehicle control system, and the message further includes a step of including the type of the emergency vehicle; and a step for generating the message. Equipment. in any of the claims 16 through 22\n | 24. Non-temporary machine-readable media. 
which stored the program described in any one of the claim 12 to 15", "output": "The vehicle recognition system (100A) has a microphone arrangement (116) that is operatively mounted in a vehicle (104) to capture sounds outside of the vehicle. A sound analysis circuit (110) analyzes the captured sounds using an audio machine learning technique to identify a sound event. An image capture arrangement (115) is operatively mounted in the vehicle to capture images outside of the vehicle. An image analysis circuit analyzes the captured images using an image machine learning technique to identify an image event. A vehicle identification circuit identifies a type of vehicle based on the image event and the sound event. INDEPENDENT CLAIMS are included for the following: a method for vehicle recognition; anda non-transitory machine-readable medium storing program for vehicle recognition. Vehicle recognition system for identifying type of vehicle such as autonomous vehicle based on image event and sound event. The safety features are designed to avoid collisions and accidents by offering technologies that alert the driver to potential problems or to avoid collisions by implementing safeguards, and taking over control of the vehicle based on such safeguards. The association between the emergency vehicle image, and the emergency vehicle sound takes place to accurately recognize the emergency vehicle type based on the audio, light, and image data. The drawing shows the schematic drawing illustrating the system using the vehicle recognition platform to provide emergency vehicle detection based on sound data, light data, and image data.100AVehicle recognition system 104Vehicle 110Sound analysis circuit 115Image capture arrangement 116Microphone arrangement ", "instruction": "Please summarize the input" }, { "input": "AUTONOMOUS DRIVING SYSTEMThe present invention relates to an autonomous driving system. 
The autonomous driving system of the present invention includes a server that transmits environmental information including at least one of road conditions and traffic conditions; a V2X communication unit that receives the transmitted environmental information; a state information collection unit that collects driving state information of the vehicle; a display unit that displays at least one of a plurality of driving modes of the vehicle to occupants of the vehicle; and a control unit that determines a selectable driving mode among the plurality of driving modes based on at least one of the received environmental information and the collected driving state information, and that controls the display unit to display the selectable driving mode distinctly from other driving modes.|1. An autonomous driving system comprising: a server that transmits environmental information including at least one of road conditions and traffic conditions;\nA V2X communication unit that receives the transmitted environmental information; \na state information collection unit that collects driving state information of the vehicle; \na display unit that displays at least one of a plurality of driving modes of the vehicle to occupants of the vehicle; and a control unit that determines a selectable driving mode among the plurality of driving modes based on at least one of the received environmental information and the collected driving state information, and controls the display unit to display the selectable driving mode distinctly from other driving modes. \n | 2. The autonomous driving system of claim 1, wherein the plurality of driving modes include: an autonomous driving mode in which the vehicle drives on the road by itself;\nA cooperative driving mode in which the vehicle and at least one other vehicle drive while maintaining a predetermined distance apart; and a normal driving mode in which the driver of the vehicle directly drives the vehicle. \n | 3. 
The autonomous driving system of claim 2, wherein the driving state information includes at least one of speed information of the vehicle, acceleration information of the vehicle, and driving time information of the vehicle, and the control unit generates driver state information based on the driving state information and determines the selectable driving mode based on the generated driver state information.\n | 4. The autonomous driving system of claim 3, wherein the control unit generates the driver state information based on acceleration information of the vehicle and determines the selectable driving mode based on the generated driver state information and the received environmental information.\n | 5. The autonomous driving system of claim 4, wherein, when the cooperative driving mode is included in the selectable driving modes, at least one of the server and the control unit compares the generated driver state information with the other driver state information generated in the at least one other vehicle and recommends which vehicle should drive in the lead.\n | 6. The autonomous driving system of claim 5, wherein, while the vehicle and the at least one other vehicle are operating in the cooperative driving mode, at least one of the server and the control unit, when the driver state information generated in the leading vehicle among the vehicles operating in the cooperative driving mode indicates an abnormality, designates the vehicle in the second position among the operating vehicles as the emergency lead vehicle and releases the gap control between the leading vehicle and the vehicle in the second position.\n | 7. 
The autonomous driving system of claim 4, wherein the generated driver state information includes a degree of risk, and the process of determining the selectable driving mode by the control unit includes: generating an excess of the acceleration information with respect to a predetermined acceleration reference value;\ndetermining the risk through an operation including at least one of a cumulative operation and an average operation based on the generated excess; and, when the risk is outside a normal range, excluding at least one of the plurality of driving modes from the selectable driving modes. \n | 8. The autonomous driving system according to claim 7, wherein the predetermined acceleration reference value is generated based on the environmental information.\n | 9. The autonomous driving system of claim 8, wherein the environmental information includes at least one of speed limit information on the road and weather information.\n | 10. The autonomous driving system of claim 9, wherein the control unit adjusts an update cycle of the received environmental information based on the risk.\n | 11. A V2X communication unit that receives environmental information including at least one of road conditions and traffic conditions from an external server;\na state information collection unit that collects driving state information of the vehicle; and a control unit that determines which of the driving modes are selectable based on the received environmental information and the collected driving state information, and controls an external display device to display the selectable modes differently from the non-selectable modes.", "output": "The system (10) has a server (200) that transmits environmental information including road conditions and traffic conditions. A V2X communication unit (110) receives the transmitted environmental information. A state information collection unit (120) collects driving state information of the vehicle. 
A display unit (300) displays multiple driving modes of the vehicle to occupants of the vehicle. A control unit (150) determines a selectable driving mode among the multiple driving modes based on the received environmental information and the collected driving state information. The control unit controls the display unit to display the selectable driving mode distinctly from the other driving modes. Autonomous driving system for a commercial vehicle, i.e. a passenger car. The autonomous driving system comprises a display unit that displays multiple driving modes of a vehicle to occupants of the vehicle, where a server transmits environmental information including road conditions and traffic conditions, thus ensuring smooth cooperative driving between vehicles through data transmission and reception between vehicles. The drawing shows a block diagram of an autonomous driving system for a commercial vehicle. (Drawing includes non-English language text) 10 Autonomous driving system, 100 Driving terminal, 110 V2X communication unit, 120 Collection unit, 150 Control unit, 200 Server, 300 Display unit, 400 Generation device, 500 Steering unit", "instruction": "Please summarize the input" }, { "input": "HAILING A VEHICLETypically, an indication of the potential occupant's intention to use the autonomous vehicle is received via a user interface. In response to receiving this indication, a call request is transmitted by the signaling mode to the at least one autonomous vehicle capable of receiving the call request directly according to the signaling mode.|1. 
A stationary device comprising: at least one processor;\na screen; and at least one non-transitory storage medium storing instructions, wherein the instructions, when executed by the at least one processor, cause the at least one processor to: \ndisplay, on the screen, a user interface for hailing a vehicle; \nreceive, from an operator of the stationary device operating the user interface, an indication of a request for pick-up by a vehicle at a location proximate to the stationary device; \ntransmit the pickup request to a plurality of vehicles; \nreceive a response accepting the pickup request from a responding vehicle among the plurality of vehicles, wherein the response includes at least one message ensuring that the plurality of vehicles do not respond to the pickup request, and wherein the response is received after the responding vehicle has exchanged the at least one message with other vehicles of the plurality of vehicles; \nand cause the response to be displayed on the screen to an operator of the stationary device. \n | 2. The stationary device of claim 1, wherein the indication of the pickup request includes one or more indications of occupants, number of occupants, destination location, class of service, and time of arrival.\n | 3. The stationary device of claim 1, wherein the instructions causing the at least one processor to transmit the pickup request to a plurality of vehicles further cause the at least one processor to transmit the pickup request to a central system for transmission to the plurality of vehicles.\n | 4. The stationary device of claim 3, wherein the instructions cause the at least one processor to transmit the pickup request to a plurality of vehicles and cause the at least one processor to broadcast the pickup request directly to the plurality of vehicles.\n | 5. 
The stationary device of claim 4, wherein the instructions causing the at least one processor to broadcast the pickup request directly to the plurality of vehicles cause the at least one processor to use a vehicle-to-infrastructure (V2I) communication protocol to broadcast the pickup request.\n | 6. The stationary device of claim 1, wherein the stationary device is a kiosk.\n | 7. A method performed by a stationary device at a fixed location, comprising: displaying, on a screen of the stationary device, a user interface for hailing a vehicle;\nreceiving, from an operator of the stationary device operating the user interface, an indication of a request for pick-up by a vehicle at a location proximate to the stationary device; \ntransmitting the pickup request to a plurality of vehicles; \nreceiving a response accepting the pickup request from a responding vehicle among the plurality of vehicles, wherein the response includes at least one message ensuring that the plurality of vehicles do not respond to the pickup request, and wherein the response is received after the responding vehicle has exchanged the at least one message with other vehicles of the plurality of vehicles; and displaying, on the screen, the response to an operator of the stationary device. \n | 8. The method of claim 7, wherein the indication of the pickup request includes one or more indications of passengers, number of passengers, destination location, class of service, and time of arrival.\n | 9. The method of claim 7, wherein transmitting the pickup request to the plurality of vehicles includes broadcasting the pickup request to the plurality of vehicles.\n | 10. The method of claim 9, wherein broadcasting the pickup request to the plurality of vehicles includes broadcasting the pickup request directly to the plurality of vehicles using a vehicle-to-infrastructure (V2I) communication protocol.\n | 11. 
The method of claim 7, comprising transmitting the pickup request to a central system for transmission to the plurality of vehicles.\n | 12. The method of claim 7, wherein the stationary device is a kiosk.\n | 13. At least one non-transitory storage medium storing instructions, implemented in a stationary device residing at a fixed location, wherein the instructions, when executed by the at least one processor, cause the at least one processor to:\ndisplay, on a screen of the stationary device, a user interface for hailing a vehicle; \nreceive, from an operator of the stationary device operating the user interface, an indication of a request for pick-up by a vehicle at a location proximate to the stationary device; \ntransmit the pickup request to a plurality of vehicles; \nreceive a response accepting the pickup request from a responding vehicle among the plurality of vehicles, wherein the response includes at least one message ensuring that the plurality of vehicles do not respond to the pickup request, and wherein the response is received after the responding vehicle has exchanged the at least one message with other vehicles of the plurality of vehicles; \nand cause the response to be displayed on the screen to an operator of the stationary device. \n | 14. The at least one non-transitory storage medium of claim 13, wherein the indication of the pickup request includes one or more indications of passengers, number of passengers, destination location, class of service, and time of arrival.\n | 15. The at least one non-transitory storage medium of claim 13, wherein the instructions causing the at least one processor to transmit the pickup request to a plurality of vehicles further cause the at least one processor to transmit the pickup request to a central system for transmission to the plurality of vehicles.\n | 16. 
The at least one non-transitory storage medium of claim 13, wherein the instructions cause the at least one processor to transmit the pickup request to a plurality of vehicles and cause the at least one processor to broadcast the pickup request to the plurality of vehicles.\n | 17. The at least one non-transitory storage medium of claim 16, wherein the instructions causing the at least one processor to broadcast the pickup request to the plurality of vehicles cause the at least one processor to broadcast the pickup request directly to the plurality of vehicles using a vehicle-to-infrastructure (V2I) communication protocol.\n | 18. The at least one non-transitory storage medium of claim 13, wherein the stationary device is a kiosk.\n | 19. (deleted)\n | 20. (deleted)\n | 21. (deleted)", "output": "The method involves receiving a hailing request by a signaling mode from a receiving device of an autonomous vehicle. An intention indication of a potential rider corresponding to the hailing request is determined in the autonomous vehicle. A hailing request processing operation is performed in the autonomous vehicle. The hailing request is received by direct wireless communication from a mobile device, where the signaling mode comprises a visual mode and an audible mode, and the visual mode comprises a display of graphical elements and an image or light. An INDEPENDENT CLAIM is also included for an apparatus for hailing an autonomous vehicle. Method for hailing an autonomous vehicle (claimed), i.e. a taxicab or ride-sharing vehicle. The method enables modulating temporal properties, such as the display duration of an image, to encode additional information and reduce the incidence of false detections. The method enables the hailing process of the autonomous vehicle to be performed by displaying images or lights and emitting sounds using the hailing device. 
The drawing shows a schematic block diagram of an apparatus for hailing an autonomous vehicle.", "instruction": "Please summarize the input" }, { "input": "Hailing a vehicleIn general, an indication is received through a user interface of an intention of a potential rider to use an autonomous vehicle. In response to the receipt of the indication, a hailing request is sent by a signaling mode to at least one autonomous vehicle that can receive the hailing request directly in accordance with the signaling mode.The invention claimed is: \n | 1. A system comprising: \nat least one processor; and \nat least one non-transitory computer-readable media comprising instructions that, upon execution of the instructions by the at least one processor, are to cause the at least one processor to: \nreceive, by at least one sensor of a vehicle from an infrastructure access point, a wireless signal that indicates a request for transportation services, wherein the wireless signal that indicates the request for transportation services is based on a wireless signal received by the infrastructure access point directly from a hailing device of a user; \ndetermine, using the at least one processor of the vehicle, that the vehicle can safely stop to pick up the user; \ndetermine, using the at least one processor of the vehicle, to accept the request based, at least in part, on the determination that the vehicle can safely stop to pick up the user; \nselect, based on the wireless signal received from the infrastructure access point, a stopping place; and \ncause the vehicle to stop at the stopping place. \n | 2. The system of claim 1, wherein the wireless signal that indicates the request for transportation services is a vehicle-to-infrastructure (V2I) wireless signal.\n | 3. The system of claim 1, wherein the wireless signal received from the infrastructure access point is a vehicle-to-infrastructure (V2I) wireless signal.\n | 4. 
The system of claim 1, wherein the wireless signal that indicates the request for transportation services is generated by the infrastructure access point based on the wireless signal received from the hailing device without modification by a server that is communicatively coupled with the infrastructure access point.\n | 5. The system of claim 1, wherein the hailing device is a mobile device of a user.\n | 6. The system of claim 1, wherein the infrastructure access point is a WiFi access point.\n | 7. A method comprising: \ndetecting, by at least one sensor of a vehicle, a first wireless signal that indicates a request for transportation services received from an infrastructure access point, wherein the first wireless signal is based on a second wireless signal that was previously received by the infrastructure access point directly from a hailing device of a user; \ndetermining, using the at least one processor of the vehicle, that the vehicle can safely stop to pick up the user; \ndetermining, using the at least one processor of the vehicle, to accept the request based, at least in part, on the determination that the vehicle can safely stop to pick up the user; \nselecting, by at least one processor of the vehicle, a stopping place based on the wireless signal received from the infrastructure access point; and \ncausing, by the at least one processor, the vehicle to stop at the stopping place. \n | 8. The method of claim 7, wherein the wireless signal that indicates the request for transportation services is a vehicle-to-infrastructure (V2I) wireless signal.\n | 9. The method of claim 7, wherein the wireless signal received from the infrastructure access point is a vehicle-to-infrastructure (V2I) wireless signal.\n | 10. 
The method of claim 7, wherein the wireless signal that indicates the request for transportation services is generated by the infrastructure access point based on the wireless signal received from the hailing device without modification by a server that is communicatively coupled with the infrastructure access point.\n | 11. The method of claim 7, wherein the hailing device is a mobile device of a user.\n | 12. The method of claim 7, wherein the infrastructure access point is a WiFi access point.\n | 13. At least one non-transitory computer-readable media comprising instructions that, upon execution of the instructions by one or more processors of a vehicle, are to cause the vehicle to: \ndetect, by at least one sensor of a vehicle, a first wireless signal that indicates a request for transportation services received from an infrastructure access point, wherein the first wireless signal is based on a second wireless signal that was previously received by the infrastructure access point directly from a hailing device of a user; \ndetermine, using the at least one processor of the vehicle, that the vehicle can safely stop to pick up the user; \ndetermine, using the at least one processor of the vehicle, to accept the request based, at least in part, on the determination that the vehicle can safely stop to pick up the user; \nselect, by at least one processor of the vehicle, a stopping place based on the wireless signal received from the infrastructure access point; and \ncause, by the at least one processor, the vehicle to stop at the stopping place. \n | 14. The at least one non-transitory computer-readable media of claim 13, wherein the wireless signal that indicates the request for transportation services is a vehicle-to-infrastructure (V2I) wireless signal.\n | 15. 
The at least one non-transitory computer-readable media of claim 13, wherein the wireless signal received from the infrastructure access point is a vehicle-to-infrastructure (V2I) wireless signal.\n | 16. The at least one non-transitory computer-readable media of claim 13, wherein the wireless signal that indicates the request for transportation services is generated by the infrastructure access point based on the wireless signal received from the hailing device without modification by a server that is communicatively coupled with the infrastructure access point.\n | 17. The at least one non-transitory computer-readable media of claim 13, wherein the hailing device is a mobile device of a user.\n | 18. The at least one non-transitory computer-readable media of claim 13, wherein the infrastructure access point is a WiFi access point.\n | 19. The system of claim 1, wherein the instructions when executed are to cause the at least one processor to transmit a message indicating that the vehicle has accepted the request for transportation services to at least one other vehicle using vehicle-to-vehicle (V2V) communication.\n | 20. The method of claim 7, further comprising transmitting a message indicating that the vehicle has accepted the request for transportation services to at least one other vehicle using vehicle-to-vehicle (V2V) communication.\n | 21. The at least one non-transitory computer-readable media of claim 13, wherein the instructions when executed are to cause the at least one processor to transmit a message indicating that the vehicle has accepted the request for transportation services to at least one other vehicle using vehicle-to-vehicle (V2V) communication.", "output": "The system has a processor which is configured to receive, by a sensor of a vehicle upon execution of the instructions by the processor, a wireless signal from an infrastructure access point that indicates a request for transportation services. 
The wireless signal that indicates the request for transportation services is based on a wireless signal received by the infrastructure access point directly from a hailing device (72) of a user. The processor is configured to select a stopping place based on the wireless signal received from the infrastructure access point. The processor is configured to cause the vehicle to stop at the stopping place. INDEPENDENT CLAIMS are included for the following: a method for hailing a vehicle; and a non-transitory computer-readable medium storing a program for hailing a vehicle. System for hailing a vehicle such as an autonomous vehicle. The temporal properties, such as the display duration of each image, are modulated to encode additional information or reduce the incidence of false detections. The characteristics of the emitted light are uncommon, to minimize the chance that emitted light having similar or identical characteristics is erroneously detected by sensors on an autonomous vehicle as a hailing request, thus resulting in a false detection. The appearance of a gesture or gestures is uncommon, to minimize the chance that a similar or identical gesture that is not being performed for the purpose of hailing an autonomous vehicle is detected by sensors on an autonomous vehicle, thus resulting in a false detection. The temporal properties, such as the display duration of each gesture, are modulated to encode additional information or reduce the incidence of false detections. The user interface enables the potential rider to indicate a destination location, a number of riders, a class of service, a time to arrive, and a variety of other pieces of information relevant to the hailing. 
The drawing shows a block diagram illustrating the system for hailing a vehicle. 70 Gesture, 72 Hailing device", "instruction": "Please summarize the input" }, { "input": "WAVELENGTH BASED V2X ANTENNAProvided are wavelength based V2X antennas, and related antenna systems and methods, which can include a first antenna having a first wavelength and a second antenna having a second wavelength. Some antenna systems control a wavelength of a signal for transmission using one of the first antenna and the second antenna.WHAT IS CLAIMED IS: \n | 1. An antenna system comprising: a Vehicle to Everything (V2X) antenna comprising: a first antenna having a first wavelength; and a second antenna having a second wavelength, the second wavelength being different than the first wavelength; wherein the V2X antenna is configured to control a wavelength of a signal for transmission using one of the first antenna and the second antenna.\n | 2. The antenna system of claim 1, wherein the V2X antenna further comprises a switch communicatively coupled to the first antenna and to the second antenna, and wherein the V2X antenna is configured to use the switch for controlling the wavelength of the signal for transmission.\n | 3. The antenna system of claim 2, wherein when the switch is in a first position, the V2X antenna transmits the signal in the first wavelength using the first antenna, and wherein when the switch is in a second position, the V2X antenna transmits the signal in the second wavelength using the second antenna.\n | 4. The antenna system of any one of the previous claims, wherein the V2X antenna is configured to transmit the signal using one of the first antenna and the second antenna.\n | 5. 
The antenna system of any one of the previous claims, wherein the first antenna has a first electrical length associated with the first wavelength, wherein the second antenna has a second electrical length associated with the second wavelength, and wherein the second electrical length is different from the first electrical length.\n | 6. The antenna system of any one of the previous claims, wherein the V2X antenna is configured to control the wavelength of the signal for transmission by: receiving a control signal indicative of which of the first antenna or the second antenna to transmit from; and transmitting the signal using one of the first antenna and the second antenna based on the control signal.\n | 7. The antenna system of any one of the previous claims, wherein: the V2X antenna is configured to transmit the signal from the first antenna as a V2V signal; and the V2X antenna is configured to transmit the signal from the second antenna as a V2I signal.\n | 8. The antenna system of any one of the previous claims, wherein the V2X antenna further comprises: a compensator configured to compensate for gain degradation of the signal.\n | 9. The antenna system of any one of the previous claims, wherein the V2X antenna further comprises: a matcher configured to control a center of the wavelength.\n | 10. The antenna system of any one of the previous claims, wherein the V2X antenna does not include a beamformer.\n | 11. The antenna system of any one of the previous claims, wherein: the antenna system is configured to determine a signal quality parameter indicative of a quality of a received signal; and the V2X antenna is configured to transmit the signal using one of the first antenna and the second antenna based on the signal quality parameter.\n | 12. 
A method, performed by a Vehicle to Everything, V2X, antenna comprising a first antenna having a first wavelength and a second antenna having a second wavelength different than the first wavelength, wherein the method comprises: controlling a wavelength of a signal for transmission using one of the first antenna and the second antenna.\n | 13. The method of claim 12, wherein the V2X antenna further comprises a switch communicatively coupled to the first antenna and to the second antenna, and wherein controlling the wavelength of the signal for transmission comprises switching the switch.\n | 14. The method of claim 13, the method further comprising: when the switch is in a first position, transmitting the signal in the first wavelength using the first antenna; and when the switch is in a second position, transmitting the signal in the second wavelength using the second antenna.\n | 15. The method of any one of claims 12-14, the method further comprising: receiving a control signal indicative of which of the first antenna or the second antenna to transmit from; and transmitting the signal using one of the first antenna and the second antenna based on the control signal.\n | 16. The method of any one of claims 12-15, wherein controlling the wavelength of the signal for transmission comprises: transmitting the signal from the first antenna as a V2V signal; and transmitting the signal from the second antenna as a V2I signal.\n | 17. The method of any one of claims 12-16, the method further comprising: controlling, using a matcher, a center of the wavelength.\n | 18. The method of any one of claims 12-17, wherein the method does not comprise beamforming.\n | 19. The method of any one of claims 12-18, the method further comprising: determining a signal quality parameter indicative of a quality of a received signal; and transmitting the signal using one of the first antenna and the second antenna based on the signal quality parameter.\n | 20. 
An autonomous vehicle comprising: a Vehicle to Everything, V2X, antenna comprising: a first antenna having a first wavelength; and a second antenna having a second wavelength, the second wavelength being different than the first wavelength; wherein the V2X antenna is configured to control a wavelength of a signal for transmission using one of the first antenna and the second antenna.", "output": "The system has a Vehicle to Everything (V2X) antenna comprising a first antenna having a first wavelength and a second antenna having a second wavelength, where the second wavelength is different than the first wavelength. The V2X antenna is configured to control a wavelength of a signal for transmission using one of the first antenna and the second antenna. The V2X antenna further comprises a switch communicatively coupled to the first antenna and to the second antenna, and the V2X antenna is configured to use the switch for controlling the wavelength of the signal for transmission. When the switch is in a first position, the V2X antenna transmits the signal in the first wavelength using the first antenna, and when the switch is in a second position, the V2X antenna transmits the signal in the second wavelength using the second antenna. The V2X antenna is configured to transmit the signal using one of the first antenna and the second antenna. INDEPENDENT CLAIMS are included for the following: a method performed by a Vehicle to Everything (V2X) antenna; and an autonomous vehicle. Antenna system for use in a vehicle, e.g. an autonomous vehicle (claimed) such as a car or bus. The vehicle-to-everything communication system provides for robust and reliable performance of an autonomous vehicle. The system provides performance similar to conventional beamforming but using a simple and efficient structure in the vehicle. 
The wavelength based V2X antenna is configured to optimize beam patterns (e.g., radiation patterns) for the different types of V2X communication, allowing for optimized and/or improved communication. The system provides improved connectivity performance while avoiding the use of complex beamforming schemes. The system improves signal quality by avoiding signal mismatch, while reducing wasted power consumption. The respective radiation patterns, due to changes in wavelength, can be different depending on the type of V2X communication, allowing flexibility for optimizing communication. The drawing shows a diagram of an example implementation of a wavelength based V2X antenna. 702 V2X system, 704 RF signal, 706 Compensator, 708 Matcher, 712 Control signal", "instruction": "Please summarize the input" }, { "input": "DYNAMIC ANTENNA SYSTEMProvided are dynamic antenna systems, such as for an autonomous vehicle, which can include at least one modem and at least one antenna operatively connected with the at least one modem. Some antenna systems described are also configured to determine, using at least one processor, a performance parameter indicative of a performance of a communication between the at least one antenna and a network node, and control, using the at least one processor, based on the performance parameter, one or more of a position of the at least one antenna, and a connection of the at least one antenna to the at least one modem.WHAT IS CLAIMED IS: \n | 1. 
An antenna system for an autonomous vehicle, the antenna system comprising: at least one modem; and at least one antenna operatively connected with the at least one modem; wherein the antenna system is configured to: determine, using at least one processor, a performance parameter indicative of a performance of a communication between the at least one antenna and a network node; and control, using the at least one processor, based on the performance parameter, one or more of: a position of the at least one antenna, and a connection of the at least one antenna to the at least one modem.\n | 2. The antenna system of claim 1 , wherein the antenna system is configured to control the position of the at least one antenna relative to a position of the at least one modem.\n | 3. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, a position of the at least one antenna by controlling one or more of: an orientation of the at least one antenna, a phase of the at least one antenna, an angle of the at least one antenna, and a pose of the at least one antenna.\n | 4. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, the position of the at least one antenna by controlling a rotation of the at least one antenna.\n | 5. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, the connection of the at least one antenna to the at least one modem. \n | 6. The antenna system of any one of the preceding claims, wherein the antenna system is configured to control the connection of the at least one antenna to the at least one modem using a switch coupled to the at least one antenna and the at least one modem.\n | 7. 
The antenna system of any one of the preceding claims, wherein the antenna system is configured to control, using the at least one processor, based on the performance parameter, the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem by: determining if the performance parameter satisfies a criterion; and in response to determining that the performance parameter satisfies the criterion, controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem.\n | 8. The antenna system of claim 7, wherein the performance parameter satisfies the criterion in response to the performance parameter being below a performance threshold.\n | 9. The antenna system of any one of the preceding claims, wherein the at least one antenna is one or more of a cellular antenna and a V2X antenna.\n | 10. The antenna system of any one of the preceding claims, wherein the antenna system comprises a plurality of antennas associated with at least two different carriers, wherein each carrier of the at least two different carriers operates on a different frequency band.\n | 11. The antenna system of any one of the preceding claims, wherein the antenna system comprises a plurality of modems. \n | 12. The antenna system of any one of the preceding claims, wherein the antenna system comprises a plurality of modems and a plurality of antennas, wherein each of the plurality of antennas is connected to each of the plurality of modems via a switch.\n | 13. 
An autonomous vehicle comprising: at least one modem; and at least one antenna operatively connected with the at least one modem; wherein the autonomous vehicle is configured to: determine, using at least one processor, a performance parameter indicative of a performance of a communication between the at least one antenna and a network node; and control, using the at least one processor, based on the performance parameter, one or more of: a position of the at least one antenna, and a connection of the at least one antenna to the at least one modem.\n | 14. A method comprising: determining, by at least one processor, a performance parameter indicative of a performance of a communication between at least one antenna and a network node; and controlling, using the at least one processor, based on the performance parameter, one or more of: a position of the at least one antenna, and a connection of the at least one antenna to at least one modem.\n | 15. The method of claim 14, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: controlling the position of the at least one antenna relative to a position of the at least one modem. \n | 16. The method of any one of claims 14-15, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: controlling one or more of: an orientation of the at least one antenna, a phase of the at least one antenna, an angle of the at least one antenna, and a pose of the at least one antenna.\n | 17. The method of any one of claims 14-16, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: rotating the at least one antenna.\n | 18. 
The method of any one of claims 14-17, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: controlling the connection of the at least one antenna to the at least one modem.\n | 19. The method of any one of claims 14-18, wherein controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem comprises: determining if the performance parameter satisfies a criterion; and in response to determining that the performance parameter satisfies the criterion, controlling the position of the at least one antenna and/or the connection of the at least one antenna to the at least one modem.\n | 20. The method of claim 19, wherein determining if the performance parameter satisfies a criterion comprises: determining if the performance parameter is below a performance threshold; and in response to determining that the performance parameter is below the performance threshold, determining that the performance parameter satisfies the criterion.", "output": "The system (600) has an antenna (602) operatively connected with a modem (612A). A processor determines a performance parameter indicative of performance of communication between the antenna and a network node. The processor controls the position of the antenna and the connection of the antenna to the modem based on the performance parameter, where the system controls the position of the antenna relative to the position of the modem by controlling the orientation of the antenna, the phase of the antenna, and the angle of the antenna. A switch (614A1) is coupled to the antenna and the modem. Multiple antennas are associated with two different carriers. Each carrier operates on a different frequency band. Multiple modems and multiple antennas are provided. INDEPENDENT CLAIMS are also included for: an autonomous vehicle; and a method for operating an antenna system in an autonomous vehicle. 
Antenna system for an autonomous vehicle (claimed). The system is robustly implemented in the vehicle, and enables the vehicle to operate efficiently in an environment in which a large amount of wireless connectivity is required. The antenna system provides 360-degree coverage around the vehicle in a cost-effective manner. The drawing shows a schematic view of a dynamic antenna system. 600 Antenna system; 602A, 602B Antennas; 610 Electronic control unit; 612A, 612B Modems; 614A1-614A4, 614B1-614B4 Switches", "instruction": "Please summarize the input" }, { "input": "Trajectory planning of vehicles using route informationAbstract Title: Trajectory planning of vehicles using route information \n\nAn autonomous vehicle receives information from vehicle sensors indicating the presence of an object (e.g. bus, delivery vehicle, bicycle) nearby. The object is identified (e.g. a bus with a particular model number) and the expected route of the object is retrieved (e.g. from a database, vehicle-to-vehicle communications or a mobile device associated with the object). The current trajectory is estimated using the on-board sensors and is compared with the expected route of the object, including approximating where the object may be in 5 seconds. The information is used to plan the trajectory of the autonomous vehicle (e.g. to avoid collisions). The expected route may be a one-time or temporary route (e.g. recreational drivers, delivery vehicles, construction vehicles) or a re-occurring route (e.g. postal services, busses). A confidence level may be assigned based on an accuracy and/or reliability of the expected route of the object. WHAT IS CLAIMED IS: A method comprising: receiving, by at least one processor, information indicating a presence of an object operating in an environment; determining, by the at least one processor, a
trajectory of the object, the trajectory including at least a position, a speed, and a direction of travel of the object; determining, by the at least one processor, an expected route of the object, wherein the expected route is pre-planned and includes an expected future position of the object at a future time; comparing the trajectory of the object to the expected route of the object; and in accordance with the comparison that the trajectory of the object is consistent with the expected route of the object, updating the trajectory of the object based on the expected route of the object. \n\n2. The method of claim 1, wherein determining the expected route of the object includes receiving route information from a server. \n\n3. The method of any of the preceding claims, wherein the determination of the expected route is based on received route information from a transceiver or a mobile device associated with the object. \n\n4. The method of any of the preceding claims, wherein the future time is at least 5 seconds in the future. \n\n5. The method of any of the preceding claims, wherein comparing the trajectory of the object to the expected route of the object includes determining that the position of the object is an expected position along the expected route. \n\n6. The method of any of the preceding claims, wherein comparing the trajectory of the object to the expected route of the object includes determining that a velocity of the object is an expected velocity along the expected route. \n\n7. The method of any of the preceding claims, wherein the received information is from at least one sensor of a host vehicle. \n\n8. The method of any of the preceding claims, wherein the at least one processor is part of a remote server. \n\n9. The method of any of the preceding claims, wherein the received information is from a transceiver or mobile device associated with the object. \n\n10. 
The method of any of the preceding claims, further comprising: determining if the received information is sufficient to determine an expected route of the object; and in accordance with a determination that the received data is not sufficient to determine the expected route of the object, receiving additional information of at least one state of the object. \n\n11. The method of claim 10, wherein the received additional information is from at least one sensor of a host vehicle. \n\n12. The method of claim 10 or claim 11, further comprising, in accordance with the determination that the received data is not sufficient to determine the expected route of the object, transmitting the received additional information to a machine learning module for object classification. \n\n13. The method of any of the preceding claims, further comprising, in accordance with the comparison that the trajectory of the object is consistent with the expected route of the object, determining an uncertainty of the updated trajectory. \n\n14. The method of any of the preceding claims, further comprising determining a reliability of the expected route. \n\n15. The method of any of the preceding claims, further comprising transmitting the updated trajectory information of the object. \n\n16. A non-transitory computer-readable storage medium comprising at least one program for execution by at least one processor of a first device, the at least one program including instructions which, when executed by the at least one processor, cause the first device to perform the method of any of the preceding claims. \n\n17. 
A vehicle comprising: at least one sensor configured to capture information of an object; at least one transceiver configured to transmit and receive route information of the object; and at least one processor communicatively coupled to the at least one sensor and the at least one transceiver and configured to execute computer-executable instructions, the execution carrying out the method of any of claims 1-15.", "output": "The method (800) involves receiving (802), by a processor, information indicating a presence of an object operating in an environment. A trajectory of the object is determined (804), including a position, a speed, and a direction of travel of the object. An expected route of the object is determined (806) by the processor; the expected route is pre-planned and includes an expected future position of the object at a future time. The trajectory of the object is compared (808) to the expected route of the object. The trajectory of the object is updated (810) based on the expected route of the object, in accordance with the comparison that the trajectory of the object is consistent with the expected route of the object. An INDEPENDENT CLAIM is included for a non-transitory computer-readable storage medium storing a program for trajectory planning of vehicles using route information. Method for trajectory planning of a vehicle (claimed), e.g. autonomous vehicle (AV), car, drone, shuttle, train, 4-wheel-drive pickup truck, sport utility vehicle (SUV) and bus, using route information. The objects in the environment are safer, since the autonomous vehicle avoids interfering with their expected routes. The computing devices located on the AV algorithmically generate control actions based on both real-time sensor data and prior information, thus allowing the AV system to execute autonomous driving capabilities. 
The cloud includes cloud data centers along with the network and networking resources, thus facilitating the computing systems' access to cloud computing services. The drawing shows a flow diagram illustrating the method for trajectory planning of vehicles using route information. 800 Method for trajectory planning of vehicles using route information; 802 Step for receiving information indicating a presence of an object operating in an environment; 804 Step for determining a trajectory of the object; 806 Step for determining an expected route of the object; 808 Step for comparing the trajectory of the object to the expected route of the object; 810 Step for updating the trajectory of the object based on the expected route of the object", "instruction": "Please summarize the input" }, { "input": "Road surface condition guided decision making and predictionAbstract Title: ROAD SURFACE CONDITION GUIDED DECISION MAKING AND PREDICTION \n\nA surface detection system for an autonomous vehicle comprising at least one sensor, a computer-readable medium and at least one processor. The system receives data from the sensor associated with a surface along which the vehicle is travelling (1402). A surface classifier is used to determine a classification of the surface based on the sensor data (1404). Based on this data, the drivability of the surface is determined (1406) and, based on the drivability, the behaviour of the vehicle on the surface is planned (1408). The vehicle is controlled over the surface (1410). The system may also receive data from a network outside of the vehicle and from other vehicles (V2V). 
|\nWHAT IS CLAIMED IS: A system, comprising: at least one sensor; at least one computer-readable medium storing computer-executable instructions; at least one processor configured to communicate with the at least one sensor and to execute the computer-executable instructions, the execution carrying out operations including: receiving, from the at least one sensor, sensor data associated with a surface along a path to be traveled by a vehicle; using a surface classifier to determine a classification of the surface based on the sensor data; determining, based on the classification of the surface, drivability properties of the surface; planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface; and controlling the vehicle based on the planned behavior. \n | \n\n2. The system of claim 1, wherein determining, based on the surface classification, drivability properties of the surface comprises: generating a surface map that includes at least one of: a list of geometric descriptions of the surface or a distribution of the drivability properties on the path of the vehicle. \n | \n\n3. The system of claim 1 or claim 2, wherein the surface classification includes a known surface, and wherein determining, based on the surface classification, drivability properties of the surface comprises: obtaining, from a database, the drivability properties associated with the known surface. \n | \n\n4. The system of any preceding claim, wherein the surface classification is an unknown surface, and wherein determining, based on the surface classification, the drivability properties of the surface comprises: determining, from a database, sensor measurements included in a label of the unknown surface, wherein the sensor measurements are historical sensor measurements associated with the unknown surface; and determining the drivability properties of the unknown surface based on the sensor measurements. \n | \n\n5. 
The system of claim 4, wherein the historical sensor measurements are measured by the vehicle or received from another vehicle. \n | \n\n6. The system of any preceding claim, wherein planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface comprises: determining, based on the drivability properties, a vehicle motion that is associated with a safety or performance value that is greater than a current safety or performance value associated with a current vehicle motion. \n | \n\n7. The system of any preceding claim, wherein the surface is a first surface, and wherein planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface comprises: determining a historical vehicle motion performed on a second surface that has properties similar to the drivability properties of the first surface. \n | \n\n8. The system of any preceding claim, wherein the vehicle is a first vehicle, and wherein planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface comprises: detecting a second vehicle in proximity of the first vehicle; determining, based on the drivability properties of the surface, an expected motion of the second vehicle; and determining, based on the expected motion of the second vehicle, the behavior of the first vehicle. \n | \n\n9. The system of any preceding claim, wherein the surface classifier receives, from the at least one sensor, sensor measurements performed when the vehicle drives over the surface. \n | \n\n10. The system of any preceding claim, wherein the surface classification is a known surface classification, and wherein the operations further comprise: updating, based on the sensor measurements, a classifier associated with the surface classification. \n | \n\n11. 
The system of any of claims 1 to 9, wherein the surface classification is an unknown surface, and wherein the operations further comprise: adding the sensor measurements to a label associated with the unknown surface. \n | \n\n12. The system of any preceding claim, the operations further comprising: receiving, from a shared dynamic database, at least one of road surface classification information or known surface property information. \n | \n\n13. The system of any preceding claim, wherein the vehicle is a first vehicle, and wherein the operations further comprise: capturing, using the at least one sensor, a motion of a second vehicle that is driving on the surface. \n\n14. The system of any preceding claim, the operations further comprising: sending to a shared dynamic database at least one of: surface property feedback or vehicle motion feedback when the vehicle drives on the surface. \n\n15. A method comprising: receiving, from at least one sensor of a vehicle, sensor data associated with a surface along a path to be traveled by a vehicle; using a surface classifier to determine a classification of the surface based on the sensor data; determining, based on the classification of the surface, drivability properties of the surface; planning, based on the drivability properties of the surface, a behavior of the vehicle when driving near the surface or on the surface; and controlling the vehicle based on the planned behavior. \n\n16. A non-transitory computer-readable storage medium comprising at least one program for execution by at least one processor of a first device, the at least one program including instructions which, when executed by the at least one processor, cause the first device to perform the method of claim 15.", "output": "The system (120) has at least one sensor (121) and a computer-readable medium storing computer-executable instructions. A processor is configured to communicate with the at least one sensor and execute the computer-executable instructions. 
The executed operations include receiving, from the sensor, sensor data associated with a surface along a path for a vehicle (100) to travel. A surface classifier is used to determine a classification of the surface based on the sensor data. The drivability characteristics of the surface are determined based on the classification of the surface. A behavior of the vehicle when driving near the surface or on the surface is planned based on the drivability characteristics of the surface. The vehicle is controlled based on the planned behavior. INDEPENDENT CLAIMS are included for the following: a method for decision making and prediction with control based on road surface condition; and a non-transitory computer-readable storage medium storing a program for executing a method for decision making and prediction with control based on road surface condition. Autonomous vehicle system for decision making and prediction with control based on road surface condition. The vehicle's behavior is adjusted based on dynamically changing road surfaces and conditions that affect safety and drivability. The vehicle can predict the behavior of other vehicles driving on the surface and can proactively adjust its behavior accordingly, based on the drivability characteristics of the surface. The system improves vehicle safety and reliability, particularly when driving in hazardous environments, reduces the chances of collisions, and helps automated driving systems achieve better safety. The system ensures better decision-making, obeying traffic rules and predicting future events better than humans, and reliably controls a vehicle better than a human can. The movement planner determines vehicle behavior that causes the vehicle to travel in the lane of other vehicles while it is snowing. The drawing shows a schematic view of an autonomous vehicle with autonomous capability. 
(Drawing includes non-English language text) 100 Vehicle; 120 AV system; 121 Sensor; 122 Stereo video camera; 132 Computer peripheral", "instruction": "Please summarize the input" }, { "input": "Hailing a vehicleIn general, an indication is received through a user interface of an intention of a potential rider to use an autonomous vehicle. In response to the receipt of the indication, a hailing request is sent by a signaling mode to at least one autonomous vehicle that can receive the hailing request directly in accordance with the signaling mode.The invention claimed is: \n | 1. A stationary apparatus comprising: \nat least one processor; \na screen; and \nat least one non-transitory storage medium storing instructions that, when executed by the at least one processor, cause the at least one processor to: \ndisplay, on the screen, a user interface for hailing a vehicle; \nreceive, from an operator of the stationary apparatus operating the user interface, an indication for a request for a pick-up by a vehicle at a location proximate to the stationary apparatus; \ntransmit the request for the pick-up to a plurality of vehicles; \nreceive a response from a responsive vehicle of the plurality of vehicles accepting the request for the pick-up, wherein the response is received after the responsive vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the request for the pick-up; and \ndisplay, on the screen, the response to the operator of the stationary apparatus. \n | 2. The stationary apparatus of claim 1, wherein the indication for the request for the pick-up comprises an indication of one or more of a rider, a number of riders, a destination location, a class of service, and a time to arrive.\n | 3. 
The stationary apparatus of claim 1, wherein the instructions that cause the at least one processor to transmit the request for the pick-up to the plurality of vehicles cause the at least one processor to broadcast the request directly to the plurality of vehicles.\n | 4. The stationary apparatus of claim 3, wherein the instructions that cause the at least one processor to broadcast the request directly to the plurality of vehicles cause the processor to broadcast the request using a vehicle to infrastructure (V2I) communications protocol.\n | 5. The stationary apparatus of claim 1, wherein the stationary apparatus comprises a kiosk that resides at a fixed location.\n | 6. A method performed by a stationary apparatus at a fixed location, the method comprising: \ndisplaying, on a screen of a stationary apparatus, a user interface for hailing a vehicle; \nreceiving, from an operator of the stationary apparatus operating the user interface, an indication for a request for a pick-up by a vehicle at a location proximate to the stationary apparatus; \ntransmitting the request for the pick-up to a plurality of vehicles; \nreceiving a response from a responsive vehicle of the plurality of vehicles accepting the request for the pick-up, wherein the response is received after the responsive vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the request for the pick-up; and \ndisplaying, on the screen, the response to the operator of the stationary apparatus. \n | 7. The method of claim 6, wherein the indication for the request for the pick-up comprises an indication of one or more of a rider, a number of riders, a destination location, a class of service, and a time to arrive.\n | 8. The method of claim 6, wherein transmitting the request for the pick-up to the plurality of vehicles comprises broadcasting the request to the plurality of vehicles.\n | 9. 
The method of claim 8, comprising broadcasting the request directly to the plurality of vehicles using a vehicle to infrastructure (V2I) communications protocol.\n | 10. The method of claim 6, wherein the stationary apparatus comprises a kiosk.\n | 11. At least one non-transitory storage medium storing instructions embodied in a stationary apparatus residing at a fixed location, the instructions, when executed by at least one processor, cause the at least one processor to: \ndisplay, on a screen, a user interface for hailing a vehicle; \nreceive, from an operator of the stationary apparatus operating the user interface, an indication for a request for a pick-up by a vehicle at a location proximate to the stationary apparatus; \ntransmit the request for the pick-up to a plurality of vehicles; \nreceive a response from a responsive vehicle of the plurality of vehicles accepting the request for the pick-up, wherein the response is received after the responsive vehicle exchanges at least one message with another vehicle of the plurality of vehicles ensuring that multiple vehicles do not respond to the request for the pick-up; and \ndisplay, on the screen, the response to the operator of the stationary apparatus. \n | 12. The at least one non-transitory storage medium of claim 11, wherein the indication for the request for the pick-up comprises an indication of one or more of a rider, a number of riders, a destination location, a class of service, and a time to arrive.\n | 13. The at least one non-transitory storage medium of claim 11, wherein the instructions that cause the at least one processor to transmit the request for the pick-up to the plurality of vehicles cause the at least one processor to broadcast the request to the plurality of vehicles.\n | 14. 
The at least one non-transitory storage medium of claim 13, wherein the instructions that cause the at least one processor to broadcast the request to the plurality of vehicles cause the processor to broadcast the request directly to the plurality of vehicles using a vehicle to infrastructure (V2I) communications protocol.\n | 15. The at least one non-transitory storage medium of claim 11, wherein the stationary apparatus comprises a kiosk.", "output": "The apparatus has a processor (232,280,282) for displaying a user interface (248) for hailing a vehicle (200) on a screen. The processor receives an indication for a request for a pick-up by the vehicle at a location proximate to the stationary apparatus from an operator of the stationary apparatus operating the user interface. The request for the pick-up is transmitted to a set of vehicles. A response is received from one of the set of the vehicles accepting the request for the pick-up. The response is displayed on the screen to the operator of the stationary apparatus. The indication comprises an indication of a rider, a number of riders, a destination location, a class of service, and a time to arrive. INDEPENDENT CLAIMS are included for: (1) a method performed by a stationary apparatus at a fixed location; and (2) a non-transitory storage medium storing instructions. Stationary apparatus for hailing a vehicle, such as an autonomous vehicle. Can also be used for a taxicab and a ride-sharing vehicle. The hailing request is received directly in accordance with the signaling mode from the device in the vicinity of the potential rider, and the pickup location is provided to the processor based on the hailing confirmation, thus allowing the user to hail the autonomous vehicle in an efficient manner. The method allows the user of the mobile device to provide hailing information to the vehicle, and allows the vehicle to provide a pickup location to autonomous driving features of the vehicle in a reliable manner. 
The drawing shows a block diagram of the vehicle. 200 Vehicle; 202 Stimulus detector; 204 Video sensor; 232, 280, 282 Processors; 248 User interface", "instruction": "Please summarize the input" }, { "input": "Permission authentication and public key-based integrity self-driving vehicle IoT firmware update generation device and method through hyperledger fabric block chain moduleIn the present invention, if, while the firmware of a conventional self-driving car is updated through the vehicle's external communication network, hacking such as a DDoS (Distributed Denial of Service) attack, vehicle ID (identification) falsification, GPS manipulation, or information collection is performed on the self-driving car's firmware, the entire self-driving system could be damaged by the vehicle being made to move to a location other than the desired destination or by the spread of malicious code; in addition, when the IoT-type firmware module of the self-driving car is updated, excessive traffic on the vehicle's external communication network can cause network bottlenecks that prevent responding to firmware update requests provided by firmware suppliers. In order to improve these problems, the apparatus consists of an MKG-type IoT firmware brokerage control module (100), a firmware brokerage control module (200) for firmware suppliers, and a firmware-type Hyperledger Fabric blockchain module (300). The MKG-type IoT firmware brokerage control module is connected to a 5G Wifi communication network; through the firmware brokerage control module for firmware suppliers and the firmware-type Hyperledger Fabric blockchain module, it is possible to create unprecedented permission authentication and public key-based integrity autonomous vehicle IoT firmware updates and to determine the cause of accidents in self-driving cars. 
It can be provided as forensic data for proof. The apparatus is connected to the IoT-type firmware module of the self-driving car, forming an MKG-type IoT firmware brokerage control module that replaces the IoT-type firmware module of the self-driving car, thereby solving the transaction overload problem and facilitating the management of IoT-type firmware modules. By using bridge peer nodes, it can guarantee 80% higher availability than the existing system, without a single point of failure, on the public network, the private network, and the Hyperledger Fabric network. The program is designed to receive and process the token-type objects and firmware update request signals of the MKG-type IoT firmware brokerage control module in the order of the permission and authentication applications in the Hyperledger Fabric blockchain, so that bottlenecks caused by excessive traffic in the vehicle's external communication network and occurring in the public network, the private network, and the Hyperledger Fabric network can be reduced to less than 70% of the existing level. Only authorized nodes can participate in the public network, the private network, and the Hyperledger Fabric network, and since only some nodes execute the chain code of the smart contract function, multiple transactions can be processed quickly in parallel; the ledger is disclosed only to authorized nodes using the channel, and the identity of network-participating nodes can be confirmed. The purpose is to provide a permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device and method through a firmware-type Hyperledger Fabric blockchain module that can clearly determine who is responsible when a problem occurs.|1. 
It is connected to the IoT firmware module that plays the role of cognitive control, learning judgment control, and autonomous driving control of the autonomous vehicle; receives firmware data on the presence or absence of a firmware update and the version, and image data from before and after an accident of the autonomous vehicle; and is formed to download firmware update files directly from a firmware supplier that supplies firmware update files, based on additional permission certification and a public key, after receiving permission certification from the Hyperledger Fabric blockchain: a permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device through a firmware-type Hyperledger Fabric blockchain module.\n | 2. According to claim 1, the permission authentication/public key-based integrity autonomous vehicle IoT firmware update generation device comprises an MKG-type IoT firmware brokerage control module that is connected to the IoT-type firmware module of the autonomous vehicle, replacing the IoT-type firmware module of the autonomous vehicle; creates a token-type object by consolidating the data of the car's IoT-type firmware modules into one; after applying for permission authentication to the firmware-type Hyperledger Fabric blockchain module, transmits the created token-type object and firmware update request signal to the firmware-type Hyperledger Fabric blockchain module; receives the public key for downloading the firmware update file from the firmware-type Hyperledger Fabric blockchain module; and mediates and controls the firmware update file to be downloaded from the firmware brokerage control module for firmware suppliers. 
The module 100 is connected to the firmware supplier's smart device and, on behalf of the firmware supplier, applies for permission authentication to the firmware-type Hyperledger Fabric blockchain module, Checks whether the firmware supplier is registered in the Hyperledger Fabric blockchain, registers the firmware update file provided by the firmware supplier, and sends the firmware update file to the authorized MKG-type IoT firmware through the firmware-type Hyperledger Fabric blockchain module. It is located between the firmware intermediary control module 200 for firmware suppliers, which mediates and controls distribution to the intermediary control module, the firmware intermediary control module for firmware suppliers, and the MKG-type IoT firmware intermediary control module, and creates a chain code according to automatic agreement. After forming the (Chain Code), register and distribute the firmware update file sent to the firmware brokerage control module for the firmware supplier, and block the token-type object and firmware update request sent from the MKG-type IoT firmware brokerage control module to the node. Create an ordering service node with a structure, Permission through a firmware-type Hyperledger Fabric blockchain module, which consists of a firmware-type Hyperledger Fabric blockchain module 300 that creates a permission-type blockchain that allows only authorized nodes to participate in the Hyperledger Fabric network. Authentication/public key-based integrity autonomous vehicle IoT firmware update generation device.\n | 3. According to claim 2, the MKG-type IoT firmware mediation control module 100 is connected to the IoT-type firmware module of the self-driving car, and provides firmware data regarding the presence or absence of firmware update and version sensed by the IoT-type firmware module of the self-driving car. 
Forms an interface that receives video data before and after an accident of the autonomous vehicle, video and audio data inside and outside the vehicle, and traffic data from internal and external network communication through V2X communication or IVN (In-Vehicle Network). The module comprises: an RS-232 interface unit 110 forming this interface; a 5G Wifi communication forming unit 120 for IoT firmware that converts the UART signal and SPI signal to the IEEE 802.11b/g/n wireless LAN protocol and forms a 5G Wifi communication network with the firmware brokerage control module for firmware suppliers and the firmware-type Hyperledger Fabric blockchain module; a firmware data token-type object creation control unit 130 that receives, from the RS-232 interface unit, the presence or absence of a firmware update (the sensing data of the IoT-type firmware modules of the self-driving car), firmware data regarding the version, video data before and after an accident of the self-driving car, video and audio data inside and outside the vehicle, and internal and external network communication traffic data through V2X communication or IVN (In-Vehicle Network), and controls them to generate a token-type object by consolidating them into one; a firmware update request control unit 140 that receives firmware version data from the IoT-type firmware module of the autonomous vehicle, compares and analyzes it with the latest version value and, if it is a lower version, receives permission from the certificate-based Hyperledger Fabric blockchain through a CA (certification authority) and controls a firmware update request; a permission authentication application control unit 150 for IoT firmware that controls whether the token-type object created by the firmware data token-type object creation control unit and the firmware update request signal from the firmware update request control unit can be transmitted to the firmware-type Hyperledger Fabric blockchain module to request permission certification; and a Check Firmware Download Algorithm engine unit 160 that receives the public key and metadata for downloading the firmware update file from the firmware-type Hyperledger Fabric blockchain module and forms the firmware update file to be downloaded from the firmware brokerage control module for the firmware supplier. Permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device through a firmware-type Hyperledger Fabric blockchain module.\n | 4. The device of claim 2, wherein the firmware intermediary control module 200 for the firmware supplier comprises: a 5G Wifi communication formation unit 210 for firmware suppliers that converts the UART signal and the SPI signal to the IEEE 802.11b/g/n wireless LAN protocol to form a 5G Wifi communication network with the firmware-type Hyperledger Fabric blockchain module and the MKG-type IoT firmware intermediary control module; a permission authentication application control unit 220 for the firmware supplier that controls whether the supplier is registered in the Hyperledger Fabric blockchain and whether the firmware update file provided by the firmware supplier can be registered in the Hyperledger Fabric blockchain, by sending it to the Hyperledger Fabric blockchain to request permission authentication; a SetVendor Algorithm engine unit 230 that checks whether the firmware supplier is registered in the Hyperledger Fabric blockchain; a firmware update algorithm engine unit 240 that updates metadata of the firmware update file in the Hyperledger Fabric blockchain and creates a transaction; and a firmware update file distribution control unit 250 that controls the distribution of the firmware update file directly to the authorized MKG-type IoT firmware mediation control module through the firmware-type Hyperledger Fabric blockchain module. 
Permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device through a firmware-type Hyperledger Fabric blockchain module.\n | 5. According to claim 2, the firmware-type Hyperledger Fabric blockchain module 300 consists of: a Hyperledger Fabric-type distributed ledger unit 310, which registers and distributes the firmware update files of the firmware mediation control module for firmware suppliers to be shared, and records all changes in the firmware data of the IoT-type firmware modules of autonomous vehicles regarding update status and version, video data before and after an accident of an autonomous vehicle, video and audio data inside and outside the vehicle, and internal and external network communication traffic data through V2X communication or IVN (In-Vehicle Network); an ordering service node 320 that creates an ordering block with a node structure, determines the order of transactions in the ordering block, and hosts an ordering service that is delivered to connected nodes; a chain code unit 330, which forms a chain code based on automatic agreement between the firmware brokerage control module for firmware suppliers and the MKG-type IoT firmware mediation control module, processes business logic agreed upon by nodes participating in the Hyperledger Fabric network, and is responsible for updating the Hyperledger Fabric-type distributed ledger unit with new content or reading existing content; and a peer node 340 that manages the Hyperledger Fabric-type distributed ledger and chain code on the Hyperledger Fabric network, verifies the block created at the ordering service node, and stores a Hyperledger Fabric-type distributed ledger based on the block. Permission authentication through a firmware-type Hyperledger Fabric blockchain module 
·Public key-based integrity autonomous vehicle IoT firmware update generation device.\n | 6. According to claim 5, the peer node 340 comprises: an endorsing peer node 341 that determines whether a transaction is appropriate through chain code simulation and performs verification of the latest block; a committing peer node 342, which plays the role of communicating with a Hyperledger Fabric network located in another organization; an anchor peer node 343, which plays the role of communicating with a peer node located in another organization; a leader peer node 344 that is connected to the ordering service node, receives the latest block, and transmits it to other peer nodes in the organization; and a bridge peer node 345, consisting of devices that connect public blockchains and private blockchains, which forms one or two of the public blockchain and the private blockchain to participate in the selected blockchain. Permission authentication and public key-based integrity autonomous vehicle IoT firmware update generation device through a firmware-type Hyperledger Fabric blockchain module.\n | 7. The MKG-type IoT firmware brokerage control module is connected to the IoT-type firmware module of the self-driving car and, instead of the IoT-type firmware module of the self-driving car, integrates the data of the IoT-type firmware modules of the self-driving car into one to create a token-type object. A step of creating the token-type object (S10), a step of applying for permission authentication to the firmware-type Hyperledger Fabric blockchain module through the MKG-type IoT firmware brokerage control module (S20), and permission authentication of the firmware-type Hyperledger Fabric blockchain module. 
Then, a step (S30) of transmitting the token-type object and firmware update request signal generated by the MKG-type IoT firmware brokerage control module to the firmware-type Hyperledger Fabric blockchain module, and the firmware brokerage control module for the firmware supplier is sent to the firmware supplier. A step of connecting to a smart device and applying for permission authentication to the firmware-type Hyperledger Fabric blockchain module on behalf of the firmware supplier (S40), and when permission certification of the firmware-type Hyperledger Fabric blockchain module is achieved, A step (S50) of checking whether the firmware supplier is registered in the Hyperledger Fabric blockchain in the firmware-type Hyperledger Fabric blockchain module and registering the firmware update file provided by the firmware supplier (S50), and the firmware-type Hyperledger Fabric block A step (S60) of sending the public key and metadata for downloading the firmware update file to the MKG-type IoT firmware mediation control module that sent the firmware update request signal from the chain module, and sending the firmware update file from the firmware supplier firmware mediation control module to the firmware-type When distributed to the authorized MKG-type IoT firmware brokerage control module through the Hyperledger Fabric blockchain module, the firmware update file is downloaded based on the public key and metadata for downloading the firmware update file from the MKG-type IoT firmware brokerage control module., Creation of permission authentication/public key-based integrity self-driving car IoT firmware update through a firmware-type Hyperledger Fabric blockchain module, which consists of installing the firmware update file of the IoT-type firmware module of the self-driving car (S70). 
method.", "output": "The device has a firmware-type Hyperledger (RTM: blockchain framework) Fabric blockchain module that is connected to an Internet of things (IoT) firmware module to download firmware update files directly from a firmware supplier. The Fabric blockchain module is formed to download firmware update files directly from a firmware supplier that supplies them based on additional permission certification and a public key, after receiving permission certification from the Hyperledger Fabric blockchain. A token-type object is created by consolidating the data of the IoT-type firmware modules. The brokerage control module transmits the data to the chain module and receives the public key for downloading the firmware update file from the firmware-type Hyperledger Fabric blockchain module. An INDEPENDENT CLAIM is included for a method for generating integrity of autonomous vehicle Internet of Things firmware update based on permission authentication and public key using a Hyperledger Fabric blockchain module. Device for generating integrity of autonomous vehicle Internet of Things (IoT) firmware update based on permission authentication and public key using a Hyperledger Fabric blockchain module. The device generates firmware updates and provides forensic data for proving the cause of accidents in self-driving cars. The ledger is disclosed only to authorized nodes, and the identity of nodes participating in the network can be verified, so that responsibility is clearly identified in the event of a problem. Bottlenecks occurring in public networks, private networks and Hyperledger Fabric networks are reduced. The drawing shows a block diagram of a device generating integrity of autonomous vehicle Internet of Things firmware update. 
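The integrity mechanism summarized here, downloading a firmware update file and checking it against ledger-registered metadata before installation, can be illustrated with a minimal, stdlib-only sketch. The `make_metadata`/`verify_firmware` helpers and the digest-only check are illustrative assumptions; the patent's scheme additionally involves permission authentication and a supplier public key, which a real implementation would use to verify a signature over this digest rather than a bare hash.

```python
import hashlib

def make_metadata(firmware: bytes) -> dict:
    # Hypothetical metadata record a supplier might register on the ledger.
    return {"version": "1.2.0", "sha256": hashlib.sha256(firmware).hexdigest()}

def verify_firmware(firmware: bytes, metadata: dict) -> bool:
    # Recompute the digest of the downloaded file and compare it with the
    # digest registered in the ledger metadata; install only on a match.
    return hashlib.sha256(firmware).hexdigest() == metadata["sha256"]

image = b"firmware-image-v1.2.0"
meta = make_metadata(image)
print(verify_firmware(image, meta))            # True: intact download
print(verify_firmware(image + b"\x00", meta))  # False: tampered file
```

Here the ledger's role is reduced to a trusted source for `meta`; in the claimed design that trust comes from the permissioned Hyperledger Fabric network restricting who may register such metadata.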
(Drawing includes non-English language text) 1Device for generating integrity of autonomous vehicle100Brokerage control module200Control module300Chain Module", "instruction": "Please summarize the input" }, { "input": "The V2 X (V X) communication system using the OTP (one time password).The present invention relates to a V2X communication system using an OTP, and more particularly, to a V2X communication system using an OTP, capable of not only preventing the malfunction of an electronic component mounted on a vehicle due to hacking but also further improving driving safety in an autonomous driving operation, by encrypting transmission data using the OTP which the vehicle uses to be transmitted and decrypting reception data using a counterpart OTP to be decoded, after receiving a secret key for the generation of a counterpart unique OTP through a security relay center when the vehicle communicates with an external device. \n\nCOPYRIGHT KIPO 2018 \n\nREPRESENTATIVE DRAWING - Reference numerals: (10) Vehicle; (20) External device; (30) Security relay center; (AA) Request a secret key 2 used by a communication target external device; (BB) Request a secret key 1 used by a communication target vehicle; (CC) Transmit the secret key 2; (DD) Transmit the secret key 1; (EE) Encrypt the transmission data 1 using the OTP 1; (FF) Encrypt the transmission data 2 using the OTP 2; (GG) Generate transmission data 1 and generate an OTP 1; (HH) Receive the transmission data 2 and generate the OTP 2; (II) Decrypt the transmission data 2 using the OTP 2; (JJ) Receive the transmission data 1 and generate the OTP 1; (KK) Decrypt the transmission data 1 using the OTP 1; (LL) Generate transmission data 2 and generate an OTP 2|1. 
A V2X communication system using OTP, wherein: the vehicle (10), which includes the ECU (11) and a wireless communication module (12), encrypts data transmitted to the external device (20) with its unique OTP1 and decrypts data received from the external device (20) with the unique OTP2 of the external device (20), the ECU (11) controlling the communication behavior with the external device (20); and the external device (20), which includes a control unit (21) controlling the communication behavior with the vehicle (10) and a wireless communication module (22), encrypts data transmitted to the vehicle (10) with its unique OTP2 and decrypts data received from the vehicle (10) with the unique OTP1 of the vehicle (10).\n\n\n | 2. The V2X communication system using OTP of claim 1, wherein: the communication system further includes a security relay center (30) which stores the secret keys for generating the OTP1 of the vehicle (10) and the OTP2 of the external device (20), and transmits the corresponding secret key when it receives a request from the vehicle (10) or the external device (20) for the transmission of the counterpart's secret key; and the security relay center (30) comprises a secret key store (32), storing all secret keys used by the vehicle (10) and the external device (20), and a communication unit (31) which transmits the corresponding secret key stored in the secret key store (32) to the vehicle (10) or the external device (20) when it receives a request for the transmission of a secret key. \n\n\n | 3. 
The V2X communication system using OTP of claim 1, wherein: the vehicle (10) comprises an OTP module (13), an encryption unit (14), and a decoder (15); the OTP module (13), having received the secret key 2 of the external device (20) from the security relay center (30) for V2X communication with the external device (20), produces the OTP1 at data transmission time by using the vehicle's own secret key 1 stored in the ECU (11), and produces the OTP2 at data reception time by using the secret key 2; the encryption unit (14) encrypts data transmitted to the external device (20) with the OTP1 generated in the OTP module (13); and the decoder (15) decrypts data received from the external device (20) with the OTP2 generated in the OTP module (13). \n\n\n | 4. The V2X communication system using OTP of claim 1, wherein: the external device (20) comprises an OTP module (23), an encryption unit (24), and a decoder (25); the OTP module (23), having received the secret key 1 of the vehicle (10) from the security relay center (30) for V2X communication with the vehicle (10), produces the OTP2 at data transmission time by using the external device's own secret key 2 stored in the control unit (21), and produces the OTP1 at data reception time by using the secret key 1; the encryption unit (24) encrypts data transmitted to the vehicle (10) with the OTP2 generated in the OTP module (23); and the decoder (25) decrypts data received from the vehicle (10) with the OTP1 generated in the OTP module (23). \n\n\n", "output": "The system has an electronic control unit (ECU) for controlling communication behavior with an external device (20). A one time password (OTP) module produces a first OTP by using a secret key of the vehicle in a data transmission part and produces a second OTP by using another secret key in a data receiver part. 
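The encrypt/decrypt exchange of claims 3 and 4 (derive an OTP from a shared secret key, encrypt outgoing data with it, and let the counterpart derive the same OTP to decrypt) can be sketched as follows. The patent does not specify how the OTP is derived from the secret key, so the HMAC-SHA256 counter construction and the XOR keystream below are assumptions standing in for the "encode with OTP" step.

```python
import hashlib
import hmac

def otp(secret: bytes, counter: int, length: int) -> bytes:
    # Derive a one-time pad of the requested length from a shared secret
    # and a per-message counter (hypothetical derivation scheme).
    stream = b""
    block = 0
    while len(stream) < length:
        stream += hmac.new(secret,
                           counter.to_bytes(8, "big") + block.to_bytes(4, "big"),
                           hashlib.sha256).digest()
        block += 1
    return stream[:length]

def xor(data: bytes, pad: bytes) -> bytes:
    # One-time-pad style combine: the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, pad))

secret_1 = b"vehicle-secret-key-1"   # vehicle's secret key 1 (illustrative)
msg = b"brake warning: vehicle ahead"

# Vehicle encrypts with OTP1; the external device, having obtained secret
# key 1 from the relay center, derives the same OTP1 and decrypts.
ciphertext = xor(msg, otp(secret_1, 7, len(msg)))
plaintext = xor(ciphertext, otp(secret_1, 7, len(msg)))
print(plaintext == msg)  # True
```

The security relay center's job in the claims is exactly the distribution of `secret_1` (and the counterpart's key 2) to the authorized peer, so that both ends can derive matching pads without ever transmitting the pad itself.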
A wireless communication module transmits data to the external device, where the data is encoded by using the first OTP, and the data received from the external device is decoded by using the second OTP. A control unit controls communication behavior with a vehicle (10). Vehicle-to-everything (V2X) communication system. The system is provided with an automotive electric part that is mounted in the vehicle so as to prevent malfunction in the system, improve running stability during autonomous driving operation and provide security. The drawing shows a flow diagram illustrating operation of a V2X communication system. '(Drawing includes non-English language text)' AAStep for receiving transmission dataBBStep for decoding transmission data10Vehicle20External device30Security transit center", "instruction": "Please summarize the input" }, { "input": "SYSTEM AND METHODS TO APPLY ROBUST PREDICTIVE TRAFFIC LOAD BALANCING CONTROL AND ROBUST COOPERATIVE SAFE DRIVING FOR SMART CITIESApparatuses, systems and methods applying a non-discriminating anonymous car navigation driven traffic model predictive control, providing predictive load-balancing on road networks, which dynamically assigns efficient sets of routes to car related navigation aids, where such navigation aids may refer to in-dash navigation or to smart phone navigation applications. The system and methods may enable, for example, to improve or to substitute commercial navigation service solutions, applying under such upgrade or substitution a new highly efficient predictive traffic control for city size (and larger) traffic.|1. A method enabling, according to a predetermined procedure, to perform by in-vehicle apparatus a privileged tolling transaction with a toll charging center, while not exposing trip details, and transmitting position related data to a path control system, the method comprising: a. 
Receiving by an in-vehicle apparatus data associated with time related varying positions of a path which should be developed according to dynamic updates to an in-vehicle driving navigation aid, b. Tracking and storing positions along a trip by said in-vehicle apparatus, c. Comparing by said in-vehicle apparatus said tracked time related positions with time related positions associated with said path that should be developed according to updates to the driving navigation aid, d. Determining by said in-vehicle apparatus, according to a level of a match, privilege related toll charging data, e. Transmitting by said in-vehicle apparatus, using an IP address associated with the in-vehicle apparatus, a message which is characterized by being a vehicle identifying and not trip identifying toll charging related data message, wherein the IP address differs from an IP address that is associated with the in-vehicle apparatus while in-vehicle positioning related data is transmitted anonymously.\n | 2. An in-vehicle apparatus which, according to claim 1, comprises: a. Mobile internet transceiver, b. GNSS positioning receiver, c. Processor and memory, d. Communication apparatus to communicate with an in-vehicle driving navigation aid.\n | 3. 
A method and a system according to which conditions to improve traffic flow on a road network are encouraged by encouraging directly or indirectly usage of vehicles having in-vehicle driving navigation aids which interact with drivers, or with driving control means of autonomous vehicles, to guide trips of vehicles according to path controlled trips, the method comprises: a) receiving by an in-vehicle driving navigation aid data for dynamic path assignments, b) tracking by in-vehicle apparatus the actual path of the trip, c) comparing by in-vehicle apparatus the tracked path with the path complying with the dynamic path assignments along a trip, d) determining by in-vehicle apparatus the privilege entitling usage of the assigned path according to predetermined criteria for the level of the match determined by the comparison, e) transmitting by in-vehicle apparatus privilege related transaction data which expose no trip details.\n | 4. A method and a system according to which traffic flow improvement conditions on a road network are encouraged by encouraging directly or indirectly usage of vehicles having in-vehicle driving navigation aids which interact with drivers, or with driving control means of autonomous vehicles, to guide trips of vehicles according to path controlled trips, the method comprises: a) receiving by an in-vehicle driving navigation aid data for dynamic path assignments, b) tracking by in-vehicle apparatus the actual path of the trip, c) comparing by in-vehicle apparatus the tracked path with the assigned path complying with the dynamic path assignments along a trip, d) determining by in-vehicle apparatus the privilege entitling usage of the assigned path according to predetermined criteria for the level of the match determined by the comparison.\n | 5. A method according to 3 or 4, wherein said privilege is free of charge road toll.\n | 6. A method according to 3 or 4, wherein said privilege is a discount in charged road toll.\n | 7. 
A method according to 5 or 6, wherein an entitlement for privilege includes a criterion according to which travel on certain predetermined links requires that a trip will be stopped for a minimum predetermined time.\n | 8. A method according to 7, wherein said predetermined links are links on which traffic is diluted.\n | 9. A method according to 3 or 4, wherein a said vehicle is an autonomous vehicle classified as level 4 according to the Society of Automotive Engineers.\n | 10. A method according to 3 or 4, wherein a said vehicle is an autonomous vehicle classified as level 5 according to the Society of Automotive Engineers.\n | 11. A method according to 3 or 4, wherein path controlled trips tend to be coordinated by dynamic assignment of paths performed by coordinating path control.\n | 12. A method according to 11, wherein traffic on the network tends to converge to traffic load balance.\n | 13. A method according to 11, wherein a DTA simulator is used with traffic predictions of coordinating path control.\n | 14. A method according to 13, wherein the DTA simulator includes models of motion of autonomous vehicles on roads and interactions of autonomous vehicles with other vehicles on roads.\n | 15. A method according to 11, wherein gradual coordination is applied by determining current highest priority links which negatively contribute to traffic load balance, subject to a given computation power.\n | 16. A method according to 11, wherein dynamic assignments of paths use processes of coordination control iterations.\n | 17. A method according to 16, wherein coordination control iterations apply fairness related processes.\n | 18. A method according to 17, wherein processes of iterations of coordination control are also used.\n | 19. A method according to 12, 13, 14, 16, 17 and 18, wherein paths are assigned to fictitious destinations on a fictitious road map which expands a real part of a road map with evacuation of traffic from a part of a network.\n | 20. 
A method according to 12, 13, 14, 16, 17 and 18, wherein paths are assigned to fictitious destinations on a fictitious road map which expands a real part of a road map with traffic dilution of a part of a network.\n | 21. A method and system according to which improved safe driving on a road network is encouraged by encouraging usage of safety aids, the method comprises: a) tracking by in-vehicle apparatus the actual use of said safety aid along the trip, b) determining by in-vehicle apparatus privilege related data for usage of said safety aid according to predetermined criteria entitling privilege for the level of usage, c) transmitting by in-vehicle apparatus privilege related transaction data which expose no trip details.\n | 22. A method according to 21, wherein said privilege is free of charge road toll.\n | 23. A method according to 21, wherein said privilege is a discount in charged road toll.\n | 24. A method according to 21, wherein safety aids are cooperative safety driving aids enabling improvement of a single in-vehicle measurement of a safety driving aid by in-vehicle fusion of the measurement with one or more respective measurements performed by one or more other vehicles and received by a fusion apparatus through vehicle to vehicle communication.\n | 25. A method according to 3, 4 and 21, wherein privilege for usage refers to usage of both safety driving aids and path controlled trips.\n | 26. 
A method according to which a path control system assigns paths to path controlled trips according to coordination control processes, wherein coordination control processes comprise iterative mitigation processes, and wherein an iteration of mitigation processes comprises determination of relatively loaded links, and wherein determination of relatively loaded links is associated with processes to determine time dependent traffic volume to capacity ratio related data on network links along a predicted time horizon, which include feeding a calibrated Dynamic Traffic Assignment (DTA) simulation by: a. non-mitigated pending paths, b. current and predicted assigned paths associated with path controlled trips which are not associated with non-mitigated pending paths.\n | 27. A method according to 26, wherein paths fed to the DTA simulation include current and predicted non path controlled trips.\n | 28. A method according to 26, wherein paths fed to the DTA simulation include current and predicted non coordinating path controlled trips.\n | 29. A method according to 26, wherein determination of relatively loaded links further includes determination of reference time dependent traffic volume to capacity ratio related data, which includes feeding a calibrated Dynamic Traffic Assignment (DTA) simulation by: a. current and predicted assigned paths associated with path controlled trips, b. current and predicted non path controlled trips.\n | 30. A method according to 29, wherein paths fed to the DTA simulation include current and predicted non coordinating path controlled trips.\n | 31. 
A method according to 26, wherein an iteration of mitigation processes includes searching for new alternative paths to yet non-mitigated pending alternative paths, preferably by substantially simultaneous search processes, wherein time dependent travel times that are associated with a search are determined by synthesis of DTA based traffic prediction fed by paths which include current and predicted assigned paths associated with path controlled trips, including paths that are associated with mitigated paths up to the current iteration in the current cycle.\n | 32. A method according to 31, wherein paths fed to the DTA simulation include current and predicted non path controlled trips.\n | 33. A method according to 31, wherein paths fed to the DTA simulation include current and predicted non coordinating path controlled trips.\n | 34. A method according to 31, wherein an iteration of mitigation processes includes: a. Determining a threshold related acceptance criterion to accept new alternative paths as a substitution for assigned path controlled trips, wherein the threshold is adaptively determined in order to enable controllable mitigation by the current iteration in perspective of one or more prior iterations, b. Accepting new alternative paths or pending alternative paths according to a predetermined acceptance procedure, which may include but is not limited to a threshold which enables putting a limit on acceptance of said new alternative paths, according to results from said search.\n | 35. A method according to 34, wherein a threshold puts a limit on the maximum accepted reduction in potential travel time improvement in comparison to the potential travel time improvement that was assumed to be gained by said search for a path which became a non-mitigated pending path.\n | 36. 
An apparatus comprising means for performing the method of any one of claims 1 and 3-35.", "output": "The method involves receiving data associated with time related varying positions of a path which should be developed according to dynamic updates to an in-vehicle driving navigation aid, and tracking and storing positions along a trip. The tracked time related positions are compared with time related positions associated with the path that should be developed according to updates to the driving navigation aid. According to the comparison result, privilege related toll charging data are determined. By using an Internet Protocol (IP) address associated with the in-vehicle apparatus, a message which is characterized as a vehicle-identifying, not trip-identifying, toll charging related data message is transmitted. The IP address differs from an IP address that is associated with the in-vehicle apparatus while in-vehicle positioning related data is transmitted anonymously. INDEPENDENT CLAIMS are also included for the following: an in-vehicle apparatus; and a method of improving traffic flow on a road network. Method, by in-vehicle apparatus (claimed), for performing privileged tolling transaction with toll charging center. Robust privacy preservation eliminates, or at least minimizes, possible resistance to conditional tolling, since with robust privacy preservation the non-exposure of trip details can be guaranteed, or at least any exposure can be under the control of the owner of the vehicle. Privacy preserving path control, supported by privacy preserving free of charge toll or toll discount, may reduce reluctance to apply and use path controlled trips and may thus generate high usage of path controlled trips, which with improved traffic mapping and traffic prediction provides good conditions for high performance traffic load balancing. 
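The match-then-privilege logic of claims 3 to 6 (compare the tracked path with the assigned path, then grant a free or discounted toll according to predetermined criteria for the level of the match) can be sketched as below. The waypoint-overlap metric and the 0.9/0.7 thresholds are illustrative assumptions; the claims leave both the match criterion and the entitlement thresholds to a predetermined procedure.

```python
def match_level(tracked, assigned):
    # Fraction of assigned waypoints that the tracked trip actually visited
    # (illustrative metric; the claims do not fix the comparison method).
    visited = set(tracked)
    return sum(1 for p in assigned if p in visited) / len(assigned)

def toll_privilege(tracked, assigned, free_at=0.9, discount_at=0.7):
    # Hypothetical thresholds mapping match level to the claimed privileges.
    level = match_level(tracked, assigned)
    if level >= free_at:
        return "free"
    if level >= discount_at:
        return "discount"
    return "full toll"

assigned = ["A", "B", "C", "D", "E"]
print(toll_privilege(["A", "B", "C", "D", "E"], assigned))  # free
print(toll_privilege(["A", "B", "X", "D", "E"], assigned))  # discount (4/5 match)
print(toll_privilege(["A", "X", "Y", "Z", "W"], assigned))  # full toll
```

In the claimed design this computation runs inside the in-vehicle apparatus, so only the resulting privilege (not the tracked positions) needs to leave the vehicle, which is what preserves trip privacy.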
The drawing is a schematic figure which illustrates the coordination of path controlled trips preferably applied with a basic paths planning layer. 210bCoordinated paths transmission211Basic paths planning layer212Requested paths input213Traffic prediction travel time costs", "instruction": "Please summarize the input" }, { "input": "MAP UPDATE USING IMAGESMethods and apparatuses associated with updating a map using images are described. An apparatus can include a processing resource and a memory resource having instructions executable to a processing resource to monitor a map including a plurality of locations, receive, at the processing resource, the memory resource, or both, and from a first source, image data associated with a first location, identify the image data as being associated with a missing portion, an outdated portion, or both, of the map, and update the missing portion, the outdated portion, or both, of the map with the image data.What is claimed is: \n | 1. An apparatus, comprising: \na processing resource; and \na memory resource in communication with the processing resource having instructions executable to: \nmonitor a map including a plurality of locations; \nreceive, at the processing resource, the memory resource, or both, and from a first source, image data associated with a first location; \nidentify the image data as being associated with a missing portion, an outdated portion, or both, of the map; and \nupdate the missing portion, the outdated portion, or both, of the map with the image data. \n | 2. The apparatus of claim 1, wherein the image data associated with the first location includes metadata identifying a physical location and viewing direction of the image data.\n | 3. 
The apparatus of claim 2, wherein the instructions are executable to identify the image data as being associated with the missing portion, the outdated portion, or both, by matching the metadata associated with the image data with location and viewing direction data associated with the missing portion, the outdated portion, or both.\n | 4. The apparatus of claim 1, further comprising the instructions executable to update a machine learning model associated with the map in response to receipt of the image data associated with the first location.\n | 5. The apparatus of claim 1, wherein the first source is a sensor in communication with an autonomous vehicle.\n | 6. The apparatus of claim 1, further comprising instructions executable to update the map as new image data associated with the first location is received.\n | 7. The apparatus of claim 6, further comprising instructions executable to detect changes to the first location based on the map update and the new image data received.\n | 8. The apparatus of claim 1, further comprising instructions executable to update the map and create a time-lapse version of the map as new image data associated with the particular location and the plurality of locations is received.\n | 9. 
A non-transitory machine-readable medium comprising a processing resource in communication with a memory resource having instructions executable to: \nreceive, at the processing resource, the memory resource, or both, a first plurality of images from a plurality of sources; \ndetermine, at the processing resource, the memory resource, or both, a second plurality of images of a map monitored by the processing resource, the memory resource, or both, that have not been updated within a particular time period; \nsearch the first plurality of images and a database of previously received images for the second plurality of images of the map; \nin response to finding an image matching one of the second plurality of images, insert the one of the second plurality of images into the map; and \nin response to not finding a matching image, request, from the plurality of sources, the image matching one of the second plurality of images. \n | 10. The medium of claim 9, further comprising the instructions executable to determine a change between the matching image and the one of the second plurality of images.\n | 11. The medium of claim 9, further comprising the instructions executable to alert the plurality of sources of the change.\n | 12. The medium of claim 11, further comprising the instructions executable to instruct the plurality of sources to share the change with different sources in which the plurality of sources is in communication.\n | 13. The medium of claim 10, further comprising the instructions executable to alert a party outside of the plurality of sources of the change.\n | 14. 
A method, comprising: \nreceiving, at a processing resource, a memory resource, or both, a plurality of images including location and time metadata from sensors associated with a plurality of vehicles that utilize vehicle-to-everything (V2X) communication; \ndetecting, at the processing resource, areas of a map having an outdated image, a missing image, or both; \ndetermining whether one of the plurality of received images addresses the outdated image, the missing image, or both based on the metadata; \nin response to determining the one of the plurality of received images addresses the outdated image, the missing image, or both, updating the map using the one of the plurality of images; \nin response to not finding an image to address the outdated image, the missing image, or both, requesting, from the plurality of sources, a matching one of the second plurality of images; \ndetecting and classifying, by the processing resource and based on the updated map, an issue associated with a particular location on the map; and \nsending a notification to the sensors of the plurality of vehicles and additional vehicles based on the detected and classified issue. \n | 15. The method of claim 14, further comprising receiving Decentralized Environmental Notification Message (DENM) signals from the sensors of the plurality of sources and classifying the DENM signals.\n | 16. The method of claim 15, further comprising providing DENM alerts to an Intelligent Transport System (ITS) based on the classified DENM signals.\n | 17. The method of claim 14, further comprising: \ndetermining periodic updates associated with the particular location; and \nstoring to the memory resource, a database, or both, the periodic updates. \n | 18. 
The method of claim 14, further comprising requesting, from a database of images uploaded from a plurality of autonomous vehicles, the matching one of the second plurality of images in response to not finding an image to address the outdated image, the missing image, or both.\n | 19. The method of claim 14, wherein detecting the issue comprises detecting a change in a structure associated with the particular location.\n | 20. The method of claim 14, wherein detecting the issue comprises detecting a road condition change associated with the particular location.", "output": "The apparatus (100) has a memory resource (104) in communication with a processing resource (102) having instructions executable to monitor a map containing a set of locations. An image data associated with a first location is received at the processing resource, the memory resource, or both, and from a first source. The image data is identified as being associated with a missing portion, an outdated portion, or both, of the map. The missing portion, the outdated portion, or both, of the map are updated with the image data. The image data associated with the first location contains metadata identifying a physical location and viewing direction of the image data. The instructions are executable to identify the image data as being associated with the missing portion, the outdated portion, or both, by matching the metadata associated with the image data with location and viewing direction data associated with the missing portion, the outdated portion, or both. INDEPENDENT CLAIMS are included for:1) a non-transitory machine-readable medium; and2) a method for updating a map using images. Apparatus for updating a map using images. Uses include but are not limited to a mobile phone, smartphone, tablet, phablet, computing device, implantable device, vehicle, home appliance, smart home device, monitoring device, wearable device and an intelligent shopping system. 
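The matching step described in claims 2-3 — comparing an image's location and viewing-direction metadata against map portions flagged as missing or outdated — can be sketched as follows. The types and field names are hypothetical illustrations, not the claimed data model:

```python
# Illustrative sketch (hypothetical types) of the metadata-matching update:
# an incoming image updates the first stale (missing or outdated) map
# portion whose location and viewing direction match the image metadata.

from dataclasses import dataclass

@dataclass
class ImageData:
    location: str        # e.g. a tile or cell identifier
    direction: str       # viewing direction, e.g. "N", "SE"
    pixels: bytes = b""

@dataclass
class MapPortion:
    location: str
    direction: str
    stale: bool = True   # missing or outdated
    image: bytes = b""

def update_map(portions: list[MapPortion], img: ImageData) -> bool:
    """Update the first stale portion whose metadata matches the image."""
    for p in portions:
        if p.stale and p.location == img.location and p.direction == img.direction:
            p.image, p.stale = img.pixels, False
            return True
    return False

portions = [MapPortion("tile-41", "N"), MapPortion("tile-42", "SE")]
update_map(portions, ImageData("tile-42", "SE", b"\x89PNG"))
print([p.stale for p in portions])   # [True, False]
```

When no match is found (`update_map` returns `False`), the claimed method would go on to request a matching image from the plurality of sources.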
The method enables utilizing a computing device to transmit information to users through a display to view images and/or text, speakers to emit sound, and a sensor to collect data in an efficient manner. The drawing shows a block diagram of the apparatus for updating a map using images.100Apparatus for updating a map using images 102Processing resource 104Memory resource 106Monitoring 108Receiving ", "instruction": "Please summarize the input" }, { "input": "VEHICLE-TO-EVERYTHING (V2X) COMMUNICATION BASED ON USER INPUTIn some implementations, a device associated with a vehicle may receive, based on user input to an interface of the vehicle, information indicating an incident associated with the vehicle, wherein the user input indicates at least one of: whether the incident is associated with the user of the vehicle, or whether the incident is associated with an event outside of the vehicle. The device may transmit, to a system and via a transceiver of the vehicle, a message indicating the incident associated with the vehicle. The device may receive, from the system and via the transceiver of the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates a classification of the incident. The device may cause the vehicle to perform one or more actions based on the incident associated with the vehicle and based on the classification of the incident.What is claimed is: \n | 1. 
A device associated with a vehicle, comprising: \nan interface configured to receive user input from a user of the vehicle; \na transceiver; \na memory; and \none or more processors, coupled to the memory, configured to: \nreceive, based on the user input to the interface, information indicating an incident associated with the vehicle, wherein the user input indicates at least one of: \nwhether the incident is associated with the user of the vehicle, or \nwhether the incident is associated with an event outside of the vehicle; \ntransmit, to a system and via the transceiver of the device, a message indicating the incident associated with the vehicle; \nreceive, from the system and via the transceiver of the device, an acknowledgement of the message, wherein the acknowledgement indicates a classification of the incident; and \ncause the vehicle to perform one or more actions based on the incident associated with the vehicle and based on the classification of the incident. \n | 2. The device of claim 1, wherein the interface is configured to receive the user input as a voice input.\n | 3. The device of claim 1, wherein the one or more processors, to cause the vehicle to perform the one or more actions, are configured to: \nprovide one or more instructions to autonomously drive the vehicle to a facility based on: \nthe user input indicating that the incident is associated with the user of the vehicle, \nthe classification of the incident, and \na capability of the vehicle. \n | 4. The device of claim 3, wherein the one or more processors are further configured to: \ntransmit, via the transceiver, a notification to a facility system associated with the facility, wherein the notification indicates the incident. \n | 5. 
The device of claim 1, wherein the one or more processors, to cause the vehicle to perform the one or more actions, are configured to: \nprovide one or more instructions to autonomously park the vehicle based on: \nthe user input indicating that the incident is associated with the user of the vehicle, \nthe classification of the incident, and \na capability of the vehicle. \n | 6. The device of claim 5, wherein the one or more processors are further configured to: \ndetermine a location at which the vehicle is parked or is to be parked; \ninitiate, with an emergency dispatch system, an emergency call based on the user input indicating that the incident is associated with the user of the vehicle; and \ntransmit, to the emergency dispatch system and via the transceiver, an indication of the location at which the vehicle is parked or is to be parked. \n | 7. The device of claim 1, wherein, based on the user input indicating that the incident is associated with the event outside the vehicle, the message indicating the incident associated with the vehicle includes one or more of: \nan image associated with the incident, \na video associated with the incident, or \nsensor information detected by one or more sensors associated with the vehicle. \n | 8. The device of claim 1, wherein the one or more processors, to cause the vehicle to perform the one or more actions, are configured to: \nprovide one or more instructions to autonomously drive the vehicle along a route that bypasses the incident based on the classification of the incident and a capability of the vehicle. \n | 9. The device of claim 1, wherein the message indicating the incident associated with the vehicle further indicates a location associated with the incident.\n | 10. 
The device of claim 1, wherein the classification of the incident is associated with a severity level, and wherein the severity level is based on whether the classification corresponds to an emergency classification or a non-emergency classification.\n | 11. A system, comprising: \na memory; and \none or more processors, coupled to the memory, configured to: \nreceive, from a vehicle, a message indicating an incident associated with the vehicle, wherein the message indicates: \nwhether the incident is associated with a user of the vehicle, or \nwhether the incident is associated with an event outside of the vehicle; \ngenerate a classification of the incident based on the incident associated with the vehicle, wherein the classification is an emergency classification or a non-emergency classification depending on the incident associated with the vehicle; \ntransmit, to the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates the classification of the incident; and \nperform one or more actions based on the classification of the incident. \n | 12. The system of claim 11, wherein the one or more processors, to perform the one or more actions, are configured to: \nidentify a plurality of other vehicles that are within a defined range from the vehicle or that are along a route associated with the vehicle; and \ntransmit, to the plurality of other vehicles and based on the classification of the incident as the emergency classification, one or more messages that indicate one or more vehicle actions to be performed by the plurality of other vehicles based on the incident associated with the vehicle. \n | 13. The system of claim 11, wherein the one or more processors, to perform the one or more actions, are configured to: \nrefrain from transmitting one or more messages associated with the incident to a plurality of other vehicles based on the classification being the non-emergency classification. \n | 14. 
The system of claim 11, wherein the one or more processors are configured to generate the classification based on other messages received from a plurality of other vehicles, wherein the other messages provide additional information that corroborates the message indicating the incident associated with the vehicle.\n | 15. The system of claim 11, wherein the system is a vehicle-to-everything (V2X) system that is co-located with an access point that serves a geographic location associated with the vehicle or a geographic location associated with the incident.\n | 16. The system of claim 11, wherein the system is a vehicle-to-everything (V2X) system that is configured to communicate with a plurality of access points, wherein an access point in the plurality of access points serves a geographic location associated with the vehicle or a geographic location associated with the incident.\n | 17. A method, comprising: \nreceiving, based on user input to an interface of a vehicle, information indicating an incident associated with the vehicle, wherein the user input indicates at least one of: \nwhether the incident is associated with a user of the vehicle, or \nwhether the incident is associated with an event outside of the vehicle; \ntransmitting, to a system and via a transceiver of the vehicle, a message indicating the incident associated with the vehicle; \nreceiving, from the system and via the transceiver of the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates a classification of the incident; and \ncausing the vehicle to perform one or more actions based on the incident associated with the vehicle and based on the classification of the incident. \n | 18. 
The method of claim 17, wherein causing the vehicle to perform the one or more actions comprises: \nproviding one or more instructions to autonomously drive the vehicle to a facility based on: \nthe user input indicating that the incident is associated with the user of the vehicle, \nthe classification of the incident, and \na capability of the vehicle. \n | 19. The method of claim 17, wherein causing the vehicle to perform the one or more actions comprises: \nproviding one or more instructions to autonomously park the vehicle based on: \nthe user input indicating that the incident is associated with the user of the vehicle, \nthe classification of the incident, and \na capability of the vehicle. \n | 20. The method of claim 17, wherein causing the vehicle to perform the one or more actions comprises: \nproviding one or more instructions to autonomously drive the vehicle along a route that bypasses the incident based on the classification of the incident and a capability of the vehicle. \n | 21. The method of claim 17, wherein the classification of the incident is associated with a severity level, and wherein the severity level is based on whether the classification corresponds to an emergency classification or a non-emergency classification.\n | 22. 
A method, comprising: \nreceiving, at a system from a vehicle, a message indicating an incident associated with the vehicle, wherein the message indicates: \nwhether the incident is associated with a user of the vehicle, or \nwhether the incident is associated with an event outside of the vehicle; \ngenerating a classification of the incident based on the incident associated with the vehicle, wherein the classification is an emergency classification or a non-emergency classification depending on the incident associated with the vehicle; \ntransmitting, to the vehicle, an acknowledgement of the message, wherein the acknowledgement indicates the classification of the incident; and \nperforming one or more actions based on the classification of the incident. \n | 23. The method of claim 22, wherein performing the one or more actions comprises: \nidentifying a plurality of other vehicles that are within a defined range from the vehicle or that are along a route associated with the vehicle; and \ntransmitting, to the plurality of other vehicles and based on the classification of the incident as the emergency classification, one or more messages that indicate one or more vehicle actions to be performed by the plurality of other vehicles based on the incident associated with the vehicle. \n | 24. The method of claim 22, wherein performing the one or more actions comprises: \nrefraining from transmitting one or more messages associated with the incident to a plurality of other vehicles based on the classification being the non-emergency classification. \n | 25. 
The method of claim 22, wherein generating the classification is based on other messages received from a plurality of other vehicles, wherein the other messages provide additional information that corroborates the message indicating the incident associated with the vehicle.", "output": "The device (900) has a processor (920) for receiving information indicating an incident associated with a vehicle based on user input to an interface, where the user input indicates one of whether the incident is associated with a user of the vehicle or whether the incident is associated with an event outside of the vehicle. The processor transmits a message indicating the incident associated with the vehicle to a system and through a transceiver of the device, receives an acknowledgement of the message from the system and through the transceiver of the device, where the acknowledgement indicates a classification of the incident. The processor causes the vehicle to perform actions based on the incident associated with the vehicle and based on the classification of the incident. INDEPENDENT CLAIMS are included for:(1) a system for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event based on a user input; and (2) a method for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event based on a user input. Device for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event e.g. traffic event, vehicle accident and poor road conditions such as icy roads based on a user input. Uses include but are not limited to an automobile, a motorcycle, a bus, a train, a scooter and a truck. The device facilitates vehicle-to-everything communication to improve road safety and traffic efficiency and realize energy savings. 
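The classification-and-dispatch flow of the independent system claims — classify a reported incident as emergency or non-emergency, then either notify nearby vehicles or refrain — can be sketched briefly. The incident labels and helper names are invented for illustration, not drawn from the claims:

```python
# Minimal sketch (invented labels, not the claimed system) of the flow in
# claims 11-13: the system classifies a reported incident and, only for the
# emergency classification, notifies the nearby vehicles; for non-emergency
# incidents it refrains from transmitting.

EMERGENCY_INCIDENTS = {"vehicle accident", "medical"}   # assumed mapping

def classify(incident: str) -> str:
    return "emergency" if incident in EMERGENCY_INCIDENTS else "non-emergency"

def handle_message(incident: str, nearby_vehicles: list[str]) -> list[str]:
    """Return the list of vehicles to notify, per the classification."""
    if classify(incident) == "emergency":
        return nearby_vehicles          # broadcast vehicle actions
    return []                           # non-emergency: refrain

print(handle_message("vehicle accident", ["carA", "carB"]))  # ['carA', 'carB']
print(handle_message("icy road tip", ["carA"]))              # []
```

In the claims, the classification may additionally be corroborated by messages from other vehicles (claim 14); that refinement is omitted here.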
The drawing shows a block diagram of components of a device for facilitating vehicle-to-everything communication between an entity and a vehicle for detecting an event based on a user input. 900Device for facilitating vehicle-to-everything communication between entity and vehicle for detecting event based on user input 920Processor 940Input component 950Output component 960Communication component ", "instruction": "Please summarize the input" }, { "input": "A motor vehicle, a program for a motor vehicle, a navigation apparatus for a motor vehicle, and a program for a navigation apparatus for a motor vehicle|1. An automobile comprising: a radio communication part having a transmission/reception function for performing radio communication between vehicles; a car navigation function part for generating route guidance data that includes, for each point on the guidance route which requires execution of an other vehicle influence behavior affecting the behavior of other vehicles in the periphery, trigger information for inter-vehicle communication, and for executing route guidance using the route guidance data; an inter-vehicle communication request means for requesting, through the radio communication part, communication with the surrounding vehicles at each point on the route where the trigger information is detected; a communication channel generation means for generating a communication path between the own vehicle and the other vehicle responding to the inter-vehicle communication request; and a behavior schedule information transmitting means for transmitting to the other vehicle, over the communication path generated by the communication channel generation means, behavior schedule information informing the other vehicle of the scheduled execution of the other vehicle influence behavior.\n | 2. 
The automobile according to claim 1, wherein the trigger information is included on the route guidance route at the same point as the point where the other vehicle influence behavior is to be performed, or at a predetermined point before that point.\n | 3. The automobile according to claim 1 or claim 2, further comprising an automatic operation mode for autonomously performing the behavior of the own vehicle and, for the automatic operation mode, a means for blinking or lighting-controlling indicators including a turn indicator, a hazard lamp, a back light, and a brake light, wherein the route guidance route of the route guidance data includes information on blinking indication points and lighting indication points for blinking or lighting control of the indicators, and each point on the route that includes the trigger information is the same point as a blinking indication point or lighting indication point, or a point before it.\n | 4. The automobile according to any one of claims 1 to 3, wherein the trigger information consists of other vehicle influence behavior generation information including information on the other vehicle influence behavior to be generated.\n | 5. The automobile according to claim 4, wherein behaviors including at least one of turning right or left, changing lanes, merging onto a general road or a highway, entering a line of vehicles in traffic, entering an intersection, and entering a station rotary are stored, and the information on the other vehicle influence behavior included in the other vehicle influence behavior generation information is information on a behavior selected from the stored behaviors.\n | 6. 
The other vehicle influence behavior generation information includes information on a point where the other vehicle influence behavior is performed, and is described in a claim 4 or a claim 5.\n | 7. The behavior schedule information includes the information of the other vehicle influence behavior scheduled to be executed, and the information of the other vehicle influence behavior to be executed is acquired from the other vehicle influence behavior generation information, and is described in either of the claim 4 to the claim 6.\n | 8. The behavior schedule information includes information about a traveling route related to the present position information of the own vehicle and the other vehicle influence behavior, and is described in one of the claim 1 to claim 7.\n | 9. A means for receiving reply information sent from the other vehicle through the radio communication part, and a reply information analysis means for analyzing the received reply information and analyzing the behavior of the other vehicle to the other vehicle influence behavior are provided corresponding to the behavior schedule information; and the reply information analysis means for analyzing the behavior of the other vehicle. The automobile is provided with a behavior determination means for determining the behavior of the own vehicle based on an analysis result by the reply information analysis means; and a behavior execution means for executing the behavior of the own vehicle determined by the behavior determination means.\n | 10. When it is determined that the other vehicle influence behavior can be safely executed according to the analysis result by the reply information analysis means, a confirmation notice showing that the behavior of the other vehicle determined corresponding to the behavior schedule information is confirmed; and the behavior of the other vehicle is confirmed. 
The automobile is provided with a confirmation notice means for notifying the other vehicle through the communication path.\n | 11. The behavior of the other vehicle involved in the other vehicle influence behavior of the own vehicle is specified from the behavior of the other vehicle detected by the analysis result by the reply information analysis means; and when it is determined that the other vehicle influence behavior can be safely executed by the specified other vehicle behavior, the other vehicle behavior is determined. The automobile is provided with a confirmation notice means for notifying the other vehicle of the confirmation notice indicating that the behavior of the other vehicle determined corresponding to the behavior schedule information is confirmed is provided.\n | 12. The behavior determined by the behavior determination means is an automobile described in either of claims 9 to 11, which is characterized in that the behavior of the other vehicle is notified by the result of analysis by the reply information analysis means.\n | 13. In the self-driving vehicle capable of autonomous traveling, the behavior determined by the behavior determining means is the behavior in which the behavior of the other vehicle in the analysis result by the reply information analyzing means is executed safely in the automatic operation, and is described in either of the claims 9 to the claim 11. ?\n | 14. The automobile is provided with a manual operation mode and an automatic operation mode in which autonomous traveling is possible, and is provided with an operation mode discriminating means for discriminating between the manual operation mode and the automatic operation mode; and the behavior determined by the behavior determination means is provided. 
When the operation mode determination means determines the manual operation mode, it is the behavior of notifying the behavior of the other vehicle as the result of the analysis by the reply information analysis means; and when the operation mode determination means determines the automatic operation mode, the operation mode determination means determines that the operation mode is the automatic operation mode. The behavior of the other vehicle in the result of the analysis by the reply information analysis means is a behavior for safely executing the behavior of the other vehicle in the automatic operation, and the automobile is described in either of the claims 9 to 11.\n | 15. During the execution of the behavior determined by the behavior determination means, the automobile is described in a claim 14 which is characterized in that it is prohibited from switching to the manual operation mode from the automatic operation mode.\n | 16. During the communication between the other vehicle and the vehicle, the automobile is described in a claim 14 or claim 15 characterized in that it is prohibited from switching from the automatic operation mode to the manual operation mode.\n | 17. The behavior determination means acquires information on the traveling speed of the other vehicle in the inter-vehicle communication with the other vehicle, and based on the traveling speed of the other vehicle obtained from the acquired information and the traveling speed of the own vehicle. The behavior of the own vehicle is determined, and the behavior of the vehicle is described in either of claims 9 to 16.\n | 18. This vehicle is provided with a camera for photographing the periphery of the own vehicle; the behavior determination means recognizes a traffic sign and/or a traffic sign from a photographed image of the camera; and determines traffic regulation around the own vehicle based on the recognition result. 
In the automobile, the behavior of the own vehicle is determined in consideration of the discriminated traffic regulation.\n | 19. The automobile is provided with a camera for photographing the periphery of the own vehicle, and the behavior determination means determines the peripheral situation of the own vehicle from the photographed image of the camera, and determines the behavior of the own vehicle in consideration of the discriminated peripheral situation.\n | 20. The behavior schedule information includes information for specifying the own vehicle to the other vehicle, and the reply information from the other vehicle includes information for the own vehicle to specify the other vehicle, and the automobile is described in either of the claims 9 to 19.\n | 21. This system is provided with a camera for photographing the periphery of one's own vehicle and/or a microphone for collecting sound around the own vehicle, and a means for detecting an emergency vehicle from a photographed image around one's own vehicle photographed by the camera and/or voice around the own vehicle collected by the microphone. In the behavior determination means, the automobile is described in either of claims 9 to 20, which determines the behavior of the self-vehicle with the priority of the emergency vehicle.\n | 22. The information for specifying the own vehicle to the other vehicle includes: the present position information of the own vehicle; and feature information capable of specifying the own vehicle from the photographed image of the camera; and the present position information of the other vehicle is included in the information for specifying the other vehicle by the own vehicle. The automobile is characterized in that it includes feature information capable of specifying the other vehicle from a photographed image of the camera.\n | 23. 
The automobile according to any one of claims 1 to 22, wherein the behavior schedule information includes information for the other vehicle to specify the own vehicle.\n | 24. The automobile wherein the information for the other vehicle to specify the own vehicle includes the present position information of the own vehicle and feature information by which the own vehicle can be specified from a photographed image of a camera.\n | 25. The automobile according to any one of claims 1 to 24, provided with: a means for receiving, through the radio communication part, response information which is sent from the other vehicle in response to the inter-vehicle communication request and which includes information on whether the other vehicle is an automatic driving vehicle that autonomously performs the behavior of its own vehicle, or is in a state of an automatic operation mode for autonomously performing the behavior of its own vehicle; and a discrimination means for discriminating, from the received response information, whether the other vehicle is such an automatic driving vehicle or is in the state of the automatic operation mode; wherein the communication path generation means generates, based on the discrimination result of the discrimination means, a communication path with the other vehicle discriminated to be an automatic driving vehicle that autonomously performs the behavior of its own vehicle, or to be in the state of the automatic operation mode for autonomously performing the behavior of its own vehicle.
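The discrimination step of claim 25 amounts to a small filter over the response information: only responders that report themselves as autonomous driving vehicles, or as currently being in the automatic operation mode, get a communication path. A minimal sketch, assuming a dict-shaped response message whose field names (`vehicle_id`, `is_autonomous`, `mode`) are illustrative inventions, not taken from the patent:

```python
# Hypothetical discrimination over response information (claim 25).
# Field names are assumptions for illustration only.
def is_autonomous_peer(response):
    """True if the responder drives itself or is in automatic mode now."""
    return bool(response.get("is_autonomous")) or response.get("mode") == "automatic"

def paths_for(responses):
    """Generate communication paths only toward autonomous responders."""
    return [r["vehicle_id"] for r in responses if is_autonomous_peer(r)]
```

A manually driven responder is simply skipped, which matches the claim's point that the path generation is conditioned on the discrimination result.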
| 26. The automobile provided with a vehicle-to-vehicle communication request button, wherein, when the inter-vehicle communication request button is operated, the inter-vehicle communication request means transmits a vehicle-to-vehicle communication request to the other vehicles in the periphery through the radio communication part.\n | 27. The automobile according to any one of claims 1 to 26, wherein, when a disconnection request for the communication path is received from the other vehicle through the communication path generated by the communication path generation means, the communication path with that vehicle is cut.\n | 28. The automobile according to any one of claims 1 to 27, provided with a means for discriminating whether or not the execution of the other vehicle influence behavior has been completed, wherein, when it is determined that the execution of the other vehicle influence behavior has been completed, an execution end notification is transmitted to the other vehicle through the communication path, and the communication path is thereafter cut off.\n | 29. The automobile according to any one of claims 1 to 28, wherein the other vehicle influence behavior differs according to whether traffic is on the left side or on the right side.\n | 30. The automobile provided with a present position detecting means, which determines whether traffic is on the left side or the right side according to the country or region specified based on the present position detected by the present position detecting means, and sets the other vehicle influence behavior on the basis of the determined result.\n | 31. The automobile according to any one of claims 1 to 30, wherein the other vehicle influence behavior is set on the basis of the presence or absence of a traffic signal.
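Claims 27 and 28 describe two ways a communication path ends: the peer requests disconnection, or the own vehicle finishes the other vehicle influence behavior, sends an execution end notification, and then cuts the path. A hedged sketch of that lifecycle, with an invented `CommunicationPath` class standing in for the radio layer (the class, method names, and message strings are assumptions):

```python
# Illustrative communication-path lifecycle (claims 27-28).
class CommunicationPath:
    def __init__(self, other_vehicle_id):
        self.other_vehicle_id = other_vehicle_id
        self.open = True
        self.sent = []  # messages sent over this path

    def send(self, message):
        if not self.open:
            raise RuntimeError("path already cut")
        self.sent.append(message)

    def on_disconnect_request(self):
        """Claim 27: cut the path when the peer asks to disconnect."""
        self.open = False

    def on_behavior_completed(self):
        """Claim 28: notify execution end, then cut the path."""
        self.send("EXECUTION_END")
        self.open = False
```

Either exit leaves the path closed, so a later `send` fails fast instead of transmitting on a dead link.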
| 32. The automobile provided with a camera for photographing the periphery of the own vehicle, wherein the presence or absence of the signal is discriminated from a photographed image of the camera.\n | 33. The automobile provided with a communication means for receiving a radio wave from a traffic signal, wherein the presence or absence of the signal is discriminated from the information of the radio wave received by the communication means.\n | 34. The automobile according to any one of claims 1 to 33, wherein the other vehicle influence behavior is set on the basis of the priority relation between vehicles.\n | 35. The automobile provided with a camera for photographing the periphery of the own vehicle, wherein the priority relation between vehicles is discriminated from a photographed image of the camera.\n | 36. The automobile provided with a present position detecting means, wherein the priority relation between vehicles is determined according to the country or region specified based on the present position detected by the present position detecting means, and the other vehicle influence behavior is set based on the determination result.\n | 37. The automobile provided with a camera for photographing the periphery of the own vehicle, wherein the vehicle-to-vehicle communication request means makes an inter-vehicle communication request to the other vehicles in the periphery when it detects recognition of a prescribed situation from a photographed image of the camera.\n | 38. The automobile according to claim 37, characterized in that the prescribed situation includes entering an intersection, approaching a highway, or entering the rotary of a station.
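The "prescribed situation" check of claims 37 and 38 reduces to asking whether any situation recognized in the camera image belongs to the listed set (intersection entry, highway approach, station rotary entry). A minimal sketch, where the label strings and the recognizer-output interface are assumptions of this illustration:

```python
# Hypothetical trigger check for the prescribed situations of claim 38.
# Label names are invented for illustration.
PRESCRIBED_SITUATIONS = {"intersection_entry", "highway_approach", "station_rotary"}

def should_request_communication(recognized_labels):
    """True when any camera-recognized label is a prescribed situation."""
    return bool(PRESCRIBED_SITUATIONS & set(recognized_labels))
```

The same predicate shape extends naturally to the other request triggers of the following claims (beacon reception, backlight, sudden accelerator, sudden braking).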
| 39. The automobile according to any one of claims 1 to 38, wherein the inter-vehicle communication request means makes a communication request to the other vehicles in the vicinity when it detects that radio wave information from a beacon installed on the road has been received.\n | 40. The automobile according to any one of claims 1 to 39, characterized in that the inter-vehicle communication request means makes a vehicle-to-vehicle communication request to the other vehicles in the periphery when lighting of the backlight is detected.\n | 41. The automobile according to any one of claims 1 to 40, characterized in that the inter-vehicle communication request means makes a vehicle-to-vehicle communication request to the other vehicles in the periphery when a sudden accelerator operation is detected.\n | 42. The automobile according to any one of claims 1 to 41, characterized in that the inter-vehicle communication request means makes a communication request to the other vehicles in the periphery when lighting of the brake light due to sudden braking is detected.\n | 43. The automobile provided with a communication monitoring part which, when trigger information is detected, transmits a notification of the occurrence of a transmission trigger to the inter-vehicle communication request means.\n | 44. The automobile provided with a reception processing part for the inter-vehicle communication request, wherein the communication monitoring part starts the reception processing part when reception of an inter-vehicle communication request from another vehicle is detected.\n | 45. The communication monitoring part is characterized in that, after the processing of the inter-vehicle communication request is completed, the processing for detecting the trigger information is resumed.\n | 46.
The communication monitoring part, after finishing the processing of the inter-vehicle communication request and after the processing in the reception processing part has ended, resumes the processing for detecting the trigger information and, at the same time, resumes the processing for monitoring the reception of a vehicle-to-vehicle communication request from another vehicle.\n | 47. A program for an automobile which causes a computer of an automobile provided with a transmission/reception function and a radio communication part for performing radio communication between vehicles to function as: a car navigation function part which forms route guidance data including trigger information for inter-vehicle communication, corresponding to points on the guidance route which require execution of an other vehicle influence behavior affecting the behavior of the other vehicles in the periphery; a vehicle-to-vehicle communication request means which, during execution of route guidance using the route guidance data by the car navigation function part, requests inter-vehicle communication with the other vehicles in the periphery through the radio communication part at a point on the route where the trigger information is detected; a communication path generation means which generates a communication path between the own vehicle and another vehicle responding to the inter-vehicle communication request; and a behavior schedule information transmitting means which transmits, through the communication path generated by the communication path generation means, behavior schedule information informing the other vehicle of the other vehicle influence behavior scheduled for execution.
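The communication monitoring part of claims 43 to 46 behaves like a small event loop: it watches for trigger events, notifies the inter-vehicle communication request means, services incoming requests via the reception processing part, and then resumes both kinds of monitoring. A minimal sketch under an assumed event-string interface; the event names and callback signatures are inventions of this example, not the patent's:

```python
# Illustrative communication monitoring part (claims 43-46).
# Event names and callbacks are assumptions for this sketch.
TRIGGER_EVENTS = {"beacon", "backlight_on", "sudden_accelerator", "sudden_brake"}

class CommunicationMonitor:
    def __init__(self, on_trigger, on_incoming_request):
        self.on_trigger = on_trigger                    # request means callback
        self.on_incoming_request = on_incoming_request  # reception processing part
        self.monitoring = True

    def handle_event(self, event):
        if not self.monitoring:
            return  # monitoring suspended while a request is being handled
        if event in TRIGGER_EVENTS:
            self.monitoring = False
            self.on_trigger(event)       # notify the request means (claim 43)
            self.monitoring = True       # resume trigger detection (claim 45)
        elif event == "incoming_request":
            self.monitoring = False
            self.on_incoming_request()   # start reception processing (claim 44)
            self.monitoring = True       # resume both monitors (claim 46)
```

Unrecognized events fall through untouched, so unrelated vehicle signals do not perturb the monitor's state.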
| 48. A navigation device for an automobile which is provided with a transmission/reception function, a radio communication part for performing radio communication between vehicles, and a vehicle-to-vehicle communication request means for making a communication request to the other vehicles in the periphery through the radio communication part based on trigger information, the navigation device being provided with: a means for forming route guidance data including the trigger information, corresponding to points on the guidance route which require execution of an other vehicle influence behavior affecting the behavior of the other vehicles in the periphery; and a means for notifying the inter-vehicle communication request means of the occurrence of the trigger information at a point on the route where the trigger information is detected while route guidance using the route guidance data is executed.\n | 49. The navigation device for an automobile, wherein the automobile is provided with a communication path generation means for generating a communication path between the own vehicle and another vehicle responding to the inter-vehicle communication request, and the navigation device is provided with a behavior schedule information transmission means for transmitting, through the communication path generated by the communication path generation means, behavior schedule information informing the other vehicle of the other vehicle influence behavior scheduled for execution.\n | 50. The navigation device for an automobile according to claim 48 or claim 49, wherein the trigger information is included at the same spot as the point where the other vehicle influence behavior is to be performed, or at a predetermined point before it on the guidance route.
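Claims 48 to 50 attach trigger information to points of the route guidance data, either at the spot where the other vehicle influence behavior occurs or a predetermined distance before it. The placement rule of claim 50 can be sketched as picking the route point nearest the (possibly advanced) target position; the `RoutePoint` structure, field names, and lead-distance parameter are assumptions of this illustration:

```python
# Hedged sketch of trigger placement in route guidance data (claim 50).
from dataclasses import dataclass
from typing import Optional

@dataclass
class RoutePoint:
    distance_m: float              # position along the guidance route
    trigger: Optional[str] = None  # trigger information, e.g. "right_turn"

def place_trigger(points, behavior_m, behavior, lead_m=0.0):
    """Attach trigger info at the behavior point itself (lead_m=0),
    or a predetermined distance before it, per claim 50."""
    target = behavior_m - lead_m
    nearest = min(points, key=lambda p: abs(p.distance_m - target))
    nearest.trigger = behavior
    return nearest
```

During guidance, reaching a point whose `trigger` is set would notify the inter-vehicle communication request means, as claim 48 describes.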
| 51. The navigation device for an automobile according to any one of claims 48 to 50, wherein the automobile has an automatic operation mode for autonomously performing the behavior of the own vehicle and is provided with a means for blinking or lighting control of indicators including a turn indicator, a hazard lamp, a backlight, and a brake light; in the automatic operation mode, the route guidance data includes, on the guidance route, information on blinking indication points and lighting indication points for the blinking or lighting control of the indicators; and the points on the route which include the trigger information are the same points as the blinking indication points or the same points as the lighting indication points.\n | 52. The navigation device for an automobile according to any one of claims 48 to 51, wherein the trigger information is composed of other vehicle influence behavior generation information including information on the other vehicle influence behavior to be generated.\n | 53. The navigation device for an automobile according to claim 52, characterized in that behaviors including at least one of the behavior of turning right or left, the behavior of changing lanes, the behavior of merging on a general road or a highway, the behavior of joining a line of vehicles in traffic, the behavior of entering an intersection, and the behavior of entering the rotary of a station are stored, and the information on the other vehicle influence behavior included in the other vehicle influence behavior generation information is information on a behavior selected from the stored behaviors.\n | 54.
The navigation device according to claim 52 or claim 53, wherein the other vehicle influence behavior generation information includes information on the point where the other vehicle influence behavior is performed.\n | 55. The navigation device for an automobile according to any one of claims 52 to 54, wherein the other vehicle influence behavior included in the other vehicle influence behavior generation information differs according to whether traffic is on the left side or on the right side.\n | 56. The navigation device for an automobile which determines whether traffic is on the left side or the right side according to the country or region specified based on the present position detected by the present position detection means, and sets the other vehicle influence behavior based on the determination result.\n | 57. The navigation device for an automobile according to any one of claims 52 to 56, wherein the other vehicle influence behavior included in the other vehicle influence behavior generation information is set on the basis of the presence or absence of a traffic signal.\n | 58. The navigation device for an automobile according to any one of claims 52 to 57, wherein the other vehicle influence behavior included in the other vehicle influence behavior generation information is set on the basis of the priority relation between vehicles.\n | 59. A computer of a navigation device for an automobile includes a transmission/reception function, a radio communication part for performing radio communication between vehicles, and a vehicle-to-vehicle communication request means for performing inter-vehicle