diff --git "a/v2x-try-40-summary-0901.json" "b/v2x-try-40-summary-0901.json"
new file mode 100644
--- /dev/null
+++ "b/v2x-try-40-summary-0901.json"
@@ -0,0 +1,197 @@
+[ + { + "input": "FULL-DUPLEX COMMUNICATION METHODS AND APPARATUS: Techniques are described for a full duplex communication method and apparatus for inter-vehicle communication (V2V). A communication apparatus includes one or more transmit antennas, one or more receive antennas, and a processor. For cases where a single transmit antenna and multiple receive antennas are used, a distance between the transmit and receive antennas is greater than a pre-determined value. Further, the transmit antenna is located on or in a central region of a top surface of the vehicle and the receive antennas are evenly distributed on the vehicle. The processor is configured to generate one or more messages to be transmitted via the transmit antenna, where the one or more messages include vehicle condition information, operational information about a driver of the vehicle, or information associated with one or more sensors of the vehicle.", + "output": "The communication apparatus has a transmit antenna (104) and receive antennas (106a-106d) that are located on or in a first side and a second side of the vehicle (102). A processor generates one or more messages to be transmitted via the transmit antenna. The messages include vehicle condition information, operational information about the driver of the vehicle, or information associated with one or more sensors of the vehicle. An INDEPENDENT CLAIM is included for a wireless communication method. Communication apparatus for full-duplex vehicle-to-vehicle (V2V) or device-to-device (D2D) communication. Assists the driver of the vehicle as the vehicle transmits information to or receives information from the vehicles surrounding the driver, and also assists the vehicle to operate in autonomous driving mode. The drawing is the schematic view of the communication apparatus for full-duplex vehicle-to-vehicle (V2V) or device-to-device (D2D) communication. 
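As a purely illustrative aside to the entry above (the patent does not specify any encoding), a minimal sketch of a V2V message carrying the three kinds of content it names — vehicle condition, driver operational information, and sensor readings — might look like this; all field names and the JSON serialization are assumptions.

```python
# Hypothetical sketch of the V2V message contents named in the preceding entry.
# Field names and the serialization format are illustrative assumptions.
from dataclasses import dataclass, field, asdict
import json
import time


@dataclass
class V2VMessage:
    vehicle_id: str
    timestamp: float = field(default_factory=time.time)
    # vehicle condition information
    speed_mps: float = 0.0
    heading_deg: float = 0.0
    brake_active: bool = False
    # operational information about the driver
    driver_attentive: bool = True
    # information associated with one or more sensors of the vehicle
    sensor_readings: dict = field(default_factory=dict)

    def to_bytes(self) -> bytes:
        """Serialize for transmission via the transmit antenna (encoding is an assumption)."""
        return json.dumps(asdict(self)).encode("utf-8")


if __name__ == "__main__":
    msg = V2VMessage(vehicle_id="102", speed_mps=13.9, sensor_readings={"front_radar_m": 42.0})
    print(msg.to_bytes())
```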
102Vehicle104Transmit antenna106a-106dReceive antenna", + "instruction": "Please summarize the input" + }, + { + "input": "Method, system and vehicle for controlling over-vehicle on ice and snow road of automatic driving vehicleThe invention claims an automatic driving vehicle ice and snow road overtaking control method, system and vehicle, firstly detecting whether the surrounding vehicle has overtaking intention; detecting the front and back position vehicle on the borrowing lane is a non-automatic driving vehicle or an automatic driving vehicle, and a non-automatic driving vehicle or automatic driving vehicle located on the to-be-executed overtaking vehicle overtaking route position, and the safe distance of the front vehicle position and the rear vehicle position on the borrowing lane, then sending signal to the surrounding vehicle, after executing the first lane change of the overtaking vehicle, judging whether the detecting road is the ice film road, if not, performing the second lane changing to finish the overtaking after the overtaking vehicle speed change driving exceeds the original lane, sending the over-vehicle signal to the surrounding vehicle, the surrounding vehicle recovers the original driving state, the invention controls the automatic driving vehicle and the surrounding automatic driving vehicle, reduces the uncertainty of the non-automatic driving vehicle in the vehicle process, improves the super-vehicle safety performance of the ice and snow road.|1. An automatic driving vehicle control method for snowy and icy road of automobile, wherein it comprises the following steps: firstly comparing the to-be-executed overtaking vehicle and the front vehicle speed, if it is greater than the front vehicle speed, carrying out overtaking, and then detecting whether the surrounding vehicle has overtaking intention; if there is, then to be executed overtaking vehicle deceleration or original state driving; if not, then detecting the front and back position vehicle on the borrowing lane is a non-automatic driving vehicle or an automatic driving vehicle, and a non-automatic driving vehicle or an automatic driving vehicle located on the vehicle overtaking route to be executed on the position, and the safety distance of the front vehicle position and the rear vehicle position on the borrowing lane, It includes the following three cases: The first situation: if the vehicle located at the front vehicle position and the rear vehicle position of the overtaking vehicle borrowing lane is a non-automatic driving vehicle, then the to-be-executed overtaking vehicle sends a first signal for prompting the non-automatic driving vehicle having an overtaking intention, detecting the front vehicle and the rear vehicle speed and does not change or decelerate, executing the first lane changing for the overtaking vehicle; The second situation: if the non-automatic driving vehicle and the automatic driving vehicle are respectively located at the front vehicle position and the rear vehicle position of the overtaking vehicle to be executed, the priority selection sends the second signal to prompt the automatic driving vehicle to change the speed, the overtaking vehicle is to be executed for the first lane change; The third scenario: if the automatic driving vehicle is respectively located at the front vehicle position and the rear vehicle position of the overtaking vehicle to be executed, detecting whether the road surface of the front vehicle and the rear vehicle is the ice film road surface, preferably 
selecting the second signal to prompt as the automatic driving vehicle changing speed of the non-ice film road surface, performing the first lane changing for the overtaking vehicle, in the three cases, when the first signal or the second signal is selected, the overtaking vehicle sends the third signal to the automatic driving vehicle located at the front of the front vehicle position or the automatic driving vehicle after the rear vehicle position, after the vehicle receives the third signal of the overtaking vehicle to be executed, detecting the current position of the vehicle, if the rear vehicle at the rear vehicle position is slowly decelerated, and sending the variable speed driving warning to the front and rear vehicles, if the vehicle in the front of the front vehicle position is slowly accelerated, and sending the variable speed driving early warning to the surrounding vehicle ; after executing the first lane change of the overtaking vehicle, judging whether the detecting road is ice film road, if so, performing the original state driving of the overtaking vehicle, if not, detecting whether the surrounding vehicle state is changed, if there is no change, performing the second lane changing to finish the overtaking after the overtaking vehicle speed changing running exceeds the original lane front vehicle, sending the over-vehicle signal to the surrounding vehicle, the surrounding vehicle recovers the original driving state.\n | 2. The overtaking control method for snowy and icy road of automatic driving automobile according to claim 1, wherein the judging process of the icy road surface is as follows: the camera of the automatic driving car is matched with the sensor to judge; the distance of the surrounding vehicle passes through the laser radar, the matching of the millimeter-wave radar and the camera can realize measurement; ground adhesion coefficient through the road passing state, the camera sensing and sensor measuring cooperation for judging.\n | 3. The overtaking control method for snowy and icy road of automatic driving automobile according to claim 1, wherein the safe distance of the vehicle and the surrounding vehicle is calculated by the following formula: wherein t1 and t2 are the time of the brake, S is the braking process driving distance, v driving speed, g gravity acceleration, μ road adhesion coefficient, s0 after braking distance from the front vehicle. when the automatic driving vehicle is to perform the overtaking operation, the condition that the vehicle should satisfy the vehicle in the super-vehicle borrowing lane is as follows: the vehicle driving condition with the super-vehicle borrowing lane: △S1 ?S; v is not less than v1; and the super-vehicle driving condition of the front vehicle borrowing lane: DELTA S2 IS NOT LESS THAN S; v is less than or equal to v2; wherein ΔS1 is the horizontal distance with the borrowing lane rear vehicle; v1 is the speed of borrowing lane back vehicle; wherein delta S2 is the horizontal distance with the borrowing lane front vehicle; v2 is the speed of borrowing lane front vehicle.\n | 4. The overtaking control method for snowy and icy road of automatic driving automobile according to claim 1, wherein said borrowing traffic lane can be the adjacent left side or the adjacent right side traffic lane.\n | 5. 
The overtaking control method for snowy and icy road of automatic driving automobile according to claim 1, wherein said first signal provides lane changing information to the surrounding vehicle in the same way as the non-automatic driving vehicle lane changing process.\n | 6. The overtaking control method for snowy and icy road of automatic driving automobile according to claim 1, wherein the second signal is the interactive vehicle information between the overtaking vehicle to be executed and the automatic driving vehicle at the upper front vehicle position or the rear vehicle position of the borrowing lane, the position and the speed of the vehicle. steering speed and steering time and so on.\n | 7. The method for controlling overtaking of ice and snow road surface of automatic driving automobile according to claim 1, wherein the third signal is the interactive vehicle information between the to-be-executed overtaking vehicle and the automatic driving vehicle located in front of the front vehicle or the automatic driving vehicle after the rear vehicle position, comprising a position, to be executed overtaking vehicle speed, front vehicle position speed or back vehicle position speed and steering time and so on.\n | 8. An automatic driving vehicle ice and snow road overtaking control system, comprising a vehicle controller, a V2V communication unit and a combined instrument; the vehicle controller is adapted to collect the position information of the vehicle, vehicle speed information and steering information; the V2V communication unit is adapted to transmit position information, vehicle speed information and steering information; the vehicle controller is adapted to according to the position information of each vehicle, vehicle speed information and steering information to generate super-vehicle intention signal; the combined instrument is suitable for displaying the corresponding overtaking information according to the intention of the overtaking.\n | 9. An automatic driving vehicle, comprising the automatic driving vehicle ice and snow road overtaking control system.", + "output": "The self-driving car overtaking control method involves comparing the speed of a vehicle to be overtaken with the vehicle in front, overtaking if it is greater than the speed of the vehicle in front, and detecting whether the surrounding vehicles have overtaking intentions. The overtaking vehicle is to be decelerated or driven in its original state. Determination is made to detect whether the front and rear position vehicles on the borrowed lane are non-autonomous driving vehicles or automatic driving vehicles. The first signal is sent to remind the non-autonomous vehicle that the vehicle has an overtaking intention if the vehicles in the front and rear positions of the borrowed lane of the vehicle to be overtaken are non-autonomous vehicles. The second signal is preferred to prompt the automatic driving vehicle to change speed if the non-autonomous and automatic driving vehicles are respectively located in the front and rear positions of the borrowed lane of the overtaking vehicle. INDEPENDENT CLAIMS are included for:(1) an overtaking control system for an automatic driving vehicle on icy and snowy roads;(2) an automatic driving vehicle. Self-driving car overtaking control method on icy and snowy roads. 
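The safe-distance formula referenced in claim 3 of this entry is not reproduced in the text. A minimal sketch, assuming a standard braking-distance model over the variables the claim names (t1, t2, v, g, μ, s0) and the two gap conditions it states (ΔS1 ≥ S with v ≥ v1 for the rear vehicle on the borrowed lane, ΔS2 ≥ S with v ≤ v2 for the front vehicle), might look like the following; all function and parameter names are mine, not the patent's.

```python
# Minimal sketch of the safe-gap check described in claim 3 above. The patent's exact
# formula is not reproduced in the text; this uses a standard braking-distance model
# under the variables it names (t1, t2, v, g, mu, s0). All names are assumptions.
G = 9.81  # gravity acceleration (m/s^2)


def braking_distance(v: float, t1: float, t2: float, mu: float, s0: float) -> float:
    """Distance travelled from brake onset to standstill plus the residual gap s0.

    v   driving speed (m/s)
    t1  reaction time (s)
    t2  brake force build-up time (s)
    mu  road adhesion coefficient (low on ice/snow)
    s0  desired remaining gap to the front vehicle after stopping (m)
    """
    return v * t1 + v * t2 / 2.0 + v * v / (2.0 * mu * G) + s0


def overtaking_allowed(v, v1, v2, gap_rear, gap_front, t1=0.5, t2=0.2, mu=0.2, s0=2.0):
    """Check both borrowed-lane gap conditions: rear vehicle (speed v1, gap gap_rear)
    and front vehicle (speed v2, gap gap_front)."""
    s = braking_distance(v, t1, t2, mu, s0)
    rear_ok = gap_rear >= s and v >= v1      # delta_S1 >= S and v >= v1
    front_ok = gap_front >= s and v <= v2    # delta_S2 >= S and v <= v2
    return rear_ok and front_ok


if __name__ == "__main__":
    print(overtaking_allowed(v=15.0, v1=12.0, v2=18.0, gap_rear=120.0, gap_front=150.0))
```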
The method reduces the uncertainty of the non-automatic driving vehicle in the process of driving through the cooperative control of the automatic driving vehicle and the surrounding automatic driving vehicles, and improves the safety performance of overtaking on ice and snow roads. The drawing shows a flow chart of a self-driving car overtaking control method on icy and snowy roads. (Drawing includes non-English language text).", + "instruction": "Please summarize the input" + }, + { + "input": "A non-human bus wire control chassis and automatic driving system thereofThe invention claims a non-human bus wire control chassis and automatic driving system thereof, a unmanned bus wire control chassis, comprising a chassis main body. In the application, the image and the 3 D laser front fusion sensing, for pedestrian and lane line environment detection to ensure the correct understanding and corresponding decision of the vehicle body surrounding environment of the automatic driving vehicle. identifying the road red street lamp by V2X intelligent network connection technology based on the 5 G communication, the technology by installing the signal emitter on the traffic light to continuously transmit the state information of the traffic light, automatically driving the vehicle by receiving the signal sent by the signal emitter to judge the state of the traffic light, using the MPC track tracking can make the vehicle travel along the predetermined track, the algorithm has excellent performance, the track of the track exhibits stable and accurate tracking capability, at the same time, it has enough real-time performance.|1. A non-human bus wire control chassis, comprising a chassis main body, wherein the chassis main body front side is top part mounted with 32-line laser radar, providing a horizontal 360 degrees, vertical 40 degrees of view, four corners of the chassis main body are fixedly mounted with 16-line laser radar, for making up the view blind area caused by 32-line laser radar height, finishing the monitoring area covering 360 degrees, the back side of the chassis main body, the outer side of the two groups of 16-line laser radar is fixedly installed with two groups of blind area auxiliary millimeter wave radar, the front side of the chassis main body is fixedly installed with a preposed millimeter wave radar, three surfaces of the chassis main body outer wall adjacent are distributed with 12 groups of ultrasonic radar, and the ultrasonic radar on the same surface are distributed in equal distance, the front side of the chassis main body is fixedly mounted with an industrial camera 1 and an industrial camera 2, the industrial camera is used for identifying the lane line and the traffic identification, the bottom of the chassis main body is fixedly mounted with an automatic driving calculation platform ADU, comprising an automatic driving operation platform controller, the bottom of the chassis main body is fixedly installed with a 5 G host and a 5 G antenna, and four sides of the chassis main body are respectively fixedly installed with four groups of 5 G cameras, one group of outer side of two groups of the 5 G antennas is provided with a combined navigation host fixedly connected with the chassis main body; 16-line laser radar, 32-line laser radar, preposed millimeter wave radar blind area auxiliary millimeter wave radar and ultrasonic radar can provide millions of data points per second, so as to create a three-dimensional map of surrounding object and environment, combining the 
auxiliary of combined navigation host and industrial camera, constructing high precision map needed by bus operation.\n | 2. The unmanned bus automatic driving system according to claim 1, wherein the unmanned bus control chassis according to claim 1, wherein it comprises a module software interface and hardware interface, further comprising: sensing algorithm module, which is detected by laser radar, camera detection, laser radar tracking, camera tracking and predicting five sub-modules. The camera detection mainly uses the laser radar obtain the high quality barrier information and detects the final result. the tracking result of each sensor will be fused by the filter algorithm, and the prediction module through the fusion tracking result, inputting and outputting the future track of each type of road participant; a locating module, the locating algorithm is mainly composed of a laser radar milemeter, combined navigation calculation and fusion three sub-modules. after outputting the relative positioning information of the fixed frequency, the fusion module combines the positioning information of different frequencies by filtering algorithm, finally outputting the global position of the fixed frequency, providing absolute positioning capability; a global path planning module, responding to the external routing request, giving an optimal route from the current position to the end point of the request; planning control module, the sensor receives the external information, and through the locating and sensing algorithm module, obtaining the state of the vehicle itself and the surrounding vehicle, and receiving the state of the external information and the vehicle, responsible for autonomous vehicle movement planning and trajectory tracking control; a decision planning module, receiving the real-time location, prediction information, planning according to the global path, combining obstacle avoidance, multiple factors, real time planning the vehicle future a collision-free track of a period of time, and sending to the track tracking controller to execute the vehicle; action prediction module, receiving the input of sensing and positioning module, responsible for giving the action of surrounding other participants 5s-7s the specific motion track, for decision planning module, track tracking control module, after the decision planning module gives the track of safety without collision, the track tracking control module is responsible for calculating the proper control command according to the current vehicle state and the planned track, so that the vehicle can move along the planned track.\n | 3. The unmanned bus automatic driving system according to claim 2, wherein the output frequency of the positioning module is the fixed frequency and is processed in the ADU of the automatic driving calculation platform.\n | 4. 
The unmanned bus automatic driving system according to claim 2, wherein the hardware interface comprises: a communication data interface: for interactive scheduling command, vehicle positioning, posture; sensor data interface: the combined inertial navigation system IMU and the automatic driving calculation platform ADU, using the USART interface of the IMU to transmit data; multi-line laser radar interface, millions of point cloud data per second, using UDP protocol for data transmission; the ultrasonic radar is a near-distance obstacle detection, the output result is a barrier distance, the data reading is performed by the CAN interface on the ultrasonic radar control box; a control data interface, an automatic driving operation platform ADU and vehicle control chassis interface, using the mode of CAN to transmit.\n | 5. The unmanned bus automatic driving system according to claim 2, wherein said module software interface comprises: sensor abstract layer service interface, providing two types of service interface, one is the information service interface of the intelligent sensor, and the other one is other vehicle sensor interface.\n | 6. The unmanned bus automatic driving system according to claim 2, wherein the laser radar mileage meter in the positioning module uses the GNSS data to finish the initialization, and the point cloud data generated by the laser radar is matched with the high precision map, and the absolute positioning information of the fixed frequency is output. combined navigation calculating module combined with GNSS data and IMU data, then outputting the relative positioning information of the fixed frequency.\n | 7. The non-human bus wire control chassis and automatic driving system thereof according to claim 2, wherein the radar detection and camera detection in the sensing algorithm module can be decoupled and used for tracking.", + "output": "The chassis has a chassis main body whose front side is fixedly mounted with first and second industrial cameras. A bottom of the chassis main body is fixedly mounted with an Automatic Drive Unit (ADU). A bottom of the chassis main body is fixedly installed with a fifth generation (5G ) host and a 5G antenna. Four sides of the chassis main body are fixedly installed with four groups of 5G cameras. Groups of the 5G antennas is provided with a combined navigation host fixedly connected with the chassis main body. An INDEPENDENT CLAIM is included for an automatic driving system. Non-human bus wire control chassis for an automatic driving system (claimed). The chassis enables Model predictive control (MPC) tracking to make the vehicle travel along the predetermined track, thus obtaining excellent performance, and stable and accurate tracking capability. The drawing shows a schematic view of a non-human bus wire control chassis. 
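The entry above states that MPC trajectory tracking keeps the bus on its predetermined track. As a rough illustration only (this is not the patent's controller), a sampling-based receding-horizon step over a kinematic bicycle model could look like the sketch below; the wheelbase, horizon length, and all names are assumptions.

```python
# Rough illustration of receding-horizon (MPC-style) trajectory tracking as mentioned
# in the entry above; this is NOT the patent's controller. It samples candidate
# steering angles over a short horizon on a kinematic bicycle model and keeps the one
# with the lowest tracking cost. All parameters and names are assumptions.
import math

WHEELBASE = 5.0   # m, assumed for a bus
DT = 0.1          # s, control period
HORIZON = 10      # prediction steps


def step(state, steer, v):
    """Kinematic bicycle model: state = (x, y, yaw)."""
    x, y, yaw = state
    x += v * math.cos(yaw) * DT
    y += v * math.sin(yaw) * DT
    yaw += v / WHEELBASE * math.tan(steer) * DT
    return (x, y, yaw)


def tracking_cost(state, steer, v, reference):
    """Accumulated squared distance to the reference waypoints over the horizon."""
    cost = 0.0
    for k in range(HORIZON):
        state = step(state, steer, v)
        rx, ry = reference[min(k, len(reference) - 1)]
        cost += (state[0] - rx) ** 2 + (state[1] - ry) ** 2
    return cost


def choose_steer(state, v, reference, candidates=21, max_steer=0.4):
    """Pick the constant steering angle with the lowest horizon cost."""
    return min(
        (-max_steer + 2 * max_steer * i / (candidates - 1) for i in range(candidates)),
        key=lambda s: tracking_cost(state, s, v, reference),
    )


if __name__ == "__main__":
    ref = [(i * 0.5, 0.1 * i) for i in range(1, 20)]  # gently curving reference path
    print(choose_steer(state=(0.0, 0.0, 0.0), v=5.0, reference=ref))
```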
(Drawing includes non-English language text).", + "instruction": "Please summarize the input" + }, + { + "input": "lane recognition system, method and automatic driving automobileThe invention embodiment claims a lane recognition system, method and automatic driving vehicle, the lane recognition system, comprising: at least one camera, at least one radar, data processing unit and a lane recognition unit; the weight ratio of the road edge feature the lane recognition unit is connected with the data processing unit for determining the road edge feature obtained by processing the image information and obtained by processing the radar information, and according to the weight proportion, the road edge feature obtained by processing the image information; road edge feature obtained by processing the radar information and the lane mark feature to identify the lane position. the technical solution of the invention realizes the camera and radar device data on the lane recognition fusion, so as to better deal with complex road condition and environmental interference, avoid the occurrence of dangerous accident and improve the safety of driving.|1. A lane recognition system, wherein, comprising: at least one camera for obtaining vehicle driving image information of lane in the path, at least one radar, for obtaining vehicle running radar information of the area, and a processing unit for: processing the image information to obtain the lane line feature and the road edge feature, processing the radar information to acquire the road edge feature, determining the weight ratio of road edge feature by the road edge feature processing obtained by the image information and information obtained by processing the radar; according to said weight proportion, road edge feature obtained by processing the image information, by processing road edge feature obtained by the radar information and the lane mark feature to identify the lane position.\n | 2. The system according to claim 1, wherein, further comprising: at least one illumination device, used for obtaining the vehicle driving the illumination intensity information of the area, wherein the processing unit is used for determining the weight proportion according to the illumination intensity information.\n | 3. The system according to claim 2, wherein it further comprises a vehicle-to-vehicle communication device, which is used for obtaining the vehicle driving road traffic information and auxiliary lane information of the area, wherein: the processing unit is used for according to the road traffic information and the illumination information to determine the road edge feature by processing the weight rate of said image information, obtained by processing the weight ratio of road edge feature obtained by the radar information, the auxiliary lane information of the weight proportion; road edge feature and according to the weight proportion, obtained by processing the image information by processing the road edge feature obtained by the radar information, the lane line feature, the auxiliary lane information to identify lane position.\n | 4. 
A lane recognition method, wherein the method comprises: according to the image information, obtaining the vehicle running lane in the lane route and lane road edge feature of the two side, wherein the image information collected by the camera by the installed on the vehicle, determining the weight ratio of the respectively obtained according to image information and radar information two sides of the lane of the road edge feature, wherein the radar information by mounting the radar acquisition of the vehicle according to the lane line feature. the weight proportion is, the road edge feature obtained by the image information and the road edge feature identifying lane position acquired by the radar information.\n | 5. The method according to claim 4, wherein it further comprises the following steps: obtaining the vehicle driving the illumination intensity information of the area according to the illumination intensity information to determine the road edge feature by processing the image information to that obtained by processing the weight ratio of road edge feature obtained by the radar information.\n | 6. The method according to claim 5, wherein it further comprises the following steps: obtaining the vehicle running road traffic information and auxiliary lane information of the region; The road traffic information and the illumination information to determine the road edge feature by processing weight ratio of the image information obtained by the weight ratio of the road edge feature obtained by processing the radar information, the auxiliary lane information of weight ratio, and the road edge feature according to the weight proportion, obtained by processing the image information, road edge feature obtained by processing the radar information, the lane line feature, the auxiliary lane information to identify lane position.\n | 7. The method according to claim 4, wherein said according to the image information, obtaining the vehicle running lane and a lane in the lane path at two sides of road edge feature, is implemented as: for enhancing white balance processing to the image information. the said image information into area according to the RGB value of the pixel point, gray processing the image information of the divided area, extracting the road feature, the road characteristic input deep learning model trained in advance, output lane and road edge feature.\n | 8. The method according to claim 4, wherein the radar information obtaining the lane road edge feature at two sides, is implemented as: performing filtering processing to the radar information, extracting the road edge feature.\n | 9. The method according to claim 4, wherein said lane characteristic according to the weight proportion, the road edge feature obtained by the image information and the road edge feature identification vehicle acquired by the radar information of position, comprising: calculating the lane width according to the lane mark feature according to the weight proportion, the road edge feature obtained by the image information and the road edge feature acquired by the radar information of the calculated road width; according to the lane width and the width of the road lane number calculation, and based on the lane width and the lane number identifying the lane position.\n | 10. An automatic driving automobile, comprising one of a lane recognition system according to any one of claims 1~3.", + "output": "The system has a camera for obtaining vehicle driving image information of lane in a path. 
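A hedged sketch of the weighted camera/radar road-edge fusion described in the lane-recognition claims above: road-edge estimates from both sensors are combined with weights that may depend on illumination, and lane width plus lane count are derived from the fused road width. The feature representation (edge lateral offsets in metres), the illumination rule, and all names are assumptions rather than the patent's implementation.

```python
# Hedged sketch of the weighted camera/radar road-edge fusion in the claims above.
# Feature representation, weighting rule, and names are assumptions.
def fusion_weights(illumination_lux: float):
    """More weight on radar edges in poor lighting, more on camera edges otherwise."""
    w_cam = min(1.0, illumination_lux / 1000.0)
    return w_cam, 1.0 - w_cam


def fuse_road_edges(cam_edges, radar_edges, illumination_lux):
    """cam_edges / radar_edges: (left_offset_m, right_offset_m) seen by each sensor."""
    w_cam, w_radar = fusion_weights(illumination_lux)
    left = w_cam * cam_edges[0] + w_radar * radar_edges[0]
    right = w_cam * cam_edges[1] + w_radar * radar_edges[1]
    return left, right


def identify_lane(cam_edges, radar_edges, lane_width_m, illumination_lux):
    """Estimate lane count and the ego lane index from fused road width and lane width."""
    left, right = fuse_road_edges(cam_edges, radar_edges, illumination_lux)
    road_width = right - left
    n_lanes = max(1, round(road_width / lane_width_m))
    ego_lane = min(n_lanes - 1, int(-left // lane_width_m))  # 0 = leftmost lane
    return n_lanes, ego_lane


if __name__ == "__main__":
    # ego vehicle about 5 m from the left road edge and 5 m from the right, at dusk
    print(identify_lane(cam_edges=(-5.2, 5.3), radar_edges=(-5.0, 5.5),
                        lane_width_m=3.5, illumination_lux=200))
```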
A radar obtains radar information of an area in which a vehicle is traveling. A processing unit processes image information to obtain lane line features and road edge features. The processing unit processes radar information to obtain road edge features. The processing unit determines a road edge feature obtained by processing image information and a weight ratio of a road edge feature obtained by processing radar information. The processing unit identifies a lane position by processing the road edge feature obtained by radar information and lane line feature. An INDEPENDENT CLAIM is also included for a lane recognition method. Lane recognition system. The system realizes data fusion of the camera and the radar device in lane recognition, so as to better deal with complex road conditions and environmental disturbances, avoids the occurrence of dangerous accidents, and improves the safety of driving. The drawing shows a block diagram of a lane recognition system. '(Drawing includes non-English language text)'", + "instruction": "Please summarize the input" + }, + { + "input": "A formation control system of automatic driving vehicleThe application model claims a formation control system of automatic driving vehicle, the formation control system of the automatic driving vehicle comprises a vehicle end and a road end, the vehicle end comprises a vehicle unit and a domain controller, the road end comprises a road side unit, a vehicle unit, for obtaining the sensor data of the vehicle end and the vehicle road cooperative data obtained by the V2X communication mode, and sending the sensor data of the vehicle end and the road end of the vehicle road cooperative data to the domain controller; domain controller, used for performing fusion processing for the received data, and performing formation control of automatic driving vehicle according to the fusion processing result; a road side unit, for obtaining the road cooperation data of the road end and sending to the vehicle end through the V2X communication mode. The application realizes the full link design of automatic driving vehicle formation control based on V2X communication, by combining the sensor data of the vehicle end with the road of the road end, providing more abundant, more reliable judging basis for the formation decision planning, improving the formation control precision.|1. A formation control system of automatic driving vehicle, wherein the formation control system of the automatic driving vehicle comprises a vehicle end and a road end, the vehicle end comprises a vehicle-mounted unit and a domain controller, the road end comprises a road side unit, the vehicle-mounted unit, for obtaining the sensor data of the vehicle end and the vehicle road cooperative data of the road end by the V2X communication mode, and sending the sensor data of the vehicle end and the vehicle road cooperative data of the road end to the domain controller; the domain controller is used for performing fusion processing to the sensor data of the vehicle end and the road end of the road end, and according to the fusion processing result for automatically driving the formation control of the vehicle; the road side unit is used for obtaining the road coordinate data of the road end and sending it to the vehicle end through the V2X communication mode.\n | 2. 
The formation control system of automatic driving vehicle according to claim 1, wherein the domain controller is further used for: performing target prediction according to the fusion processing result; and performing formation control of the automatic driving vehicle according to the target prediction result.\n | 3. The formation control system of automatic driving vehicle according to claim 1, wherein the domain controller is further used for: fusing the data of each sensor of the vehicle end, obtaining the sensor data after fusion; integrating the converged sensor data with the road-end vehicle-road cooperative data to obtain the final fusion processing result.\n | 4. The formation control system of automatic driving vehicle according to claim 1, wherein the domain controller is further used for: determining whether the bicycle can be used as a pilot vehicle according to the fusion processing result; under the condition that the bicycle can be used as a pilot vehicle, based on self-vehicle for formation decision planning, generating formation decision planning task.\n | 5. The formation control system of automatic driving vehicle according to claim 4, wherein the domain controller is further used for: executing the formation decision planning task, and obtaining the self-vehicle fleet according to the execution result of the formation decision planning task; controlling the self-vehicle fleet according to the preset fleet driving strategy for driving.\n | 6. The formation control system of automatic driving vehicle according to claim 4, wherein the vehicle unit is further used for: sending the formation request to the surrounding vehicle corresponding to the bicycle through the V2X communication mode, so that the surrounding vehicle according to the formation request application added to the self-vehicle fleet; according to the response result of the surrounding vehicle to the formation request, updating the to-be-processed state list of the vehicle end, the to-be-processed state list comprises adding the vehicle queue list, member list and leaving the vehicle queue list.\n | 7. The formation control system of automatic driving vehicle according to claim 6, wherein the vehicle unit is further used for: according to the response result of the surrounding vehicle to the formation request, determining the candidate surrounding vehicle; obtaining the vehicle information of the candidate surrounding vehicle by V2X communication mode, and sending the vehicle information of the candidate surrounding vehicle to the domain controller.\n | 8. The formation control system of automatic driving vehicle according to claim 7, wherein the domain controller is further used for: according to the vehicle information of the candidate surrounding vehicle, determining whether the candidate surrounding vehicle satisfy into the requirement of the fleet, under the condition that the candidate surrounding vehicle satisfy added with the requirement of the fleet, the candidate surrounding vehicle is used as the following vehicle to join the self-vehicle fleet.\n | 9. 
The formation control system of automatic driving vehicle according to claim 4, wherein the formation decision planning task comprises a vehicle fleet driving track, the domain controller is further used for: determining a current lane where the bicycle is located; according to the current lane of the bicycle and the driving track of the train, determining whether the bicycle needs to change the lane; under the condition that the bicycle needs to be changed, generating and executing the lane-changing track planning task, so that the bicycle is changed from the current lane to the target lane.\n | 10. The formation control system of the automatic driving vehicle according to any one of claims 1 to 9, wherein the vehicle road cooperation data is data obtained by sensing the surrounding environment in the preset range of the road side device, the vehicle road cooperation data comprises other traffic participation object data, traffic signal lamp data and road event data in the one kind of or more.", + "output": "The system has a vehicle end provided with a vehicle-mounted unit and a domain controller. A road end is provided with a road side unit. The vehicle-mounted unit is used for obtaining the sensor data of the vehicle end and the vehicle road cooperative data of the road end by the V2X communication mode and sending the sensor data of the vehicle end and the vehicle road cooperative data of the road end to the domain controller. The domain controller is used for performing the fusion processing to the sensor data of the vehicle end and the road end. The road side unit is used for obtaining the road coordinate data of the road end and sending to the vehicle end through the V2X communication mode. The domain controller is used for performing the target prediction according to the fusion processing result and performing formation control of the automatic driving vehicle according to the target prediction result. Formation control system for an automatic driving vehicle e.g. automatic driving bus and automatic driving lorry. The application realizes the full link design of automatic driving vehicle formation control based on V2X communication, by combining the sensor data of the vehicle end with the road of the road end, providing more abundant, more reliable judging basis for the formation decision planning, improving the formation control precision. The drawing shows a structure schematic diagram of a formation control system for an automatic driving vehicle e.g. automatic driving bus and automatic driving lorry. 
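Claims 6-8 of the formation-control entry above describe a to-be-processed state list (join list, member list, leave list) that the on-board unit updates from V2X formation responses, plus an admission check before a candidate joins the fleet. The sketch below illustrates that bookkeeping only; the message fields, admission criteria, and all names are assumptions.

```python
# Minimal sketch of the join / member / leave lists kept while handling V2X formation
# responses, as described in the claims above. Admission criteria are assumptions.
class PlatoonState:
    def __init__(self, leader_id: str):
        self.leader_id = leader_id
        self.join_requests = []          # vehicles that answered the formation request
        self.members = [leader_id]
        self.leaving = []

    def on_formation_response(self, vehicle_id: str, accepted: bool) -> None:
        """Record a surrounding vehicle's answer to the formation request."""
        if accepted and vehicle_id not in self.join_requests:
            self.join_requests.append(vehicle_id)

    def admit(self, vehicle_id: str, speed_mps: float, gap_m: float,
              leader_speed: float = 15.0, max_speed_diff: float = 2.0,
              min_gap: float = 10.0) -> bool:
        """Move a candidate to the member list if it meets the (assumed) fleet requirements."""
        ok = abs(speed_mps - leader_speed) <= max_speed_diff and gap_m >= min_gap
        if ok and vehicle_id in self.join_requests:
            self.join_requests.remove(vehicle_id)
            self.members.append(vehicle_id)
        return ok

    def leave(self, vehicle_id: str) -> None:
        if vehicle_id in self.members:
            self.members.remove(vehicle_id)
            self.leaving.append(vehicle_id)


if __name__ == "__main__":
    p = PlatoonState("bus-01")
    p.on_formation_response("bus-02", accepted=True)
    p.admit("bus-02", speed_mps=14.5, gap_m=18.0)
    print(p.members)  # ['bus-01', 'bus-02']
```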
(Drawing includes non-English language text).", + "instruction": "Please summarize the input" + }, + { + "input": "SYSTEM AND METHOD FOR LONGITUDINAL REACTION-CONTROL OF ELECTRIC AUTONOMOUS VEHICLEA PID automatic control system and method for speed control of an electric vehicle, according to an embodiment, design a commercial vehicle-based autonomous driving system and a controller for development of a semi-autonomous driving acceleration/deceleration controller, develop a driving priority determination algorithm on the basis of V2X communication, develop technologies for correcting autonomous navigation and improving position precision, develop a technology for recognizing autonomous driving road environments by using a cognitive sensor, conceive an application method of a driving habits improvement algorithm by using learning, develop a driving habits improvement algorithm by using learning, and develop an AEB function for a commercial vehicle to which a semi-autonomous driving technology is applied.|1. A PID automatic control system for semi-autonomous driving acceleration/deceleration control, comprising: a communication module for communicating with a nearby vehicle and a leading vehicle when a platooning group is formed around the vehicle;\na detection module for detecting obstacles on the front and rear sides of the vehicle, detecting surrounding vehicle information including the vehicle speed and driving path of the front vehicle, and detecting road information including stop lines, traffic lights, signs, and road curbs; \nThe vehicle speed is controlled according to the result of V2X communication with communication objects around the vehicle and information about surrounding vehicles and road information, and the amount of change in vehicle speed due to the change in pedal tilt is calculated and reflected in the proportional gain value and error calculation, and acceleration control response characteristics and deceleration control a semi-autonomous driving acceleration/deceleration control module for learning response characteristics and applying the result of learning response characteristics for each vehicle to a gain value calculation; \na deceleration/acceleration sensor that detects the inclination of the pedal and the amount of change in the inclination of the vehicle brake and accelerator; and driver-specific driving data and vehicle control data are collected and driver-specific learning data is applied to the vehicle, and driving habits are identified based on the learning data stored for each driver. 
a driving habit improvement learning module that improves driving habits through and the semi-autonomous driving acceleration/deceleration control module; is a gain calculator for calculating the difference between the target speed and the running speed, the amount of change in the running speed, and proportional gain (Kp), integral gain (Ki), and differential gain (Kd), which are proportional gains for PID calculation; \nan error amount calculator for calculating an error with a target speed after controlling the motor according to the calculated gain value; \na feedback unit for feeding back motor control by applying the calculated error to each gain value; And The detection module implements an autonomous driving navigation position correction algorithm using sensor information to correct the current position of the vehicle from a sensor including a LiDAR and a camera by comparing it with global coordinates, , Through camera coordinate system calibration using camera coordinate system calibration, 1:1 pixel coordinates of external parameters and internal parameters are matched, and the driving habit improvement learning module; the amount of deceleration in the silver curve is small, or the habit of rapidly accelerating when waiting for a signal If monitored, it feeds back to the semi-autonomous driving acceleration/deceleration control module to decelerate further than the amount of deceleration caused by the brake by the driver. PID automatic control system for semi-autonomous driving acceleration/deceleration control, characterized in that it decelerates. \n | 2. A PID automatic control method for semi-autonomous driving acceleration/deceleration control, comprising the steps of: (A) an autonomous driving vehicle communicating with a nearby vehicle and a leading vehicle when a platooning group is formed around the vehicle;\n(B) detecting obstacles on the front and rear sides of the vehicle in the autonomous vehicle and detecting surrounding vehicle information including the vehicle speed and driving path of the vehicle in front, and road information including stop lines, traffic lights, signs, and curbs; and (C) controlling the vehicle speed according to the result of V2X communication with the communication object around the vehicle in the autonomous vehicle, and information about the surrounding vehicle and the road; and (D) the driver's driving data and vehicle control data in the autonomous vehicle. It collects and applies learning data for each driver to the vehicle, identifies driving habits based on the learning data stored for each driver, and implements autonomous driving to improve driving habits through semi-autonomous driving when high-risk driving habits are identified. 
Including; and the step of (B); Detecting the inclination of the pedal and the amount of change in the inclination of the vehicle brake and accelerator in the deceleration and acceleration sensor; and calculating a change in vehicle speed due to a change in pedal inclination in the autonomous vehicle and reflecting it in calculating a proportional gain value and an error; comprising the step of (B); implements an autonomous driving navigation position correction algorithm using sensor information to compare and correct the current position of the vehicle from sensors including LiDAR and Camera with global coordinates, and calibrate the camera coordinate system using The pixel coordinates of the external parameter and the internal parameter are matched 1:1 through the camera coordinate system calibration, and the step of (C); is the difference between the target speed and the running speed, the amount of change in the running speed, and the proportional gain for PID calculation calculating a gain (Kp), an integral gain (Ki), and a differential gain (Kd); \ncalculating an error with a target speed after controlling the motor according to the calculated gain value; \nfeeding back the motor control by applying the calculated error to each gain value; \nlearning acceleration control response characteristics and deceleration control response characteristics; And Applying the response characteristic learning result for each vehicle to the gain value calculation; Including; Step of (D); When the amount of deceleration in the curve is small or the habit of sudden acceleration when waiting for a signal is monitored, the driver's brake PID automatic control for semi-autonomous driving acceleration/deceleration control, which feeds back to make the vehicle decelerate further than the amount of deceleration by method.", + "output": "The system has a communication module (110) that communicates with a nearby vehicle and a leading vehicle when platooning group is formed around the vehicle. A detection module (130) detects obstacles on the front and rear sides of the vehicle. The vehicle speed is controlled according to the V2X communication result with the communication object around the vehicle. A semi-autonomous driving acceleration and deceleration control module (150) learns acceleration control response characteristics and deceleration control response characteristics. A driving habit improvement learning module (170) improves driving habits through semi-autonomous driving. A feedback unit feeds back motor control by applying the calculated error to each gain value. The detection module implements autonomous navigation position correction algorithm using sensor information. An INDEPENDENT CLAIM is included for a proportional integral derivative (PID) automatic control method for semi-autonomous driving acceleration and deceleration control. PID automatic control system for semi-autonomous driving acceleration and deceleration control of 1-ton electric commercial vehicle. The fuel consumption caused by air resistance is reduced, thus improving fuel economy. The semi-autonomous speed control is more accurately performed according to the individual driving characteristics of the vehicle. The driver habit improvement module predicts collision by recognizing obstacle in front of the vehicle being driven by interlocking with the automatic emergency braking system and automatically applies the brake when the driver does not intervene to prevent collision. 
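The claims above name a proportional gain Kp, integral gain Ki, and differential gain Kd computed from the speed error and fed back to the motor control. A minimal discrete PID speed-control sketch of just that loop is shown below; the gain values, output limits, and toy plant are assumptions, and the patent's per-vehicle response-characteristic learning is not modelled.

```python
# Minimal discrete PID speed-control sketch matching the Kp/Ki/Kd, error-calculation and
# feedback steps named in the claims above. Gains, limits and the plant are assumptions;
# the per-vehicle response-characteristic learning described in the patent is omitted.
class SpeedPID:
    def __init__(self, kp=0.8, ki=0.1, kd=0.05, dt=0.1, out_limit=3.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_limit = out_limit          # max |acceleration command| in m/s^2
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_speed: float, measured_speed: float) -> float:
        """Return an acceleration command from the current speed error."""
        error = target_speed - measured_speed
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.out_limit, min(self.out_limit, u))


if __name__ == "__main__":
    pid, speed = SpeedPID(), 10.0
    for _ in range(50):                      # crude first-order plant: speed follows the command
        speed += pid.update(target_speed=15.0, measured_speed=speed) * pid.dt
    print(round(speed, 2))                   # approaches the 15 m/s target
```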
The drawing shows a block diagram of PID automatic control system for semi-autonomous driving acceleration and deceleration control. (Drawing includes non-English language text) 110Communication module130Detection module150Semi-autonomous driving acceleration and deceleration control module170Driving habit improvement learning module", + "instruction": "Please summarize the input" + }, + { + "input": "Interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusionThe invention claims an interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion, which mainly solves the problem that the existing automatic driving decision has low applicability under complex road structure and traffic light information condition. The method considers the multi-lane traffic environment under the world coordinate system, wherein there is the mixed traffic flow composed of the interconnected automatic driving vehicle and the human driving vehicle. Each CAV can obtain surrounding multimodal environmental features (such as lane information, HDV vehicle information, and traffic light information) through a vehicle-mounted sensor and an off-line high precision map. With the help of the vehicle-to-vehicle communication, the CAV can share its information and make a decision within a specified time step t. The aim of the method is to generate speed decision and steering angle decision for CAV. With such action decision, the automatic driving vehicle can safely and effectively drive according to the special route, at the same time, the comfort level of the passenger is greatly improved and the influence to the surrounding HDV is reduced.|1. An interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion, wherein it comprises the following steps: S1, collecting the vehicle dynamics information fA of the CAV through the vehicle GNSS and IMU sensor, and detecting the vehicle dynamics information fH of the HDV through the vehicle radar of the CAV; wherein CAV is automatic interconnected driving vehicle, HDV is human driving vehicle; S2, accurately locating the position and direction of the CAV, and identifying the road and traffic light near the CAV, so as to obtain the real-time road information of the CAV preset driving route; S3, each CAV respectively transmits its own vehicle dynamics information fA and the sensed vehicle dynamics information fH of HDV to an MLP, and splices the obtained result codes to form a vehicle code hi; and aggregating the CAV vehicle code based on the CAV communication link matrix M using the graph attention layer to obtain the vehicle flow information of the CAV preset travelling route; wherein MLP represents a multilayer perceptron; S4, adopting the multi-intelligent body strengthening learning algorithm MAPAC training parameterized CAV action structure to obtain the optimal action strategy, and adopting the random Gaussian strategy to improve the searching ability of the algorithm, so as to realize the action decision of CAV; S5, setting the self-central reward function for improving the safety, efficiency and comfort of the CAV and the social influence reward function for reducing the negative influence to the surrounding HDV to optimize the action decision of the CAV.\n | 2. 
The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 1, wherein in the step S1, the vehicle dynamics information fA of CAV comprises vehicle speed, direction, length, width, a lane ID and an offset from the lane centre line; The vehicle dynamics information fH of the HDV includes the relative distance, speed, direction of the HDV relative to the CAV and the lane mark and lane centre line offset of the HDV.\n | 3. The interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion according to claim 2, wherein in the step S2, the CAV preset driving route is composed of multiple lanes of roads. the characteristic fL of the road is represented by the lane track point, wherein each track point comprises the lane horizontal height, namely the lane gradient change, the direction and the lane ID; the traffic light information uses the detection technology based on the camera to detect the real-time state and the distance from the vehicle; wherein red is [1, 0, 0], green is [0, 1, 0], yellow is [0, 0, 1]; if the characteristic fL of the road and the traffic light state fTL are coded by a road encoder, Er, i= sigma (phi (fL, L, fTL, L) | phi (fL, C, fTL, C) | phi (fL, R, fTL, R)), wherein Er, i represents road coding, phi represents MLP layer, sigma is ReLU activation function, fL, L, fTL, L represents the left side lane code and traffic light code of the lane where the vehicle is located, fL, C, fTL, C represents the lane code and traffic light code of the lane where the vehicle is located, fL, R, fTL, R represents the right lane code and the traffic light code of the lane where the vehicle is located, and the absolute value is the connection operation; for each intelligent agent, the attention point is only limited to the lane characteristic of the current lane, the red-green lamp and two adjacent lanes; finally connecting the traffic flow code Et, i and the road code Er, i to obtain the state code Es, i for the final road information code operated by the subsequent module.\n | 4. 
The interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion according to claim 3, wherein in the step S3, in the CAV vehicle coding based on CAV communication link matrix M, according to the attention mechanism, Each intelligent agent i in the vehicle communication network calculates the query vector qi, the key vector ki and the value vector vi are listed as follows: qi = Wqhi, ki = Wkhi, vi=Wvhi, in the formula, Wq represents query matrix, Wv represents value matrix, Wk represents key matrix, hi is vehicle code; Assuming that the intelligent agent i has Ni adjacent intelligent agents, the attention score a ij of the intelligent agent to the adjacent intelligent agent j can be calculated as: wherein sigma is the activation function ReLU; LeakyReLU represents a LeakyReLU activation function, exp represents an exponential operation operation, and l represents one of Ni adjacent intelligent bodies; Due to the change of the traffic environment, the intelligent body which lost the communication connection with it in the current time step is filtered, and the final traffic flow code Et is calculated in combination with the CAV link matrix, and i is listed as follows: wherein phi represents the MLP layer, Mi, j are the values of the link matrix, Mi, j = 0 represents that there is no connection between the agent i and the agent j in the current time step, and vice versa; Wherein, the intelligent agent is CAV.\n | 5. The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 4, wherein in the step S4, the multi-intelligent body strengthening learning algorithm MAPAC uses the actor commentator structure in strengthening learning, wherein the actor network is used for calculating the action, the commentator network is used for evaluating the action through the estimation value function; the random Gaussian network replaces the original depth Q network, the random Gaussian network outputs a Gaussian distribution, the intelligent agent samples from the distribution to form parameterized continuous action; Wherein, in the model training process using the multi-agent enhanced learning algorithm MAPAC, the actor network pi i of the agent i updates the network by minimizing the following objects: wherein the experience buffer D is used for storing the state and action of all interconnected intelligent bodies, and Q (beta) represents the network parameter of the commentator; lambda is the regularization coefficient of the search performance of the control algorithm, and respectively representing the state information and action information of the intelligent agent i in the time step t, an actor network representing the parameters of the agent i; the combined set of the state and action of the interconnected intelligent body is used as the input of the commentator network, the network then outputs the Q value of the action taken by the intelligent body i in the time step t; The critic network is updated by minimizing the following Berman error JQ: wherein gamma is rewarding discount factor, ri is instant rewarding of t time step; Two target commentator networks for stable training process, when executing, each intelligent body operates the copy of the respective actor network and commentator network, namely distributed execution; the intelligent agent i only needs to obtain the observed traffic environment information and performs information enhancement through the 
shared information from the interconnected intelligent agent, and then calculates the final parameterized action based on the fused information; finally, selecting the action with the maximum Q value as the actually executed action; All CAVs in the Internet follow the process described above to generate their respective action decisions.\n | 6. The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 5, wherein in the step S5, the self-central reward function and the social influence reward function form a mixed reward function. The expression is as follows: In the formula, rego represents a self-centered reward, and rsoc represents a social impact reward; it is used for quantifying the cooperation degree between the automatic driving vehicle union and the human driving vehicle.\n | 7. The interconnected automatic driving decision method based on cooperative sensing and self-adaptive information fusion according to claim 6, wherein the expression of the social influence reward function is as follows: FORMULA. In the formula, rsoc, 1 is used to quantify the incidence of sudden parking of CAV or sudden cut-in of corresponding lane, and the expression is as follows: rsoc = rsoc, 1 + rsoc, 2, wherein rsoc, 1 is used to quantify the incidence of sudden parking of CAV or sudden cut-in of corresponding lane, and the expression is as follows: In the formula, represents the speed of the HDV in the time step t, thrvel is a threshold value of the speed change, for determining whether the CAV causes the quick brake action of the HDV, thracc* delta t is the speed change threshold value between two continuous time steps; The reward is only effective when the HDV deceleration is greater than thrvel; rsoc, 2 is used for quantizing CAV to adjust its speed or position so as to reserve incidence rate of variable track space behaviour for HDV, the expression is as follows: wherein is the adjacent HDV of the agent i in the time step t; when the vehicle of the adjacent lane is in front of the CAV or behind the CAV safe lane change,/> set as 1; In other cases, is set as 0.\n | 8. The interconnected automatic driving decision method based on collaborative sensing and self-adaptive information fusion according to claim 7, wherein the expression of the self-central reward function is as follows: FORMULA. rego=rsaf + reff + rcom, in the formula, rsaf is security reward, reff is efficiency reward, rcom is passenger comfort reward; wherein the security reward is the reward sum of CAV unsafe behaviour and traffic rule compliance rate; the vehicle-following safety uses the predicted collision time TTC to ensure the safe vehicle-following distance of the CAV; The TTC calculation formula is as follows: In the formula, fA. 
fA.level and fH.level represent the speeds of the CAV and the HDV, respectively, and dis(A, H) represents the Euclidean distance between A and H; the car-following safety reward rsaf,1 is calculated using a threshold value of the TTC; secondly, the lane-keeping safety reward keeps the CAV travelling in the centre of the lane, and is calculated using dis(wp, A), which measures the distance between the CAV and the centre point of the lane, where d is half of the lane width; the emergency-safety term penalises a CAV collision, a road-departure action or a traffic-signal violation, and is 0 in all other conditions; efficiency reward: the efficiency of the multi-lane task is the sum of the speed-control efficiency and the lane-changing efficiency; the speed-control efficiency reff,1 encourages the automatic driving vehicle to keep a safe driving speed, where fA.level represents the speed of the automatic driving vehicle and velmax is the maximum driving speed set for the vehicle; the lane-change reward reff,2 encourages the vehicle to overtake and to avoid obstacles, and is calculated after the lane change is completed, where dis(Htar, A) and dis(Hprev, A) represent the distance of the vehicle from an obstacle or a forward vehicle on the target lane and on the previous lane, respectively; passenger comfort reward: measured using the vehicle acceleration change rate Jerk, where acct is the acceleration of the vehicle at time step t and delta t is the length of each time step; rcom is calculated from Jerk, where thracc is the maximum allowed Jerk value.", "output": "The method involves collecting vehicle dynamics information of a vehicle through a vehicle global navigation satellite system (GNSS) and an inertial measurement unit (IMU) sensor. A self-central reward function is set for improving safety, efficiency and comfort of a CAV and a social influence reward function for reducing negative influence to a surrounding HDV to optimize an action decision of the CAV. Cooperative sensing and self-adaptive information fusion based interconnected automatic driving decision method for automatic driving vehicles e.g. human driving vehicle (HDV) and connected autonomous driving vehicle (CAV). The automatic driving vehicle can safely and effectively drive according to the specified route, at the same time, the comfort level of the passenger is greatly improved and the influence on the surrounding HDV is reduced. The drawing shows an overall frame diagram of a cooperative sensing and self-adaptive information fusion based interconnected automatic driving decision method. (Drawing includes non-English language text).", "instruction": "Please summarize the input" }, { "input": "Intelligent cruise robot reverse searching system and method based on cloud wireless transmissionThe invention relates to the technical field of reverse vehicle searching in parking lots, specifically to an intelligent cruise robot reverse searching system and method based on cloud wireless transmission. The system comprises four parts: an intelligent cruise robot terminal, a parking space identification terminal, a cloud terminal and a reverse searching inquiry terminal. The system is a full-automatic intelligent robot device with vehicle position identification, license plate automatic identification and mobile walking functions.
The searching robot device integrates automatic control, video recognition analysis, autonomous driving, wireless communication and algorithm based on artificial intelligence, realizing fast and accurate recognition of license plate and parking space and issuing the information for searching vehicle position.|1. An intelligent cruise robot reverse searching system based on cloud wireless transmission, wherein the system comprises the following four parts: intelligent cruise robot end of the intelligent cruise robot end comprises one or more intelligent cruise robot; each intelligent cruise robot comprises a high-definition camera and a license plate number recognition software, automatically control the cruise module, universal high-performance wireless communication transmission module, server module and a battery drive module, the high definition camera and license plate number recognition software uses video analysis algorithm for license plate number and the carport associated identification of the vehicle, and the computer visual detection; the automatic control cruise module for controlling the intelligent cruise robot walking and obstacle recognition avoidance function, the module through ultrasonic wave radar plus video visual detection of two kinds of technology, it can detect the size and distance of the obstacle object of all ultrasonic radar in the walking. after calculating controlling the robot to perform the corresponding avoid and special marks (such as white lines) video visual detection may be based on the path of the robot intelligent cruise control according to the route automatic walking; the wireless communication module adopts 4 G internet of things technology responsible for identifying the licence plate number and the associated parking space information to the cloud; the generic high performance server module is used for license plate number recognition software, automatically control the cruise module, wireless communication transmission module provides hardware computing and memory function; the battery drive module for providing intelligent cruise robot driving walking power and to charge the battery under the condition of low battery, parking mark end must be a parking mark end formed by the position identification device for associating the license plate of parking and stopping vehicle, wherein identification device adopts two technical methods for parking space recognition, and the two technical means also determines the working mode and working state of the intelligent cruise robot: A smart sign technology using the intelligent vehicle is mounted below the stall, integrated with an automatic induction device, wireless communication transmission device and the parking state display device; automatic sensing device adopts low power consumption communication technology, when the intelligent car label after sensing the vehicle parking the parking state display device from a vehicle green light into red light of a vehicle and the wireless communication transmission device to send a wireless signal to the intelligent cruise robot; the intelligent cruise robot working mode is driven, robot parked at the appointed place, after the wireless signal or more intelligent receiving the intelligent car label sent by the car label of white lines, road is fast by no change of vehicle area to stall before collecting the vehicle license plate number, B adopts traditional ground printed number technology using traditional printing mode, printing numbers on each parking space, 
the intelligent cruise robot working mode is active the cruise. Under this working mode, intelligent cruise robot according to the set time interval, driving the all vehicle in the area which is responsible for identifying one identifying vehicle license plate at the same time, the identification number of the ground printed. the two kinds of number associated to the cloud for issuance, cloud the cloud comprising one or more management servers, one or more indoor LED guide screen and one or more outdoor LED guide screen, the server cluster working mode, can support large data storage operation, providing the original learning data for future data mining and artificial intelligence algorithms, the server providing a plurality of external interfaces, which can remotely control one or more indoor LED guide screen, one or more outdoor LED guide screen. communicated through the local area network or the Internet server and the guide screen for multiple parking the parking information, receiving parking and vehicle information transmitted by multiple intelligent cruise robot, providing for indoor and outdoor LED guide screen to issue to realize centralized management and resource sharing of the vehicle information; reverse the searching inquiry user through reverse searching inquiry APP software or Minicell public number and cloud communication end, can query the vehicle position after inputting the inquiry condition.\n | 2. A searching method based on the reverse searching system according to claim 1, wherein, when the vehicle identification end is a traditional ground printing vehicle number, the method adopts the intelligent cruise robot via the driving mode to the parking space recognition, specifically comprising the following steps: S1. the parking space of the parking lot A and lot B printed with the traditional ground parking space number associated set walking route on the lane marked white cruise robot, S2. Intelligent cruise robot via preset time (such as every 5 minutes or 10 minutes), driving walking cruise on a predetermined area and route, simultaneous analysis of license plate and parking space number through video analysis and recognition in the walking process, S3. the analysis back to the license and parking space number, the license plate and parking space number information uploaded to the cloud through the wireless communication module of the robot, S4. cloud end after statistic analysis, the transmission guide screen, the vehicle is guided into position to guide indoor LED screen and outdoor LED parking information and vehicle location through the network, S5. APP or Minicell public number owner inquiry end through backward searching when the need to find vehicle to vehicle position inquiry so as to fast and conveniently to the navigation guidance.\n | 3. The searching method based on the reverse searching system according to claim 1, wherein when the vehicle identification terminal is intelligent sign, the method uses intelligent cruise robot parking space recognition by passive cruise mode, specifically comprising the following steps: S1. the parking space of the parking lot A and lot B is equipped with an intelligent vehicle, the intelligent vehicle is integrated with automatic induction device, the wireless communication transmission device and the parking state display device, then information associated with the parking space S2. 
when there is the vehicle parking position, in the intelligent vehicle automatic induction device utilizing photoelectric or electromagnetic induction technology, after sensing the vehicle parks, the intelligent vehicle position state display device of label index light from the non-green light into red light of a vehicle, the intelligent vehicle one or more label by the wireless communication transmission device and respective intelligent cruise robot communication, intelligent cruise robot receives the information, it will rapidly reach with change of the parking area, S3. cloud end after statistic analysis, the transmission guide screen, the vehicle is guided into position to guide indoor LED screen and outdoor LED parking information and vehicle location through the network, S4. APP or Minicell public number owner inquiry end through backward searching when the need to find vehicle to vehicle position inquiry so as to fast and conveniently to the navigation guidance.", + "output": "The system has an intelligent cruise robot for comprising a high-definition camera, a number plate, an universal high-performance wireless communication transmission module, a server module and a battery drive module. An automatic control cruise module controls intelligent cruise robot walking and obstacle recognition avoidance function. A server is connected with external interfaces to control an indoor LED guide screen. A cloud communication end determines a vehicle position by using reverse searching inquiry application (APP) software or public number after inputting inquiry condition. An INDEPENDENT CLAIM is also included for a cloud wireless transmission based intelligent cruise robot reverse searching method. Cloud wireless transmission based intelligent cruise robot reverse searching system. The system can automatically control video recognition analysis, autonomous driving, wireless communication and algorithm based on artificial intelligence and realize information in the number plate and a parking space after searching the vehicle position. The drawing shows a block diagram of a cloud wireless transmission based intelligent cruise robot reverse searching system. '(Drawing includes non-English language text)'", + "instruction": "Please summarize the input" + }, + { + "input": "Method for determining reliability of received dataThe invention relates to a computer-implemented method for determining the reliability level of the data received by the autonomous vehicle from the target vehicle, the target vehicle is different from the autonomous vehicle. The invention also relates to a corresponding control system and a computer program product.|1. A computer implemented method for determining the reliability level of the data received by the autonomous vehicle from the target vehicle, wherein the target vehicle is different from the autonomous vehicle and is arranged near the autonomous vehicle, the autonomous vehicle comprises a control unit, wherein the method comprises the following steps: and using wireless communication at the control unit, receiving a first set of operation data associated with the target vehicle, and determining the reliability level by the control unit based on the first set of operation data and a predetermined model of the expected behaviour of the first set of indicating operation data.\n | 2. The method according to claim 1, wherein the predetermined model is further dependent on the target vehicle.\n | 3. 
The method according to any one of claims 1 and 2, wherein the predetermined model further depends on the expected change of the first set of operation data over time.\n | 4. The method according to claim 1, further comprising the following steps: determining, by the control unit, a second set of operational data associated with the target vehicle using a first sensor included in the autonomous vehicle, and determining, by the control unit, a difference between the first set of operational data and the second set, wherein the determination of the reliability level is also based on the determined difference.\n | 5. The method according to claim 4, wherein the predetermined model further indicates an expected behaviour of the second set of operational data.\n | 6. The method according to claim 4, wherein the predetermined model further indicates an expected behaviour of the difference between the first set of operational data and the second set.\n | 7. The method according to any one of the preceding claims, further comprising the following steps: -only when the reliability level is higher than the first predetermined threshold value, defining the operation data from the target vehicle as reliable.\n | 8. The method according to any one of the preceding claims, further comprising the following steps: -if the reliability level is below a second predetermined threshold value, defining the operation data from the target vehicle as unreliable.\n | 9. The method according to claim 1, wherein the first set of operating data relates to at least one of a speed of the target vehicle, an acceleration, a deceleration, and the like.\n | 10. The method according to claim 2, further comprising the following steps: -determining an identifier of the target vehicle using a second sensor included in the autonomous vehicle.\n | 11. The method according to claim 4, wherein the predetermined model represents a statistical behaviour of the set of operational data.\n | 12. The method according to any one of the preceding claims, further comprising the following steps: -establishing a network connection with a server disposed outside the autonomous vehicle, and requesting the predetermined model from a remote server.\n | 13. The method according to claim 12, further comprising the following steps: -if the reliability level is higher than the third predetermined threshold value, providing the updated model.\n | 14. The method according to claim 4, wherein the first sensor is at least one of a radar, a laser radar sensor, or a camera.\n | 15. The method according to any one of the preceding claims, wherein the operation data is vehicle-to-vehicle (V2V) data.\n | 16. A control system comprised with an autonomous vehicle, the control system comprising a control unit which is suitable for determining the reliability level of the data received by the autonomous vehicle from a target vehicle, the target vehicle being different from the autonomous vehicle and arranged near the autonomous vehicle, wherein the control unit is adapted to: receiving a first set of operational data associated with the target vehicle using wireless communication, and determining the reliability level based on the first set of operational data and a predetermined model indicative of an expected behaviour of the first set of operational data.\n | 17.
The system according to claim 16, wherein the predetermined model is further dependent on the target vehicle.\n | 18. The system according to any one of claims 16 and 17, wherein the predetermined model further depends on the expected change of the first set of operation data over time.\n | 19. The system according to claim 16, wherein the control unit is further adapted to: -determining a second set of operational data associated with the target vehicle using a first sensor included in the autonomous vehicle, and determining a difference between the first set of operational data and the second set, wherein the determination of the reliability level is further based on the determined difference.\n | 20. The system according to claim 19, wherein the predetermined model further indicates an expected behaviour of the second set of operational data.\n | 21. The system according to claim 19, wherein the predetermined model further indicates an expected behaviour of the difference between the first set of operational data and the second set.\n | 22. The system according to any one of claims 16 to 21, wherein the control unit is further adapted to: -only when the reliability level is higher than the first predetermined threshold value, defining the operation data from the target vehicle as reliable.\n | 23. The system according to any one of claims 16 to 22, wherein the control unit is further adapted to: -if the reliability level is below a second predetermined threshold value, defining the operation data from the target vehicle as unreliable.\n | 24. A vehicle comprising the control system according to any one of claims 16 to 23.\n | 25. The vehicle according to claim 24, wherein the vehicle is a truck, a car, a bus or a working machine.\n | 26. A computer program product comprising a non-transitory computer readable medium on which is stored a computer program component for operating the control system included in the autonomous vehicle; said control system is suitable for determining the reliability level of the data received by the autonomous vehicle from the target vehicle, the target vehicle is different from the autonomous vehicle and is arranged near the autonomous vehicle, the control system comprises a control unit, wherein the computer program product comprises: a code for receiving, at the control unit, a first set of operational data associated with the target vehicle using wireless communication, and a code for determining the reliability level by the control unit based on the first set of operating data and a predetermined model of expected behaviour of the first set of operating data.", "output": "The method involves receiving, at the control unit and using wireless communication, a first set of operational data relating to the target vehicle. The reliability level is determined by the control unit based on the first set of operational data and a predetermined model indicative of an expected behavior of the first set of operational data. The predetermined model is further dependent on the target vehicle. The predetermined model is dependent on an expected variation over time of the first set of operational data. The method comprises determining, by the control unit, a second set of operational data related to the target vehicle using a first sensor comprised with the ego vehicle.
INDEPENDENT CLAIMS are included for the following:a control system comprised with an ego vehicle;a computer program product; anda vehicle comprising a control system. Method for use in determining a reliability level of data received by an ego vehicle from a target vehicle being different from the ego vehicle. Greatly improves the determination of the reliability level for the received data. The drawing shows a schematic view of a conceptual control system. 200Control system202Control unit204Radar206Lidar sensor arrangement208Camera", + "instruction": "Please summarize the input" + }, + { + "input": "Method and system of determining trajectory for an autonomous vehicleA method of determining a navigation trajectory for an autonomous ground vehicle (AGV) is disclosed. The method may include receiving first Region of Interest (ROI) data associated with an upcoming trajectory path, and receiving predicted attributes associated with a future navigation trajectory for the upcoming trajectory path. The predicted attributes are derived based on map for the upcoming trajectory path. The method may further include modifying the predicted attributes based on environmental attributes extracted from first ROI data to generate modified attributes, and dynamically receiving a second ROI data associated with the upcoming trajectory path upon reaching the upcoming trajectory path. The method may further include predicting dynamic attributes associated with an imminent navigation trajectory for the upcoming trajectory path based on the second ROI data, and refining the modified attributes based on the one or more dynamic attributes to generate a final navigation trajectory.What is claimed is: \n | 1. A method of determining a navigation trajectory for an autonomous ground vehicle (AGV), the method comprising: \nreceiving, by a navigation device, first Region of Interest (ROI) data associated with an upcoming trajectory path, \nwherein the first ROI data is captured using a set of vision sensors installed on one or more road side infrastructures over vehicle-to-infrastructure (V2I) communication network, and \nwherein the first ROI data is an environmental data indicative of an obstacle present in a predetermined ROI along the upcoming trajectory path from an anticipated future location of the AGV; \nderiving, by the navigation device, one or more environmental attributes associated with the predetermined ROI based on the first ROI data; \nreceiving, by the navigation device, one or more predicted attributes associated with a static map of the predetermined ROI, from a cloud, \nwherein the one or more predicted attributes are predicted by performing at least one of: a semantic segmentation, an object detection, and a lane detection on map data associated with the predetermined ROI, using a first artificial intelligence (AI) prediction model deployed on a cloud-based computing device; \nmodifying, by the navigation device, the one or more predicted attributes associated with the static map of the predetermined ROI, based on the one or more environmental attributes associated with the predetermined ROI, to generate one or more modified attributes associated with the static map of the predetermined ROI along the future navigation trajectory, wherein the one or more modified attributes are generated by: \nextracting environmental information from the one or more environmental attributes, wherein the environmental information indicates obstacles present in the predetermined ROI; \nreceiving predictions from a 
cloud server; \nmerging the environmental information and the predictions received from the cloud server; and \ngenerating the one or more modified attributes based on the merging of the environmental information and the predictions received from the cloud server; \nreceiving, by the navigation device, a second ROI data associated with the predetermined ROI along the upcoming trajectory path upon reaching the anticipated future location, \nwherein the second ROI is captured by a current field-of-view (FOV) of the camera sensor mounted on the AGV; \npredicting, by the navigation device, one or more dynamic attributes associated with the predetermined ROI along the upcoming trajectory path based on the second ROI data using a second AI prediction model deployed on the navigation device; \ndetermining, by the navigation device, an error based on a comparison between the one or more modified attributes with the one or more dynamic attributes; \ngenerating, by the navigating device, one or more refined attributes by correcting the one or more modified attributes based on the error; \nupdating, by the navigation device, the future navigation trajectory based on the one or more refined attributes to generate a final navigation trajectory to refine the one or more dynamic attributes, wherein the one or more modified attributes are refined based on the one or more dynamic attributes to generate the final navigation trajectory; and \ncontrolling, by the navigation device, the AGV to follow the final navigation trajectory. \n | 2. The method of claim 1, wherein, the one or more dynamic attributes associated with the predetermined ROI are predicted, by the second AI prediction model by performing at least one of: the semantic segmentation, the object detection, and the lane detection.\n | 3. The method of claim 1, wherein the first ROI data is further captured using a set of vision sensors installed on or other AGVs, and wherein the first ROI data is provided to the navigation device or vehicle-to-vehicle (V2V) communication network.\n | 4. The method of claim 1, wherein the one or more environmental attributes comprise at least one of a type of an obstacle present in the upcoming trajectory path, and a location of the obstacle.\n | 5. 
A navigation device for determining a navigation trajectory for an autonomous ground vehicle (AGV), the navigation device comprising: \na processor; and \na memory communicatively coupled to the processor, wherein the memory stores processor instructions, which, on execution, cause the processor to: \nreceive first Region of Interest (ROI) data associated with an upcoming trajectory path, \nwherein the first ROI data is captured using a set of vision sensors installed on one or more road side infrastructures over vehicle-to-infrastructure (V2I) communication network, \nwherein the first ROI data is an environmental data indicative of an obstacle present in a predetermined ROI along the upcoming trajectory path from an anticipated future location of the AGV; \nderive one or more environmental attributes associated with the predetermined ROI based on the first ROI data; \nreceive one or more predicted attributes associated with a static map of the predetermined ROI, from a cloud; \nwherein the one or more predicted attributes are predicted by performing at least one of: a semantic segmentation, an object detection, and a lane detection on map data associated with the predetermined ROI, using a first artificial intelligence (AI) prediction model deployed on a cloud-based computing device; \nmodify the one or more predicted attributes associated with the static map of the predetermined ROI, based on the one or more environmental attributes associated with the predetermined ROI, to generate one or more modified attributes associated with the static map of the predetermined ROI along the future navigation trajectory, wherein the one or more modified attributes are generated by: \nextracting environmental information from the one or more environmental attributes, wherein the environmental information indicates obstacles present in the predetermined ROI, \nreceiving predictions from a cloud server, \nmerging the environmental information and the predictions received from the cloud server, and \ngenerating the one or more modified attributes based on the merging of the environmental information and the predictions received from the cloud server, \nreceive a second ROI data associated with the predetermined ROI along the upcoming trajectory path upon reaching the anticipated future location, \nwherein the second ROI is captured by a current field-of-view (FOV) of the camera sensor mounted on the AGV; \npredict one or more dynamic attributes associated with the predetermined ROI along the upcoming trajectory path based on the second ROI data using a second AI prediction model deployed on the navigation device; \ndetermine an error based on a comparison between the one or more modified attributes with the one or more dynamic attributes; \ngenerate one or more refined attributes by correcting the one or more modified attributes based on the error; \nupdate the future navigation trajectory based on the one or more refined attributes to generate a final navigation trajectory to refine the one or more dynamic attributes, wherein the one or more modified attributes are refined based on the one or more dynamic attributes to generate the final navigation trajectory; and \ncontrol the AGV to follow the final navigation trajectory. \n | 6.
The navigation device of claim 5, wherein the one or more dynamic attributes associated with the predetermined ROI are predicted, based-on by the second AI prediction model by performing at least one of: the semantic segmentation, the object detection, and the lane detection.\n | 7. The navigation device of claim 5, wherein the first ROI data is captured using a set of vision sensors installed on or other AGVs, and wherein the first ROI data is provided to the navigation device over vehicle-to-vehicle (V2V) communication network.\n | 8. The navigation device of claim 5, wherein the one or more environmental attributes comprise at least one of a type of an obstacle present in the upcoming trajectory path, and a location of the obstacle.\n | 9. A non-transitory computer-readable storage medium having stored thereon, a set of computer-executable instructions causing a computer comprising one or more processors to perform steps comprising: \nreceiving first Region of Interest (ROI) data associated with an upcoming trajectory path, \nwherein the first ROI data is captured using a set of vision sensors installed on one or more road side infrastructures, \nwherein the first ROI data is an environmental data indicative of an obstacle present in a predetermined ROI along the upcoming trajectory path from an anticipated future location of an autonomous ground vehicle (AGV); \nderiving one or more environmental attributes associated with the predetermined ROI based on the first ROI data; \nreceiving one or more predicted attributes associated with a static map of the predetermined ROI, from a cloud, \nwherein the one or more predicted attributes are predicted by performing at least one of: a semantic segmentation, an object detection, and a lane detection on map data associated with the predetermined ROI using a first artificial intelligence (AI) prediction model deployed on a cloud-based computing device; \nmodifying the one or more predicted attributes associated with the static map of the predetermined ROI, based on the one or more environmental attributes associated with the predetermined ROI to generate one or more modified attributes associated with the static map of the predetermined ROI along the future navigation trajectory, wherein the one or more modified attributes are generated by: \nextracting environmental information from the one or more environmental attributes, wherein the environmental information indicates obstacles present in the predetermined ROI, \nreceiving predictions from a cloud server, \nmerging the environmental information and the predictions received from the cloud server, and \ngenerating the one or more modified attributes based on the merging of the environmental information and the predictions received from the cloud server, \nreceiving a second ROI data associated with the predetermined ROI along the upcoming trajectory path upon reaching the anticipated future location, \nwherein the second ROI is captured by a current field-of-view (FOV) of the camera sensor mounted on the AGV; \npredicting one or more dynamic attributes associated with the predetermined ROI along the upcoming trajectory path based on the second ROI data using a second AI prediction model deployed on the navigation device; \ndetermining an error based on a comparison between the one or more modified attributes with the one or more dynamic attributes; \ngenerating one or more refined attributes by correcting the one or more modified attributes based on the error; \nupdating the future navigation 
trajectory based on the one or more refined attributes to generate a final navigation trajectory to refine the one or more dynamic attributes, wherein the one or more modified attributes are refined based on the one or more dynamic attributes to generate the final navigation trajectory; and \ncontrolling the AGV to follow the final navigation trajectory. \n | 10. The non-transitory computer-readable storage medium of claim 9, wherein one or more dynamic attributes associated with the ROI are predicted by the second AI prediction model by performing at least one of: a semantic segmentation, an object detection, and a lane detection.\n | 11. The non-transitory computer-readable storage medium of claim 9, wherein the first ROI data is captured using a set of vision sensors installed on other AGVs, and wherein the first ROI data is provided to the navigation device over vehicle-to-vehicle (V2V) communication network.\n | 12. The non-transitory computer-readable storage medium of claim 9, wherein the one or more environmental attributes comprise at least one of a type of an obstacle present in the upcoming trajectory path, and a location of the obstacle.", "output": "The method involves receiving region of interest (ROI) data associated with an upcoming trajectory path by a navigation device, where the ROI data is captured while approaching the trajectory path from an anticipated future location of an autonomous ground vehicle (AGV). A set of predicted attributes associated with a future navigation trajectory for the trajectory is received by the navigation device. The attributes are modified based on a set of environmental attributes to generate modified attributes. The modified attributes are refined based on the dynamic attributes by the device to generate a final navigation trajectory by using an artificial intelligence (AI) prediction model. INDEPENDENT CLAIMS are included for the following: a navigation device for determining navigation trajectory for AGV; and a non-transitory computer-readable storage medium storing program for determining navigation trajectory for AGV. Method for determining navigation trajectory for AGV i.e. autonomous ground vehicle (AGV), in indoor and outdoor settings to facilitate efficient transportation. The AGV is capable of sensing the dynamically changing environment, and of accurately navigating without any human intervention. The method provides for automatic classification of the objects detected in an environment and for enhancing mapping for the AGVs. The drawing shows a schematic view of the system for determining a navigation trajectory.602Computing system 604Processor 608Input device 610Output device 626RAM ", "instruction": "Please summarize the input" }, { "input": "MODIFYING BEHAVIOR OF AUTONOMOUS VEHICLE BASED ON PREDICTED BEHAVIOR OF OTHER VEHICLESA vehicle configured to operate in an autonomous mode could determine a current state of the vehicle and the current state of the environment of the vehicle. The environment of the vehicle includes at least one other vehicle. A predicted behavior of the at least one other vehicle could be determined based on the current state of the vehicle and the current state of the environment of the vehicle. A confidence level could also be determined based on the predicted behavior, the current state of the vehicle, and the current state of the environment of the vehicle.
In some embodiments, the confidence level may be related to the likelihood of the at least one other vehicle to perform the predicted behavior. The vehicle in the autonomous mode could be controlled based on the predicted behavior, the confidence level, and the current state of the vehicle and its environment.|1. A method, comprising: \n * determining, using a computer system (112), a current state of a vehicle (308), wherein the vehicle is configured to operate in an autonomous mode;\n * determining, using the computer system, a current state of an environment of the vehicle (308), wherein the environment of the vehicle (308) comprises at least one other vehicle (312, 314);\n * determining, using the computer system, a predicted behavior of the at least one other vehicle (312, 314) based on at least the current state of the vehicle (308) and the current state of the environment of the vehicle (308);\n * determining, using the computer system, a confidence level, wherein the confidence level comprises a likelihood of the at least one other vehicle (312, 314) to perform the predicted behavior, and wherein the confidence level is based on at least the predicted behavior, the current state of the vehicle (308), and the current state of the environment of the vehicle (308); and\n * controlling, using the computer system, the vehicle (308) in the autonomous mode based on the predicted behavior, the confidence level, the current state of the vehicle (308), and the current state of the environment of the vehicle (308).\n | 2. The method of claim 1, wherein determining the current state of the vehicle comprises determining at least one of a current speed of the vehicle, a current heading of the vehicle, a current position of the vehicle, and a current lane of the vehicle.\n | 3. The method of claim 1, wherein determining the current state of the environment of the vehicle comprises determining at least one of a respective position of the at least one other vehicle, a respective speed of the at least one other vehicle, and a position of an obstacle.\n | 4. The method of claim 1, wherein controlling the vehicle comprises at least one of controlling the vehicle to accelerate, controlling the vehicle to decelerate, controlling the vehicle to change heading, controlling the vehicle to change lanes, controlling the vehicle to shift within the current lane and controlling the vehicle to provide a warning notification.\n | 5. The method of claim 1, wherein the predicted behavior is determined by obtaining a match or near match between the current states of the vehicle and the environment of the vehicle and predetermined scenarios.\n | 6. The method of claim 4, wherein the warning notification comprises at least one of a horn signal, a light signal, and a vehicle-to-vehicle communication message transmission and optionally wherein the vehicle-to-vehicle communication message transmission is transmitted using a dedicated short range communications (DSRC) device.\n | 7. 
A vehicle (308), comprising: \n * at least one sensor (310), wherein the at least one sensor is configured to acquire: \n * i) vehicle state information; and\n * ii) environment state information;\n * wherein the vehicle state information comprises information about a current state of the vehicle (308), wherein the environment state information comprises information about a current state of an environment of the vehicle (308), wherein the environment of the vehicle comprises at least one other vehicle (312, 314); and\n * a computer system configured to: \n * i) determine the current state of the vehicle (308) based on the vehicle state information;\n * ii) determine the current state of the environment of the vehicle (308) based on the environment state information;\n * iii) determine a predicted behavior of the at least one other vehicle (312, 314) based on at least the current state of the vehicle (308) and the current state of the environment of the vehicle (308);\n * iv) determine a confidence level, wherein the confidence level comprises a likelihood of the at least one other vehicle (312, 314) to perform the predicted behavior, and wherein the confidence level is based on at least the predicted behavior, the current state of the vehicle (308), and the current state of the environment of the vehicle (308); and\n * v) control the vehicle (308) in the autonomous mode based on the predicted behavior, the confidence level, the current state of the vehicle (308), and the current state of the environment of the vehicle (308).\n | 8. The vehicle of claim 7, wherein the at least one sensor comprises at least one of a camera, a radar system, a lidar system, a global positioning system, and an inertial measurement unit.\n | 9. The vehicle of claim 7, wherein the computer system is further configured to determine the current state of the vehicle based on at least one of a current speed of the vehicle, a current heading of the vehicle, a current position of the vehicle, and a current lane of the vehicle.\n | 10. The vehicle of claim 7, wherein the computer system is further configured to determine the current state of the environment of the vehicle based on at least one of a respective position of the at least one other vehicle, a respective speed of the at least one other vehicle, a position of an obstacle, and a map of the roadway.\n | 11. The vehicle of claim 7, wherein the computer system is further configured to cause at least one of accelerating the vehicle, decelerating the vehicle, changing a heading of the vehicle, changing a lane of the vehicle, shifting a position of the vehicle within a current lane, and providing a warning notification.\n | 12. The vehicle of claim 7, wherein the warning notification comprises at least one of a horn signal, a light signal, and a vehicle-to-vehicle communication message transmission.\n | 13. The vehicle of claim 7, wherein the vehicle-to-vehicle communication message transmission is transmitted using a dedicated short range communications (DSRC) device.\n | 14. 
A non-transitory computer readable medium having stored therein instructions executable by a computer system to cause the computer system to perform functions comprising: \n * determining a current state of a vehicle (308), wherein the vehicle is configured to operate in an autonomous mode;\n * determining a current state of an environment of the vehicle (308), wherein the environment of the vehicle comprises at least one other vehicle (312, 314);\n * determining a predicted behavior of the at least one other vehicle (312, 314) based on at least the current state of the vehicle (308) and the current state of the environment of the vehicle (308);\n * determining a confidence level, wherein the confidence level comprises a likelihood of the at least one other vehicle (312, 314) to perform the predicted behavior, and wherein the confidence level is based on at least the predicted behavior, the current state of the vehicle (308) and the current state of the environment of the vehicle (308); and\n * controlling the vehicle (308) in the autonomous mode based on the predicted behavior, the confidence level, the current state of the vehicle (308), and the current state of the environment of the vehicle (308).\n | 15. A computer program to be executed by the computer system of the vehicle claimed in any one of claims 7 to 13 to perform a method as claimed in any one of claims 1 to 6.", + "output": "The behavior modification method involves determining confidence level which comprises the likelihood of at least one other vehicle (314,316) to perform a predicted behavior including acceleration, deceleration, change heading, change lanes, and leaving roadway. The confidence level is determined is based on predicted behavior of other vehicle, current vehicle state, and current vehicle environment state. Own vehicle (308) is controlled in autonomous mode based on the predicted behavior, confidence level, current vehicle state, and current vehicle environment state. INDEPENDENT CLAIMS are included for the following:a vehicle; anda non-transitory computer readable medium for storing instructions executable by computer system. Behavior modification method for vehicle (claimed) e.g. truck based on predicted behavior of other vehicle. Interaction between vehicles is allowed through the peripherals. Safety is improved through the computer system that causes the vehicle to slow down slightly by reducing the throttle. The drawing shows the top view of the autonomous vehicle operating scenario.302Left-most lane304Center lane306Right-most lane308Own vehicle314,316Other vehicle320Scenario", + "instruction": "Please summarize the input" + }, + { + "input": "Portable universal autonomous driving systemThis invention includes an autonomous driving system for automobiles, comprising: one or more common electronic communication ports of autonomous driving (communication ports) that are built-in on the automobiles; and one or more universal autonomous driving portable controllers (portable controllers) that are to be plugged-in to the said communication ports that are built-in on the automobiles. The interfaces of the communication ports and the portable controllers are both standardized such that the portable controllers can be plugged-in universally to all of the automobiles that are equipped with the built-in communication ports. 
The communication ports comprise electronic communication of all relevant electronic control units (ECUs) and feedback information of the automobiles, dedicated for the said portable controllers to communicate with and to control the automobiles. In addition to the portable controllers, the communication ports comprise a buffer that is designed to execute a short duration of controls to make emergency stops, in case of loss of connection with the portable controllers due to accidents or other failure conditions. The portable controllers comprise a central control unit (CCU), and a plurality of sensors and processors, and a plurality of data storages, and a plurality of data links, and a Global Positioning System (GPS). The portable controllers have standardized interfaces that match with that of the communication ports. The invention disclosed herein enables all automobiles to be ready for autonomous driving with minimal cost, provided that the said communication ports are adapted to the automobiles. The said portable controllers integrate all the hardware and software relevant to autonomous driving as standalone devices which can share the components, simplify the systems, reduce parasitic material and components, and most importantly, will be safer when multiple sensors and processors that are based on different physics are grouped together to detect objects and environment conditions. A method of compound sensor clustering (CSC) is introduced herein. The CSC method makes the sensors and processors to self-organize and to address real-world driving conditions. The portable controllers can be mass-produced as standard consumer electronics at lower cost. The portable controllers can also be more easily updated with the latest technologies since that they are standalone devices, which would be otherwise hard to achieve when the hardware and software are built-in permanently as part of the automobiles. The invention disclosed herein is more efficient, since that the portable controllers can be plugged-in to the automobiles when there are needs for autonomous driving, comparing with current methods of integrating autonomous driving control hardware and software that are built-in to automobiles permanently, which may not be used for autonomous driving frequently. The system also decouples the liability from automotive manufactures in case of accidents. The portable controllers can be insured by insurance companies independently, much like insuring human drivers.I claim: \n | 1. 
An autonomous driving system for an automobile, comprising: \na) one or more common electronic communication ports ( 100) for autonomous driving, wherein the communication ports are built-in on the automobile;\nb) one or more universal autonomous driving portable controllers, wherein said portable controllers are plugged in to the exterior of the automobile via the communication ports to detect a driving environment and to control the automobile for autonomous driving; wherein the communication ports and portable controllers share common interfaces; \nc) said one or more communication ports having a primary high speed control area network wherein said primary high speed control area network providing communication between said one or more portable controllers via said one or more communication ports, and at least one electronic control unit, further wherein said at least one electronic control unit configured to control at least one of steering, braking, and acceleration; \nd) said one or more communication ports having a secondary control area network, said secondary control area network configured to provide electronic communication, via said one or more communication ports, between said one or more portable controllers and secondary electronic control units, said secondary electronic control units configured to control at least one of turn signals, brake lights, emergency lights, head lamps and tail lamps, fog lamps, windshield wipers, defrosters, defogs, window regulators, and door locks; \ne) said one or more communication ports having a tertiary control area network configured to electronically communicate at least one feedback parameter to said one or more portable controllers, via said one or more communication ports, said at least one feedback parameter comprised one or more of velocity, acceleration, ABS activation, airbag deployment, and traction control activation; \nf) said one or more communication ports having a quaternary control area network configured to electronically communicate at least one status parameter to said one or more portable controllers via said one or more communication ports, said at least one status parameter comprised of one or more of fuel level, battery charge, tire pressure, engine oil level, coolant temperature, and windshield washer level; \ng) said one or more communication ports having a buffer memory controller that provides emergency control instruction for emergency stops of the automobiles in the event of loss of electronic connection with the portable controller due to accidents or other failure conditions; \nh) said one or more communication ports having electronic connections to the portable controllers and adapted to take at least one of the methods of: wired pin connections, wireless connections, or combinations of wired pin and wireless connections; \ni) said one or more portable controllers adapted for mounting locations and anchorages for the portable controllers, which match with the configurations of the portable controllers; \nj) a driver interface, said driver interface positioned to enable the driver to provide driving instructions to said one or more portable controllers; \nk) said one or more portable controllers having a plurality of sensors, said plurality of sensors comprising: \ni. one or more digital color cameras that detect optical information; \nii. one or more LIDARs that detect geometrical information; \niii. task specific sensors, including one or more ultrasonic sensors to detect near distance objects; \niv. 
one or more RADARs to detect median and far distance objects; \nv. one or more thermal imaging cameras or passive infrared sensors to detect objects that have heat emissions; \nvi. one or more three dimensional accelerometers to detect acceleration and vibration in vertical, lateral, and fore/aft directions; \nvii. one or more gyroscopes to detect inclination angles; \nviii. one or more physical-chemical sensors which adapted to detect specific air contents; \nix. one or more sound sensors to detect human languages or warning sirens; \nx. one or more water sensors for detecting rain and rain intensity; \nxi. one or more temperature sensors adapted for detecting temperature at the vicinity of the automobiles; \nl) said one or more portable controllers having a plurality of processors comprising: \ni. one or more processors for the digital color cameras; \nii. one or more processors for the LIDARs; \niii. one or more processors for the ultrasonic sensors; \niv. one or more processors for the RADARs; \nv. one or more processors for the thermal imaging cameras or passive infrared sensors; \nvi. one or more processors for the one or more three dimensional accelerometers; \nvii. and one or more processors for the gyroscopes; \nviii. one or more processors for the physical-chemical sensors; \nix. one or more processors for the sound sensors; \nx. one or more processors for the water sensors; \nxi. one or more processors for the temperature sensors; \nm) said one or more portable controllers programmed to generate driving instructions based on information from said plurality of processors; said processors of the plurality of processors programmed to generate queries addressing specific driving conditions, said specific driving conditions being determined by pre-defined criteria, wherein said queries include queries between the processors of said plurality of processors, said queries programmed in the processors; \nn) said one or more portable controllers having a Central Control Unit to direct the operations of the processors; \no) said one or more portable controllers having a plurality of communication links to send and/or receive data, said communication links including vehicle-to-vehicle and vehicle-to-infrastructure links; \np) said one or more portable controllers having a global positioning system to identify the locations of the automobiles to which the portable controllers are plugged-in; \nq) said one or more universal autonomous driving portable controllers are compatible with said communication ports. \n | 2. The autonomous driving system of claim 1 wherein, \na. the processors of said plurality of processors are integrated into said one or more portable controllers, \nb. the sensors of said plurality of sensors are each integrated with at least one of the processors. \n | 3. The autonomous driving system of claim 2 wherein the sensors of said plurality of sensors are built on one or more common substrates and/or integrated circuit boards.\n | 4. The autonomous driving system of claim 2 further comprising wherein querying sensors are dynamically organized as clusters to function as groups such that sensors and processors communicate with each other to validate sensed information pertaining to specific driving conditions.\n | 5. The autonomous driving system of claim 2 further comprising wherein queries function to detect mismatches between information between sensors and alert the Central Control Unit when mismatches are found.\n | 6. 
The autonomous driving system of claim 5 wherein a mismatch between LIDARs and RADARs generates an alert to the central Control Unit, thereby enabling the Central Control Unit to respond to potential hazards.\n | 7. The autonomous driving system of claim 5 wherein information derived from queries from the temperature sensors and water sensors is used to jointly determine a potential freezing rain condition.\n | 8. The autonomous driving system of claim 5 wherein the queries for detection of said potential freezing rain condition include detection of rain, and/or ice, and/or snow using captured images and pattern recognition.\n | 9. The autonomous driving system of claim 5 wherein detection of smoke by said physical chemical is used to query the thermal imaging cameras of passive infrared sensors to determine if there is a hazardous fire condition.\n | 10. The autonomous driving system of claim 5 wherein road curvatures are detected by the cameras and/or LIDARs when lateral acceleration is detected by combined information from said gyroscopes and accelerometers to inform the central control unit of lateral stability status.\n | 11. The autonomous driving system of claim 8 wherein ABS activation feedback triggers querying the water and temperature sensors.\n | 12. The autonomous driving system of claim 2 wherein the cameras are queried to identify icy road surfaces, thereby generating a categorized information of low coefficient of friction road surface to the Central Control Unit.\n | 13. The autonomous driving system of claim 5 wherein information derived from queries from the cameras and thermal sensors is used to jointly verify an existence of pedestrians.\n | 14. The autonomous driving system of claim 5 wherein, \na. said thermal sensors are queried to detect a human heat signature, and if the human heat signature is detected, then, \nb. the thermal sensor's processor queries object detection sensors for the presence of a human, said object sensors comprising the cameras, LIDARs, RADAR and/or the ultrasonic sensors. \n | 15. The autonomous driving system of claim 5 wherein information derived from the RADARs and/or the ultrasonic sensors detection of a potential road sign generates a query to the cameras, thereby reducing likelihood of missing or misidentifying road signs.\n | 16. The autonomous driving system of claim 15 wherein the queries from a RADAR are generated for detection of a road sign not identified or misidentified by camera captured images and pattern recognition.\n | 17. The autonomous driving system of claim 3 further comprising wherein querying sensors are dynamically organized as clusters to function as groups such that sensors and processors communicate with each other to validate sensed information pertaining to specific driving conditions.\n | 18. The autonomous driving system of claim 17, further comprising wherein queries function to detect mismatches between information between sensors and alert the Central Control Unit when mismatches are found.\n | 19. The autonomous driving system of claim 14, wherein the sensors of said plurality of sensors are built on one or more common substrates and/or integrated circuit boards.\n | 20. 
The autonomous driving system of claim 3, wherein one or more of the queries from at least one of said one or more RADARs are generated for detection of road signs not identified or misidentified by camera captured images and pattern recognition.", + "output": "The computerized control system has common electronic communication ports (100) that are built-in on each of automobiles, and one or more universal autonomous driving portable controllers (200) that can be attached to the automobiles via the communication ports to accomplish the computerized control or autonomous driving. INDEPENDENT CLAIMS are included for the following:a design for buffer memory controller (BMC);a design of location of interface of communication ports;a design of the communication ports;a design of universal autonomous driving portable controllers;a sensor;a compound sensor clustering method;a back-up safety mechanism of interacting with the buffer memory controller; anda manufacturing rights of the electronic communication port of autonomous driving. Computerized control system or autonomous driving for automobiles. Ensures that computerized control or autonomous driving much more efficient, since that the portable controllers can be plugged-in to any of the automobiles that are equipped with the communication ports when there are needs for autonomous driving. The drawing shows the design of universal autonomous driving portable controller and its relation to common electronic communication port of autonomous driving. 100Common electronic communication ports160Mounting fixtures on automobiles200Autonomous driving portable controllers221Data storages230Central control unit241Data links260Wired or wireless user interface", + "instruction": "Please summarize the input" + }, + { + "input": "PREDICTING REALISTIC TIME OF ARRIVAL FOR QUEUE PRIORITY ADJUSTMENTA queue prioritization system and method for predicting a realistic time of arrival for performing a queue priority adjustment is provided. The method includes the steps of determining an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority, tracking a current location and predicting a route to be taken to arrive at the destination from the current location, detecting a schedule-altering event of the first user by analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle, and reprioritizing the queue priority database, in response to calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the schedule-altering event of the first user.CLAIMS \n | 1. 
A method for predicting a realistic time of arrival for performing a queue priority adjustment, the method comprising: \n determining, by a processor of a computing system, an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority of the first user and a queue priority of the second user, in a queue priority database, the queue priority of the first user being higher than the queue priority of the second user; \n tracking, by the processor, a current location of the first user and a current location of the second user, during transit to the destination; \n predicting, by the processor, a route to be taken to arrive at the destination from the current location of the first user and the current location of the second user, respectively; detecting, by the processor, a schedule-altering event of the first user by analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle; and \n reprioritizing, by the processor, the queue priority database, in response to calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the schedule-altering event of the first user. \n | 2. The method of claim 1, wherein determining the estimated initial arrival time of the first user and the second user includes: \n receiving, by the processor, a customer pick-up order and current GPS location information of the first user and the second user, the GPS location information obtained from a mobile device of the first user and the second user; \n reviewing, by the processor, historical user data, including a historical path taken by the first user and the second user to the destination; \n evaluating, by the processor, a complexity of the customer pick-up order to determine an earliest store pick-up time; and \n comparing, by the processor, the earliest store pick-up time, the current GPS location of the first user and the second user, and the historical path taken by the first user and the second user to the destination, to determine the estimated initial arrival time, in response to: (i) prompting the first user and the second user to depart for the destination, or (ii) receiving confirmation from the first user and the second user that the first user and the second user have departed for the destination. \n | 3. The method of claim 2, wherein prompting the first user and the second user to depart for the destination includes providing, by the processor, a suggested departure time based on at least one of: the earliest store pick-up time, current traffic conditions, current location of the first user and the second user, and historical traffic patterns of the first user and the second user.\n | 4. The method of claim 1, wherein determining the estimated initial arrival time includes: receiving, by the processor, scheduled delivery information and current GPS location information of the first user and the second user; \n reviewing, by the processor, historical delivery pattern data, based on previous \n deliveries to the destination; \n evaluating, by the processor, the scheduled delivery information, the current GPS location information of the first user and the second user, and the historical delivery pattern data, to determine an earliest delivery arrival time. \n | 5. 
The method of claim 1, wherein the detecting of the schedule -altering event includes receiving data from a plurality of data sources, the plurality of data sources including a current GPS location of the user received from a mobile device of the user, a real-time traffic data received from the mobile device of the user, a real-time traffic data received from a third party application server, a weather data received from the mobile device of the user, a weather data retrieved from a third party application server, a historical traffic pattern information of the user, a sensor data received from one or more sensors associated with the user, a vehicle and traffic information received from a vehicle-to- vehicle communication network, and a combination thereof.\n | 6. The method of claim 1, wherein the schedule-altering event is at least one of: a delay, a traffic jam, a traffic accident, a vehicle failure, a weather occurrence, an intervening stop by the first user, a wrong turn of the user, an alternative route taken by the user, a predicted traffic delay of the first user, and a predicted weather delay of the first user.\n | 7. The method of claim 1, wherein predicting the route of the first user includes analyzing the current location of the first user, current traffic data, construction data, historical routes to the destination taken by the first user, and map data.\n | 8. The method of claim 1, wherein reprioritizing the queue priority database causes: (i) an in-store pickup order associated with the second user to be available for pickup when the second user arrives at the destination, and before an in-store pickup order associated with the first user is available for pickup, or (ii) a delivery vehicle operated by the first user to be assigned to an available unloading location at the destination, when the first user arrives at the destination.\n | 9. The method of claim 1, wherein the first user and the second user is a customer, a delivery truck driver, an autonomous vehicle, or an unmanned drone.\n | 10. 
A computer system, comprising: \n a processor; \n a memory device coupled to the processor; and \n a computer-readable storage device coupled to the processor, wherein the storage device contains program code executable by the processor via the memory device to implement a method for predicting a realistic time of arrival for performing a queue priority adjustment, the method comprising: determining, by a processor of a computing system, an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority of the first user and a queue priority of the second user, in a queue priority database, the queue priority of the first user being higher than the queue priority of the second user; \n tracking, by the processor, a current location of the first user and a current location of the second user, during transit to the destination; \n predicting, by the processor, a route to be taken to arrive at the destination from the current location of the first user and the current location of the second user, respectively; \n detecting, by the processor, a schedule-altering event of the first user by \n analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle; and \n reprioritizing, by the processor, the queue priority database, in response to \n calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the schedule- altering event of the first user. \n | 11. The computer system of claim 10, wherein determining the estimated initial arrival time of the first user and the second user includes: \n receiving, by the processor, a customer pick-up order and current GPS location information of the first user and the second user, the GPS location information obtained from a mobile device of the first user and the second user; \n reviewing, by the processor, historical user data, including a historical path taken by the first user and the second user to the destination; \n evaluating, by the processor, a complexity of the customer pick-up order to determine an earliest store pick-up time; and \n comparing, by the processor, the earliest store pick-up time, the current GPS location of the first user and the second user, and the historical path taken by the first user and the second user to the destination, to determine the estimated initial arrival time, in response to: (i) prompting the first user and the second user to depart for the destination, or (ii) receiving confirmation from the first user and the second user that the first user and the second user have departed for the destination. \n | 12. The computer system of claim 1 1, wherein prompting the first user and the second user to depart for the destination includes providing, by the processor, a suggested departure time based on at least one of: the earliest store pick-up time, current traffic conditions, current location of the first user and the second user, and historical traffic patterns of the first user and the second user. \n | 13. 
The computer system of claim 10, wherein determining the estimated initial arrival time includes: receiving, by the processor, a scheduled delivery information and a current GPS location information of the first user and the second user; \n reviewing, by the processor, a historical delivery pattern data, based on previous deliveries to the destination; \n evaluating, by the processor, the scheduled delivery information, the current GPS location information of the first user and the second user, and the historical delivery pattern data, to determine an earliest delivery arrival time. \n | 14. The computer system of claim 10, wherein the detecting of the schedule-altering event includes receiving data from a plurality of data sources, the plurality of data sources including a current GPS location of the user received from a mobile device of the user, a real-time traffic data received from the mobile device of the user, a real-time traffic data received from a third party application server, a weather data received from the mobile device of the user, a weather data retrieved from a third party application server, a historical traffic pattern information of the user, a sensor data received from one or more sensors associated with the user, a vehicle and traffic information received from a vehicle-to- vehicle communication network, and a combination thereof.\n | 15. The computer system of claim 10, wherein the schedule -altering event is at least one of: a delay, a traffic jam, a traffic accident, a vehicle failure, a weather occurrence, an intervening stop by the first user, a wrong turn of the user, an alternative route taken by the user, a predicted traffic delay of the first user, and a predicted weather delay of the first user.\n | 16. The computer system of claim 10, wherein predicting the route of the first user includes analyzing the current location of the first user, current traffic data, construction data, historical routes to the destination taken by the first user, and map data.\n | 17. The computer system of claim 10, wherein reprioritizing the queue priority database causes: (i) an in-store pickup order associated with the second user to be available for pickup when the second user arrives at the destination, and before an in-store pickup order associated with the first user is available for pickup, or (ii) a delivery vehicle operated by the first user to be assigned to an available unloading location at the destination, when the first user arrives at the destination.\n | 18. The computer system of claim 10, wherein the first user and the second user is a customer, a delivery truck driver, an autonomous vehicle, or an unmanned drone.\n | 19. 
A computer program product, comprising a computer-readable hardware storage device storing a computer-readable program code, the computer-readable program code comprising an algorithm that when executed by a computer processor of a computing system implements a method for predicting a realistic time of arrival for performing a queue priority adjustment, the method comprising: \n determining, by a processor of a computing system, an estimated initial arrival time of a first user and a second user to a destination, the estimated initial arrival time being used to establish a queue priority of the first user and a queue priority of the second user, in a queue priority database, the queue priority of the first user being higher than the queue priority of the second user; \n tracking, by the processor, a current location of the first user and a current location of the second user, during transit to the destination; \n predicting, by the processor, a route to be taken to arrive at the destination from the current location of the first user and the current location of the second user, respectively; detecting, by the processor, a schedule-altering event of the first user by analyzing: (i) the predicted route of the first user, or (ii) a current state of a vehicle; and \n reprioritizing, by the processor, the queue priority database, in response to calculating an updated queue priority of the first user that is lower than the queue priority of the second user, based on the detection of the schedule-altering event of the first user. \n | 20. The computer program product of claim 19, wherein determining the estimated initial arrival time of the first user and the second user includes: \n receiving, by the processor, a customer pick-up order and a current GPS location information of the first user and the second user, the GPS location information obtained from a mobile device of the first user and the second user; \n reviewing, by the processor, historical user data, including a historical path taken by the first user and the second user to the destination; \n evaluating, by the processor, a complexity of the customer pick-up order to determine an earliest store pick-up time; and \n comparing, by the processor, the earliest store pick-up time, the current GPS location of the first user and the second user, and the historical path taken by the first user and the second user to the destination, to determine the estimated initial arrival time, in response to: (i) prompting the first user and the second user to depart for the destination, or (ii) receiving confirmation from the first user and the second user that the first user and the second user have departed for the destination. \n | 21. The computer program product of claim 20, wherein prompting the first user and the second user to depart for the destination includes providing, by the processor, a suggested departure time based on at least one of: the earliest store pick-up time, current traffic conditions, current location of the first user and the second user, and historical traffic patterns of the first user and the second user.\n | 22. 
The computer program product of claim 19, wherein determining the estimated initial arrival time includes: \n receiving, by the processor, a scheduled delivery information and current GPS \n location information of the first user and the second user; \n reviewing, by the processor, a historical delivery pattern data, based on previous deliveries to the destination; and evaluating, by the processor, the scheduled delivery information, the current GPS location information of the first user and the second user, and the historical delivery pattern data, to determine an earliest delivery arrival time. \n | 23. The computer program product of claim 19, wherein the detecting of the schedule -altering event includes receiving data from a plurality of data sources, the plurality of data sources including a current GPS location of the user received from a mobile device of the user, a real-time traffic data received from the mobile device of the user, a real-time traffic data received from a third party application server, a weather data received from the mobile device of the user, a weather data retrieved from a third party application server, a historical traffic pattern information of the user, a sensor data received from one or more sensors associated with the user, a vehicle and traffic information received from a vehicle-to-vehicle communication network, and a combination thereof.\n | 24. The computer program product of claim 19, wherein the schedule -altering event is at least one of: a delay, a traffic jam, a traffic accident, a vehicle failure, a weather occurrence, an intervening stop by the first user, a wrong turn of the user, an alternative route taken by the user, a predicted traffic delay of the first user, and a predicted weather delay of the first user.\n | 25. The computer program product of claim 19, wherein predicting the route of the first user includes analyzing the current location of the first user, current traffic data, construction data, historical routes to the destination taken by the first user, and map data.\n | 26. The computer program product of claim 19, wherein reprioritizing the queue priority database causes: (i) an in-store pickup order associated with the second user to be available for pickup when the second user arrives at the destination, and before an in-store pickup order associated with the first user is available for pickup, or (ii) a delivery vehicle operated by the first user to be assigned to an available unloading location at the destination, when the first user arrives at the destination.\n | 27. The computer program product of claim 19, wherein the first user and the second user is a customer, a delivery truck driver, an autonomous vehicle, or an unmanned drone .", + "output": "The method involves tracking a current location of a first user and a current location of a second user during transit to a destination by a processor (141). A route to be taken to arrive at the destination from the current location of the first user and the second user is predicted by the processor. A schedule-altering event of the first user is detected by the processor by analyzing the predicted route of the first user, or a current state of a vehicle. A queue priority database (114) is reprioritized by the processor in response to calculating an updated queue priority of the first user that is lower than queue priority of the second user based on the detection of the schedule-altering event of the first user. 
INDEPENDENT CLAIMS are also included for the following:a computer systema computer program product comprising a set of instructions for predicting a realistic time of arrival for performing a queue priority adjustment for a user at a retail location by a computer system. Method for predicting a realistic time of arrival for performing a queue priority adjustment for a user e.g. customer, delivery lorry driver, autonomous vehicle or unmanned drone, at a retail location by a computer system (all claimed). The method enables allowing goods to be retrieved in response to the predicted time of arrival based on user input information on a purchase order or to be automatically generated based on complexity of the purchase order. The method enables allowing the customers to pick up purchased items at the retail location selected by the customer by maintaining the queue priority such that the purchased items are ready for the customer when the customer arrives at the retail location. The drawing shows a schematic block diagram of a queue prioritization system. 113Customer database114Queue priority database120Computer system141Processor142Memory", + "instruction": "Please summarize the input" + }, + { + "input": "Method for controlling an autonomously operated vehicle and error control moduleThe invention relates to a method for controlling an autonomously operated vehicle (1) in the event of an error (Fm; m = 1,2, ... N0), with at least the following steps: \n- Detection of a fault (Fm) in the vehicle (1); \n- Evaluating the detected error (Fm) and, as a function thereof, outputting an error assessment result (EF), the error assessment result (EF) indicating a significance of the detected error (Fm); \n- Determination of a local environment (Uk; k = 1,2, ..., N3) in which the vehicle (1) is located; \n- Selecting and activating an emergency operating mode as a function of the determined local environment (Uk) as well as the output evaluation result (EF) and / or the significance of the detected error (Fm); and \n- Autonomous control of the vehicle (1) as a function of the selected and activated emergency operating mode using at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1).|1. Method for controlling an autonomously operated vehicle (1) in the event of an error (Fm; m = 1,2, ... N0), with at least the following steps:\n- Detection of a fault (Fm) in the vehicle (1) (ST1); \n- Assessment of the detected error (Fm) (ST2) and, as a function thereof, output of an error evaluation result (EF) (ST3), the error evaluation result (EF) having a significance (Bi; i = 1,2 ... N1) of the detected fault (Fm); \n- Determination of a local environment (Uk; k = 1,2, ..., N3) in which the vehicle (1) is located (ST4); \n- Selecting and activating an emergency operating mode (NBnk; n = 1,2, ..., N2) depending on the determined local environment (Uk) and the output evaluation result (EF) and / or the significance (Bi) of the detected error (Fm) (ST5); and \n- Autonomous control of the vehicle (1) as a function of the selected and activated emergency operating mode (NBnk) using at least one movement system (100; 101, 102, 103) and / or environment detection system (300; 301, 302, 303, 304, 305 ) in the vehicle (1) (ST6). \n | 2. 
Method according to Claim 1, characterized in that the error (Fm) is derived from status signals (So, o = 1, 2, ..., N4) that are transmitted in the vehicle (1) via a vehicle-internal data transmission system (30), for example a CAN bus (31), or transmitted directly.\n | 3. Method according to Claim 2, characterized in that the status signals (So) are transmitted from the at least one movement system (100; 101, 102, 103) of the vehicle (1) and/or the environment detection system (300; 301, 302, 303, 304, 305) of the vehicle (1) and/or from further sources (3) of the vehicle (1), for example a V2X module (3), the status signals (So) indicating whether the respective movement system (100; 101, 102, 103) and/or the environment detection system (300; 301, 302, 303, 304, 305) has an error (Fm).\n | 4. Method according to one of the preceding claims, characterized in that at least\n- an elementary, fatal error (F1) or \n- a moderate error (F2) or \n- a non-safety-critical error (F3) can be detected as the error (Fm). \n | 5. Method according to one of the preceding claims, characterized in that the error evaluation result (EF) that is output indicates at least whether there is an error (Fm) of high significance (B1), medium significance (B2) or low significance (B3).\n | 6. Method according to Claim 4 or 5, characterized in that, in the presence of\n- a fault (Fm) of high significance (B1) and/or an elementary, serious fault (F1), a first emergency operating mode (NB1k) is activated depending on the local environment (Uk) in order to bring the vehicle (1) autonomously to a standstill (H) using the at least one movement system (100; 101, 102, 103) and/or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1), or \n- an error (Fm) of medium significance (B2) and/or a moderately serious error (F2), a second emergency operating mode (NB2k) is activated depending on the local environment in order to move the vehicle (1) autonomously to a stopping area (HB) using the at least one movement system (100; 101, 102, 103) and/or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1), or \n- an error (Fm) of low significance (B3) and/or a non-safety-critical error (F3), a third emergency operating mode (NB3k) is activated depending on the local environment (Uk) in order to continue the autonomous driving of the vehicle (1) using the at least one movement system (100; 101, 102, 103) and/or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1). \n | 7. Method according to Claim 6, characterized in that in the first emergency operating mode (NB1k) at least one brake system (101) and/or a drive system (103) for reducing the engine power (ML) is controlled autonomously in order to bring the vehicle (1) to a standstill (H), and, as a function of the local environment (Uk), a steering system (103) is additionally controlled autonomously in order to enable an evasive maneuver into a secured area (BS), for example an emergency lane (BS1), before the standstill (H) is reached.\n | 8. Method according to Claim 6 or 7, characterized in that in the second emergency operating mode (NB2k) a drive system (102) and/or a braking system (101) and/or a steering system (103) of the vehicle (1) are controlled autonomously in such a way that the vehicle (1) moves autonomously with reduced speed (vred) and/or with reduced engine power (ML) along a defined driving trajectory (T) to the stopping area (HB).\n | 9. 
Method according to Claim 8, characterized in that the stopping area (HB) and/or the reduced speed (vred) and/or the engine power (ML) are selected as a function of the local environment (Uk).\n | 10. Method according to one of the Claims 6 to 9, characterized in that the first emergency operating mode (NB1k) is fixed and unchangeable and/or the second emergency operating mode (NB2k) can be expanded and/or the third and/or further emergency operating modes (NBnk, for n >= 3) can be changed and/or expanded.\n | 11. Method according to one of the preceding claims, characterized in that a motorway (U1), a country road (U2), an urban environment (U3), a depot (U4), a construction site (U5) or a port area (U6) is determined as the local environment (Uk).\n | 12. Method according to one of the preceding claims, characterized in that the local environment (Uk) is an environment with a public traffic area (?) or an environment with an enclosed area (G).\n | 13. Method according to one of the preceding claims, characterized in that the local environment (Uk) is determined as a function of position information (PI) and/or environment information (UI), the position information (PI) being based on automatically provided position data (DPa) and/or manually entered position data (DPm) and the environment information (UI) being based on provided environment data (DU) that are output, for example, by environment detection systems (300; 301, 302, 303, 304, 305) in the vehicle (1).\n | 14. Method according to Claim 13, characterized in that the automatically provided position data (DPa) are output by a position detection device (70) and contain a global position (Pg) of the vehicle (1) and/or are output by a telematics system (400), the telematics system (400) accessing external information (IX) which is transmitted via a local data interface (8).\n | 15. Method according to Claim 13 or 14, characterized in that the local environment (Uk) is extracted from the automatically provided position data (DPa) via map data (KD), in particular of a navigation system (7).\n | 16. Method according to one of the preceding claims, characterized in that the autonomously and/or driverlessly controlled vehicle (1) is controlled according to an autonomy level (AS) equal to three or higher.\n | 17. Method according to one of the preceding claims, characterized in that the error (Fm) and/or the error evaluation result (EF) is output, after the detection (ST1) and the evaluation (ST2, ST3) of the error (Fm), via a communication module (50) and/or a V2X module (3), for example to a vehicle operator (K1), a dispatcher (K2), to yard staff (K3) of a depot (U4) and/or to other people (K4) and/or to another vehicle (2) and/or to infrastructure facilities (200).\n | 18. Error control module (60) for autonomous control of a vehicle (1) in the event of an error (Fm), in particular according to a method according to one of the preceding claims, the error control module (60) being designed\n- to select and activate an emergency operating mode (NBnk; n = 1,2 ... N2) as a function of a local environment (Uk), determined from position information (PI) and/or environment information (UI), in which the vehicle (1) is located, and as a function of an evaluation result (EF) output by an error evaluation module (40) and/or a significance (Bi) of an error (Fm) detected by an error detection module (20), and \n- to control the vehicle (1) autonomously, depending on the selected and activated emergency operating mode (NBnk), using at least one movement system (100; 101, 102, 103) and/or environment detection system (300; 301, 302, 303, 304, 305) in the vehicle (1). \n | 19. Vehicle (1) with a movement coordination module (10) for coordinating and controlling movement systems (100; 101, 102, 103) and/or surrounding systems (300; 301, 302, 303, 304, 305) in the vehicle (1) for autonomous control of the vehicle (1), an error detection module (20) for detecting an error (Fm) in the vehicle (1), an error evaluation module (40) for evaluating the detected error (Fm) and outputting an evaluation result (EF), and with an error control module (60) according to Claim 18 for autonomous control of the vehicle (1), in particular via the movement coordination module (10), in the event of an error (Fm).", "output": "The method involves detecting a fault (Fm) in vehicle (1). The assessment of the detected fault is performed. An error evaluation result (EF) is output, where the error evaluation result indicates a significance of the detected fault. A local environment in which the vehicle is located is determined. An emergency operating mode is selected and activated as a function of a determined local environment and the evaluation result and/or the significance of the detected fault. The autonomous control of the vehicle is performed as a function of the selected and activated emergency operating mode using a movement system (100-103) and/or environment detection system (300-304) in the vehicle. INDEPENDENT CLAIMS are included for the following:an error control module for autonomous control of a vehicle in the presence of an error; anda vehicle with a movement coordination module for coordinating and controlling movement systems and/or surrounding systems in the vehicle. Method for controlling autonomously operated vehicle e.g. truck in event of fault, using error control module (claimed). The autonomous driving operation is ensured safely and efficiently in the event of a fault. The vehicle is autonomously controlled as a function of the local environment when the error occurs, where the error can have an effect on the own vehicle and/or on the environment around the own vehicle to a certain extent. The autonomous vehicles are operated in a single local environment, and in different local environments. The efficiency and the possibility of reacting to the error can turn out to be different depending on the local environment, so that the different emergency operating modes are selected or activated depending on the environment. The warning is effectively issued to external persons or vehicles in addition to the autonomous control, when the vehicle is driving without any occupants. The drawing shows a schematic view of the autonomously operated vehicle in local environment with public traffic area. 
1Vehicle100-103Movement system300-304Environment detection systemEFError evaluation resultFmFault", + "instruction": "Please summarize the input" + }, + { + "input": "METHOD FOR TRANSFORMING BETWEEN A LONG VEHICLE COMBINATION AND A PLATOON ON THE MOVEThe invention relates to a method for transforming between a long vehicle combination (10) and a platoon (12) on the move. The present invention also relates to vehicles (14a-b; 14b-c) for such a method.|1. A method for transforming between a long vehicle combination (10) and a platoon (12) on the move, wherein the long vehicle combination (10) comprises a plurality of vehicles (14a-c) mechanically coupled together one after the other, which method comprises the steps of: \n * detecting (S2) that the long vehicle combination (10) is approaching a first road section (58) ahead, by means of a navigation system (22) or by means of wireless vehicle-to-infrastructure (V2I) communication, which first road section (58) stipulates decoupling the vehicles (14a-c) of the long vehicle combination (10) to form the platoon (12);\n * automatically decoupling (S4) the vehicles (14a-c) from each other while the vehicles (14a-c) are in motion to form the platoon (12) before reaching the first road section (58);\n * the platoon (12) driving (S5) through the first road section (58);\n * detecting (S6) a second road section (62), by means of a navigation system (22) or by means of wireless vehicle-to-infrastructure (V2I) communication, which stipulates coupling together the vehicles (14a-c) of the platoon (12) to form the long vehicle combination (10);\n * a vehicle (14a-b) in the platoon (12) immediately ahead of a following vehicle (14b-c) of said the platoon (12) sending (S7) information (64) to the following vehicle (14b-c) via wireless vehicle-to-vehicle communication, which information (64) indicates the position and speed of a rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead;\n * based at least on the position and speed indicated in the sent information (64), autonomously driving (S8) the following vehicle (14b-c) so that the rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead of the following vehicle (14b-c) gets within an operational range (66) of a front coupling element (32) of the following vehicle (14b-c);\n * while in motion and when the rear automatic coupling device (18) is within the operational range (66), the following vehicle (14b-c) automatically adjusting (S9) a front coupling device (30) including said front coupling element (32) so that the position of the front coupling element (32) matches the position of the rear automatic coupling device (18) as indicated in the sent information (64); and\n * automatically coupling (S10) together the following vehicle (14b-c) and the vehicle (14a-b) immediately ahead while the vehicles (14a-c) are in motion to form at least a part of the long vehicle combination (10),\n wherein each vehicle (14a-b) immediately ahead is adapted to estimate the position of its rear automatic coupling device (18) based on \n * the heading of the vehicle (14a-b) immediately ahead,\n * the position of a part of the vehicle (14a-b) immediately ahead as determined by a navigation system (22) of the vehicle immediately ahead,\n * a vehicle model representing the vehicle (14a-b) immediately ahead,\n * the height of the rear automatic coupling device (18), and\n * in case the vehicle (14a-b) immediately ahead is an articulated vehicle, at least one articulation angle of the 
vehicle immediately ahead as detected by at least one articulation angle detection means (28) on the vehicle immediately ahead.\n | 2. A method according to claim 1, wherein each following vehicle (14b-c) comprises actuator means (48a-b) adapted to adjust the front coupling device (30).\n | 3. A method according to claim 2, wherein the actuator means (48a-b) is adapted to laterally adjust the front coupling device (30).\n | 4. A method according to claim 2 or 3, wherein the actuator means (48a-b) is adapted to vertically adjust the front coupling device (30).\n | 5. A method according to any preceding claim, wherein each following vehicle (14b-c) comprises means (54) adapted to adjust the length of the front coupling device (30).\n | 6. A method according to claim 5, further comprising the step of: shortening (S1) the length of the front coupling device (30) while driving as the long vehicle combination.\n | 7. A method according to any preceding claim, wherein each following vehicle (14b-c) is adapted to estimate the position of its front coupling element (32) based on \n * the heading of the following vehicle (14b-c),\n * the position of a part of the following vehicle (14b-c) as determined by a navigation system (36) of the following vehicle (14b-c),\n * a vehicle model representing the following vehicle (14b-c),\n * a first angle representing a lateral adjustment of the front coupling device (30),\n * a second angle representing any vertical adjustment of the front coupling device (30),\n * the length of the front coupling device (30), and\n * a height related to the front coupling device (30).\n | 8. A method according to any preceding claim, wherein each vehicle immediately ahead (14a-b) comprises at least two independent means (21, 22) for determining its speed.\n | 9. A method according to any preceding claim, further comprising the step of: a leading vehicle of the platoon sending an acceleration or deceleration request (63) to the following vehicles (14b-c) of the platoon (12) via wireless vehicle-to-vehicle communication.\n | 10. A method according to any preceding claim, wherein the information (64) sent from the vehicle (14a-b) immediately ahead to the following vehicle (14b-c) includes the heading of the rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead.\n | 11. A method according to any preceding claim, wherein the first road section (58) is at least one of a bridge, a roundabout, and a turn.\n | 12. A method according to any preceding claim, further comprising the step of planning (S3) an inter-vehicle distance (60) between subsequent vehicles based on the first road section (58) ahead, wherein the platoon (12) is driven through the first road section (58) with the planned inter-vehicle distance(s) (60).\n | 13. A method according to any preceding claim, wherein at least one of the automatic decoupling and the automatic coupling is performed while driving at a safety speed.\n | 14. A method according to any preceding claim, wherein the automatic coupling is performed while driving on a straight road.\n | 15. A method according to any preceding claim, wherein the automatic coupling starts with the vehicle (14b) immediately behind the leading vehicle (14a) of the platoon (12) coupling to the leading vehicle (14a) of the platoon (12).\n | 16. A method according to any preceding claim, wherein the automatic decoupling starts with the last vehicle (14c) of the long vehicle combination (10) decoupling from the vehicle immediately ahead (14b).\n | 17. 
A method according to any preceding claim, wherein each vehicle (14b-c) after the leading vehicle (14a) of the long vehicle combination (10) or platoon (12) is an autonomous vehicle.\n | 18. A method according to any preceding claim, wherein at least one vehicle (14b-c) after the leading vehicle (14a) of the long vehicle combination (10) or platoon (12) is an autonomous dolly (16) and semi-trailer combination.\n | 19. A vehicle (14a-b) comprising: \n * a rear automatic coupling device (18); means (21) for speed determination;\n * a control unit (20) adapted to estimate the position of the rear automatic coupling device (18) while the vehicle (14a-b) is in motion based on the heading of the vehicle (14a-b), the position of a part of the vehicle (14a-b) as determined by a navigation system (22) of the vehicle, a vehicle model representing the vehicle (14a-b), the height of the rear automatic coupling device (18) as determined by a height level sensor (24), and in case the vehicle (14a-b) is an articulated vehicle, at least one articulation angle of the vehicle as detected by at least one articulation angle detection means (28) on the vehicle; and\n * communication means (26) adapted to wirelessly send information (64) indicating the estimated position and the speed of the rear automatic coupling device (18) to a following vehicle (14b-c).\n | 20. A vehicle (14b-c) comprising: \n * a front coupling device (30) including a front coupling element (32);\n * a control unit (34) adapted to estimate the position of the front coupling element (32) while the vehicle (14b-c) is in motion;\n * a navigation system (36) and a height level sensor (38);\n * communication means (40) adapted to wirelessly receive information (64) from a vehicle (14a-b) immediately ahead, which information (64) indicates the position and speed of a rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead;\n * autonomous driving means (42) adapted to drive the vehicle (14b-c) based at least on the position and speed in the received information (64) so that the rear automatic coupling device (18) of the vehicle (14a-b) immediately ahead gets within an operational range (66) of the front coupling element (32); and\n * means (48a-b, 54) adapted to automatically adjust the front coupling device (30), while in motion and when the rear automatic coupling device (18) is within the operational range (66), so that the position of the front coupling element (32) matches the position of the rear automatic coupling device (18) as indicated in the received information (64).", + "output": "The method involves detecting (S6) second road section which stipulates coupling together the vehicles of the platoon to form the long vehicle combination. The information is sent (S7) to the following vehicle through wireless vehicle-to-vehicle communication. The information indicates the position and speed of rear automatic coupling device of the vehicle immediately ahead. The following vehicle is automatically driven (S8) so that the rear automatic coupling device of the vehicle immediately ahead of the following vehicle gets within an operational range of front coupling element of the following vehicle. The front coupling device is automatically adjusted (S9) so that the position of the front coupling element matches the position of the rear automatic coupling device as indicated in the sent information. 
The following vehicle and the vehicle immediately ahead are automatically coupled (S10) together while the vehicles are in motion to form portion of the long vehicle combination. An INDEPENDENT CLAIM is included for a vehicle. Method for transforming between long vehicle combination and platoon on move, for heavy duty vehicle e.g. truck. The following vehicle can receive the correct speed allowing to safely drive so that the rear automatic coupling device of the vehicle immediately ahead gets within the operational range, even if one of the systems fails. The acceleration or deceleration request sent through wireless vehicle-to-vehicle communication can allow following vehicle to safely drive within the operational range, even if the operational range results in relatively short headway between the following vehicle and the vehicle immediately ahead and even if the speed is relatively high. The long vehicle combination can be automatically re-formed in motion after the roundabout or turn, to improve fuel efficiency. The front coupling device is automatically adjusted so that the position of the front coupling element matches the position of the rear automatic coupling device as indicated in the sent information. The drawing shows a flowchart illustrating the method for transforming between long vehicle combination and platoon on move. S6Step for detecting second road section which stipulates coupling together the vehicles of the platoon to form the long vehicle combinationS7Step for sending information to the following vehicle through wireless vehicle-to-vehicle communicationS8Step for automatically driving following vehicle so that the rear automatic coupling device of the vehicle immediately ahead of the following vehicle gets within an operational range of front coupling element of the following vehicleS9Step for automatically adjusting front coupling device so that the position of the front coupling element matches the position of the rear automatic coupling device as indicated in the sent informationS10Step for automatically coupling together the following vehicle and the vehicle immediately ahead while the vehicles are in motion to form portion of the long vehicle combination", + "instruction": "Please summarize the input" + }, + { + "input": "A METHOD FOR PROVIDING A POSITIVE DECISION SIGNAL FOR A VEHICLEA method for providing a positive decision signal for a vehicle which is about to perform a traffic scenario action. The method includes receiving information about at least one surrounding road user, which information is indicative of distance to the surrounding road user with respect to the vehicle and at least one of speed and acceleration of the surrounding road user; calculating a value based on the received information; providing the positive decision signal to perform the traffic scenario action when the calculated value is fulfilling a predetermined condition. The value is calculated based on an assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration.|1. 
A method for providing a positive decision signal for a vehicle which is about to perform a traffic scenario action, such as entering a crossing, entering a highway and/or changing lanes, the method comprising: \nreceiving information about at least one surrounding road user, which information is indicative of distance to the surrounding road user with respect to the vehicle and at least one of speed and acceleration of the surrounding road user; \ncalculating a value based on the received information; \nproviding the positive decision signal to perform the traffic scenario action when the calculated value is fulfilling a predetermined condition, wherein the value is calculated based on an assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration, characterized in that, the surrounding road user is a predefined virtual surrounding road user and the predetermined condition is defined by a threshold value which is indicative of an acceleration limit for the surrounding road user. \n | 2. The method according to claim 1, wherein the value is further calculated based on the assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration after a reaction time.\n | 3. The method according to claim 1, wherein the value is further calculated based on the assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration to an acceleration profile having a constant acceleration.\n | 4. The method according to claim 1, wherein the value is further calculated based on the assumption that the surrounding road user will react on the traffic scenario action by changing its acceleration to an acceleration profile having a variable acceleration.\n | 5. The method according to claim 1, further comprising providing a negative decision signal not to perform the traffic scenario action when the calculated value is not fulfilling the predetermined condition.\n | 6. The method according to claim 1, wherein the threshold value is variable depending on at least one factor, such as any one of speed of the surrounding road user, type of surrounding road user, ambient weather conditions with respect to the vehicle and a state of the surrounding road user, such as a state where a turning indicator is active.\n | 7. The method according to claim 1, wherein the information about the at least one surrounding road user is received by any one of a perception sensor of the vehicle, a V2X communication interface and a remote perception sensor which is in communicative contact with the vehicle.\n | 8. The method according to claim 1, wherein the method is used as a safety control method for an autonomous vehicle, wherein the autonomous vehicle is primarily performing traffic scenario actions by use of a primary autonomous vehicle control method, and wherein a traffic scenario action permitted to be performed by the primary autonomous vehicle control method is allowed to be performed if also the positive decision signal is provided.\n | 9. The method according to claim 1, wherein the calculated value is further based on auxiliary information relating to the traffic scenario action, such as any one of shape and/or dimension(s) of a crossing, a road lane and a neighboring road lane.\n | 10. 
A method for automatically performing a traffic scenario action of a vehicle, comprising: \nproviding a positive decision signal to perform the traffic scenario action, which positive decision signal has been provided according to the method of claim 1; and\nautomatically performing the traffic scenario action. \n | 11. A method for automatically avoiding performing a traffic scenario action of a vehicle, comprising: \nproviding a negative decision signal not to perform the traffic scenario action, which negative decision signal has been provided according to the method of claim 5; and\nautomatically avoiding performing the traffic scenario action. \n | 12. A control unit for a vehicle which is configured to perform the steps of claim 1.\n | 13. A vehicle comprising the control unit according to claim 12.\n | 14. The vehicle according to claim 13, wherein the vehicle is a fully autonomous or semiautonomous vehicle.\n | 15. The vehicle according to claim 13, wherein the vehicle is a road vehicle, such as a public road vehicle, for example a truck, a bus and a construction equipment vehicle adapted to be driven on a road.\n | 16. The vehicle according to claim 13, wherein the vehicle is a heavy-duty vehicle which has a minimum weight of at least 5000 kg, such as 30.000 kg.\n | 17. A computer program comprising program code means for performing the steps of claim 1, when said program is run on a computer.\n | 18. A computer readable medium carrying a computer program comprising program code means for performing the steps of claim 1, when said program product is run on a computer.", + "output": "The method involves receiving information about the surrounding road user (2), which information is indicative of distance to the surrounding road user with respect to the vehicle and the speed and acceleration of the surrounding road user. A value is calculated based on the received information. The positive decision signal is provided to perform the traffic scenario action, when the calculated value is fulfilled a predetermined condition. The value is calculated based on an assumption that the surrounding road user is reacted on the traffic scenario action by changing the acceleration of the vehicle. INDEPENDENT CLAIMS are included for the following:a vehicle;a computer program for providing positive decision signal for vehicle; anda computer readable medium carrying computer program for providing positive decision signal for vehicle. Method for providing positive decision signal for vehicle (claimed). The surrounding road user increases and reduces the speed when the vehicle initiates lane change to the nearby lane to avoid the risk of collision. The improved and cost-efficient redundancy for the autonomous vehicle is implied. The drawing shows a schematic view of the traffic scenario. 1Heavy-duty truck2Surrounding road user", + "instruction": "Please summarize the input" + }, + { + "input": "method for the gap between vehicle control teamThe invention claims a vehicle gap (24a-24c) between control vehicle in method (10), the train (10) comprises a preceding vehicle (12) and one or a plurality of following vehicle (14a-14c). 
The method comprises the following steps: obtaining an indicator parameter (27) of a potential collision threat (26) recognized by an autonomous emergency braking system (16) of the lead vehicle, wherein the autonomous emergency braking system of the lead vehicle comprises a plurality of predefined control phases (28a-28c), and wherein the indicator parameter at least partially indicates the current control phase of the autonomous emergency braking system; and transmitting the obtained indicator parameter to the one or more following vehicles.|1. A method for controlling inter-vehicle gaps (24a-24c) in a platoon (10), the platoon (10) comprising a lead vehicle (12) and one or more following vehicles (14a-14c), wherein the method is characterized by the following steps: obtaining an indicator parameter (27) of a potential collision threat (26) identified by an autonomous emergency braking system (16) of the lead vehicle, wherein the autonomous emergency braking system of the lead vehicle comprises a plurality of predefined control phases (28a-28c), and wherein the indicator parameter at least partially indicates the current control phase of the autonomous emergency braking system, and transmitting the obtained indicator parameter to the one or more following vehicles.\n | 2. The method according to claim 1, further comprising the following steps: receiving the indicator parameter in the one or more following vehicles, and automatically adjusting the inter-vehicle gaps based on the received indicator parameter.\n | 3. The method according to claim 1 or 2, wherein the indicator parameter is a time to collision.\n | 4. The method according to claim 2 or 3, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indicator parameter comprises: a following vehicle (14c) of the one or more following vehicles generating a time to collision (TTC14C) by subtracting a predetermined time, depending on the position of the following vehicle in the platoon, from an avoidance time, and the following vehicle adjusting the gap (24c) to the preceding vehicle (14b) based on the reduced time to collision.\n | 5. The method according to claim 2, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indicator parameter is started at the last vehicle (14c) of the platoon, so as to increase the gap (24c) between the last vehicle (14c) and the preceding vehicle (14b).\n | 6. The method according to claim 2, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indicator parameter is started before the full braking phase (28c) of the autonomous emergency braking system of the lead vehicle.\n | 7. The method according to claim 2, further comprising: informing the driver of the last vehicle (14c) of the platoon how the gap (24c) between the last vehicle (14c) and the preceding vehicle (14b) is to be adjusted.\n | 8. The method according to claim 1, wherein the transmission of the indicator parameter is performed using a vehicle-to-vehicle communication device (18).\n | 9. The method according to claim 2, wherein the reception of the indicator parameter is performed using vehicle-to-vehicle communication devices (32a-32c).\n | 10. The method according to any one of the preceding claims, further comprising: determining a friction-based estimate of the deceleration capacity of the lead vehicle.\n | 11. The method according to claims 2 and 10, wherein the step of automatically adjusting the inter-vehicle gaps based on the received indicator parameter also takes the estimated deceleration capacity into account.\n | 12. A method for controlling inter-vehicle gaps (24a-24c) in a platoon (10), the platoon (10) comprising a lead vehicle (12) and one or more following vehicles (14a-14c), wherein the method is characterized by the following steps: receiving, in the one or more following vehicles, an indicator parameter (27) of a potential collision threat (26) identified by an autonomous emergency braking system (16) of the lead vehicle, wherein the autonomous emergency braking system of the lead vehicle comprises a plurality of predefined control phases (28a-28c), and wherein the indicator parameter at least partially indicates the current control phase of the autonomous emergency braking system, and automatically adjusting the inter-vehicle gaps based on the received indicator parameter.\n | 13. A computer program comprising program code for performing the steps of the method according to any one of claims 1-12 when said program is run on a computer.\n | 14. A computer readable medium carrying a computer program comprising program code for performing the steps of the method according to any one of claims 1-12 when said program product is run on a computer.\n | 15. A control unit (22, 34a-34c) for controlling inter-vehicle gaps in a platoon, the control unit being configured to perform the steps of the method according to any one of claims 1-12.\n | 16. A vehicle (12; 14a-14c) configured to perform the steps of the method according to any one of claims 1-12.", + "output": "The method involves obtaining an indicator (27) of a potential collision threat (26) identified by an autonomous emergency braking system (16) of a lead vehicle, where the braking system of the lead vehicle comprises pre-defined control phases, and the indicator determines the current control phase of the braking system. The obtained indicator is sent to the following vehicles. The indicator is received in the following vehicles. Inter-vehicle gaps (24a-24c) are automatically adjusted based on the received indicator. INDEPENDENT CLAIMS are also included for the following: a computer program comprising a set of instructions for controlling inter-vehicle gaps in a platoon; a computer readable medium comprising a set of instructions for controlling inter-vehicle gaps in a platoon; a control unit for controlling inter-vehicle gaps in a platoon; and a vehicle. Method for controlling inter-vehicle gaps between vehicles (claimed), e.g. lorries, buses and passenger cars, in a platoon. The method allows the lead vehicle to remain predictable for the following vehicles even if a slippery or low-friction road reduces its deceleration capacity and calls for earlier braking, and allows a buffer distance to be built up to mitigate the effects of different braking capacities of the vehicles in the platoon. The method enables road vehicles to be driven in platoons or road trains with small time gaps so as to save fuel and decrease driver workload and road footprint in an effective manner. The drawing shows a schematic view of a platoon. 
10Platoon16Autonomous emergency braking system24a-24cInter-vehicle gaps26Potential collision threat27Indicator", + "instruction": "Please summarize the input" + }, + { + "input": "A METHOD FOR FORMING A VEHICLE COMBINATIONThe present disclosure relates to a method for selecting and identifying a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers, the method being implemented by one or more processors of a wireless control system, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic, the method comprising: receiving (S10) a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination; evaluating (S20) the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic; selecting (S30) a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation; locating (S40) the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and communicating (S50) the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to primary vehicle.|1. A method for selecting and identifying a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers, the method being implemented by one or more processors of a wireless control system, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic, the method comprising: \n * - receiving (S10) a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination;\n * - evaluating (S20) the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic;\n * - selecting (S30) a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation;\n * - locating (S40) the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and\n * - communicating (S50) the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to the primary vehicle.\n | 2. Method according to claim 1, further comprising selecting one or more trailers among a group of trailers based at least in part on the mission-characteristic for the vehicle combination and the selected powered dolly vehicle; communicating the location of the selected one or more trailers to the powered dolly vehicle, or to the primary vehicle; and operating the powered dolly vehicle to couple with the one or more trailers.\n | 3. 
Method according to any one of the preceding claims, wherein the mission-characteristic of the vehicle combination comprises any one of an assignment instruction for the vehicle combination, a cargo space-requirement for the vehicle combination, a pick-up location of the cargo, a pick-up time for the cargo, a delivery time for the cargo, a delivery location of the cargo, and data indicating type of cargo.\n | 4. Method according to any one of the preceding claims, wherein the mission-characteristic of the vehicle combination comprises data indicating type of primary vehicle.\n | 5. Method according to any one of the preceding claims, further comprising receiving data relating to environmental conditions.\n | 6. Method according to any one of the preceding claims, wherein the associated operational characteristic comprises data indicating any one of a brake capacity of the powered dolly vehicle, energy storage system capacity of the powered dolly vehicle, and state of charge of the energy storage system of the powered dolly vehicle.\n | 7. Method according to any one of the preceding claims, wherein the associated operational characteristic comprises data indicating type of powered dolly vehicle.\n | 8. Method according any one of the preceding claims, wherein the associated distinguishable identification information comprises an identification component configurable to be updated by the one or more processors of a wireless control system.\n | 9. Method according to any one of the preceding claims, wherein the evaluating comprises determining if at least one operational characteristic of at least one of the powered dolly vehicles fulfils the mission-characteristic, or is at least sufficient for fulfilling the mission-characteristic.\n | 10. Method according any one of the preceding claims, wherein the request is received at a remote-control source from the primary vehicle, the remote-control source comprising a transceiver for receiving the request from the autonomous vehicle.\n | 11. Method according to claim 10, wherein the remote-control source comprises a memory configurable to contain and store the associated distinguishable identification information and operational characteristic of each one of the powered dolly vehicles.\n | 12. Method according any one of the claims 10 to 11, comprising receiving the request from the primary vehicle at the remote-control source when the primary vehicle arrives at the geographical area.\n | 13. Method according any one of the preceding claims, wherein the primary vehicle comprises a memory configurable to contain any one of the associated distinguishable identification information and operational characteristic of each one of the powered dolly vehicles.\n | 14. Method according to any one of the preceding claims, comprising obtaining the operational characteristic directly from powered dolly vehicles.\n | 15. Method according to any one of the preceding claims, comprising controlling any one of the primary vehicle and the selected powered dolly vehicle to couple to each other so as to form the vehicle combination.\n | 16. 
A computer program comprising instructions, which when executed by one or more processors of a wireless control system, cause the one or more processors to perform operations comprising: receiving a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination; evaluating the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic; selecting a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation; locating the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and communicating the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to primary vehicle.\n | 17. A non-transitory computer-readable medium comprising instructions, which when executed by one or more processors of a control system, cause the one or more processors to perform operations comprising: receiving a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination; evaluating the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic; selecting a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation; locating the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and communicating the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to primary vehicle.\n | 18. A wireless control system for identifying and selecting a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic, the system comprising a memory that stores a set of instructions and one or more processors which use the instructions from the set of instructions to: \n * - receive a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on a mission-characteristic for the vehicle combination;\n * - evaluate the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic;\n * - select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation;\n * - locate the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and\n * - communicate the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to primary vehicle.\n | 19. Wireless control system according to claim 19, further comprising a communication interface operably coupled to the one or more processors for receiving instructions and for transmitting the location of the selected powered dolly vehicle to the primary vehicle.\n | 20. 
A vehicle for forming a vehicle combination with a powered dolly vehicle and one or more trailers, comprising a memory that stores a mission-characteristic for the vehicle combination and one or more processors which use the mission-characteristic to: \n * - select a powered dolly vehicle among a group of powered dolly vehicles based at least in part on the mission-characteristic for the vehicle combination, each one of the powered dolly vehicles having an associated distinguishable identification information and an operational characteristic;\n * - evaluate the operational characteristic of each one of the powered dolly vehicles based on said mission-characteristic;\n * - select a powered dolly vehicle among the group of powered dolly vehicles based at least in part on said evaluation;\n * - locate the selected powered dolly vehicle in the geographical area based at least in part on the identification information; and\n * - communicate the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to vehicle.", + "output": "The method involves receiving (S10) a request from the primary vehicle to select a powered dolly vehicle among the group of powered dolly vehicles based on a mission-characteristic for the vehicle combination. The operational characteristic of each one of the powered dolly vehicles evaluated (S20) based on mission-characteristic. A powered dolly vehicle selected (S30) among the group of powered dolly vehicles based on evaluation. The selected powered dolly vehicle located (S40) in the geographical area based on the identification information. The location of the selected powered dolly vehicle is communicated (S50) to the primary vehicle or operating the powered dolly vehicle to the primary vehicle. INDEPENDENT CLAIMS are included for the following:a computer program for selecting and identifying powered dolly vehicle;a non-transitory computer-readable medium storing program for selecting and identifying powered dolly vehicle;a wireless control system for identifying and selecting a powered dolly vehicle among a group of powered dolly vehicles in a geographical area for forming a vehicle combination with a primary vehicle and one or more trailers; anda vehicle for forming a vehicle combination with a powered dolly vehicle and one or more trailers. Method for selecting and identifying powered dolly vehicle such as electric-powered dolly, steerable dolly vehicles. The method enables providing more efficient transportation vehicle systems that are fully, or partially, autonomous, thus increasing operational capacity of heavy-duty vehicles by vehicle combinations with multiple vehicle units in form of trailer units. The drawing shows a flowchart illustrating method for selecting and identifying powered dolly vehicle. 
S10Step for receiving a request from the primary vehicle to select a powered dolly vehicleS20Step for evaluating the operational characteristic of each one of the powered dolly vehiclesS30Step for selecting a powered dolly vehicleS40Step for locating the selected powered dolly vehicle in the geographical areaS50Step for communicating the location of the selected powered dolly vehicle to the primary vehicle or operating the powered dolly vehicle to the primary vehicle", + "instruction": "Please summarize the input" + }, + { + "input": "Device, vehicle and method for providing vehicle driver support in a potential overtake scenarioA device (1), a vehicle (2) and a method (100) are provided for supporting a vehicle driver during manual or semi-autonomous driving in a potential overtake scenario. The device (1) comprises a detector (4); a communication unit (5) arranged to receive, from a preceding vehicle (3), probable route information based on at least one of historical route information and navigation system information of the preceding vehicle (3); and a processing unit (6) arranged to determine, for one or more routes ahead of the host vehicle (2), the probability that the preceding vehicle (3) will drive along the one or more routes, to process the received information into a format suitable for display, and to have the processed information displayed, thereby providing support to the vehicle driver.|1. A device (1) for providing support to a driver of a host vehicle (2) during manual or semi-autonomous driving in a potential overtake scenario, wherein said device (1) comprises: a detector (4), the detector (4) being arranged to detect at least one of a closing velocity between the host vehicle (2) and a preceding vehicle (3) and a distance between the host vehicle (2) and the preceding vehicle (3); a communication unit (5), the communication unit (5) being arranged to receive, from the preceding vehicle (3), probable route information based on at least one of historical route information and navigation system information of the preceding vehicle (3); a processing unit (6), said processing unit (6) being arranged to receive from the detector (4) information indicating at least one of the closing velocity between the host vehicle (2) and the preceding vehicle (3) and the distance between the host vehicle (2) and the preceding vehicle (3), to trigger the communication unit (5) to receive the probable route information of the preceding vehicle (3) if the detected closing velocity is greater than a closing velocity threshold or if the detected distance to the preceding vehicle (3) is less than a distance threshold, to determine, for one or more routes ahead of the host vehicle (2), the probability that the preceding vehicle (3) will drive along the one or more routes, the determination being based on at least one of the received historical route information and the received navigation system information, and to process the received information into a format suitable for display; and one or more display units (7), the display unit (7) being arranged to receive the determined route information from the processing unit (6) and to display the processed information, thereby providing support to the driver of the host vehicle.\n | 2. The device (1) according to claim 1, wherein two or more probable routes of the preceding vehicle (3) are determined.\n | 3. The device (1) according to claim 1 or 2, wherein said communication unit (5) is arranged to send a request to one or more preceding vehicles (3) having one or more probable routes, in order to receive information related to said one or more preceding vehicles (3).\n | 4. The device (1) according to claim 1 or 2, wherein said communication unit (5) is configured to communicate with other vehicles through vehicle-to-vehicle communication (V2V).\n | 5. The device (1) according to claim 1 or 2, wherein said communication unit (5) is configured to communicate with other vehicles through vehicle-to-infrastructure-to-vehicle communication (V2I2V).\n | 6. The device (1) according to claim 1 or 2, wherein said one or more display units (7) are arranged to display the processed information only when the distance between the host vehicle (2) and the preceding vehicle (3) detected by the detector (4) is less than the distance threshold.\n | 7. The device (1) according to claim 1, wherein said device (1) comprises a positioning system (8), the positioning system (8) being connected to a map database (9) and configured to continuously determine the position of the host vehicle (2).\n | 8. The device (1) according to claim 1, wherein the one or more display units (7) are configured to display the probability information graphically, with a first graphic element (10a) representing the probability that the preceding vehicle (3) drives along a first route (11a) and a second graphic element (10b) representing the probability that the preceding vehicle (3) drives along a second route (11b).\n | 9. The device (1) according to claims 7 and 8, wherein the positioning system (8) connected to the map database (9) is configured to receive a host vehicle route, the processing unit (6) is arranged to compare the host vehicle route with the determined routes of the one or more preceding vehicles (3), and the one or more display units (7) are arranged to display information indicating the distance over which two or more routes overlap.\n | 10. The device (1) according to claims 7 and 8, wherein the positioning system (8) connected to the map database (9) is configured to receive a host vehicle route, the processing unit (6) is arranged to compare the host vehicle route with the determined routes of the one or more preceding vehicles (3), and the one or more display units (7) are arranged to display information indicating the distance along the host vehicle route over which the one or more preceding vehicles (3) will probably drive.\n | 11. A vehicle (2), wherein the vehicle (2) comprises the device (1) according to any one of the preceding claims.\n | 12. A method (100) for providing support to a vehicle driver during manual or semi-autonomous driving in a potential overtake scenario, wherein the method (100) comprises: detecting (101) at least one of a closing velocity between the host vehicle and a preceding vehicle and a distance between the host vehicle and the preceding vehicle; receiving (102), at the processing unit from the detector, information indicating at least one of the closing velocity between the host vehicle and the preceding vehicle and the distance between the host vehicle and the preceding vehicle; if the detected closing velocity between the host vehicle and the preceding vehicle is greater than a closing velocity threshold or if the detected distance is less than a distance threshold, triggering (103), by the processing unit, the communication unit to receive from the preceding vehicle probable route information based on at least one of historical route information and navigation system information of the preceding vehicle; determining (104), by the processing unit, for one or more routes ahead of the host vehicle, the probability that the preceding vehicle will drive along the one or more routes, said determination being based on at least one of the received historical route information and the received navigation system information; processing (105) the received information into a format suitable for display; and displaying (106) the processed information by one or more display units.", + "output": "The arrangement (1) has a detector (4) to detect closing velocity and/or distance between a host vehicle (2) and a preceding vehicle (3). A processing unit (6) triggers a communication unit (5) to receive information on a probable route of preceding vehicle when a detected closing velocity is above threshold velocity or distance is below threshold distance. The probability that preceding vehicle will drive along routes is determined for route ahead of host vehicle. The received information is processed for display (7a). An INDEPENDENT CLAIM is included for a method for providing vehicle driver support for a driver of a host vehicle during manual or semi-autonomous driving in a potential overtake scenario. Arrangement in host vehicle (claimed) for providing vehicle driver support during manual or semi-autonomous driving. The vehicle driver is supported such that unnecessary overtaking is avoided, since the probability that the preceding vehicle will drive along the one or more route ahead of the host vehicle is determined and arranged to be displayed. The vehicles are enabled to share information between them in an easy, reliable and cost efficient manner. The drawing shows a schematic view of the vehicle and the arrangement in the vehicle for providing vehicle driver support during manual or semi-autonomous driving in the potential overtake scenario. 1Arrangement2Host vehicle3Preceding vehicle4Detector5Communication unit6Processing unit7aDisplay", + "instruction": "Please summarize the input" + }, + { + "input": "Apparatus and method for prediction of time available for autonomous driving, in a vehicle having autonomous driving capProvided are a method and an apparatus (1) for prediction of time available for autonomous driving, in a vehicle (2) having autonomous driving capabilities and comprising remote sensors (3) arranged to acquire vehicle surrounding information (4) and vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6), as well as a vehicle (2) comprising such an apparatus (1). At least one of a positioning arrangement (7) that provides map data with associated information; a route planning arrangement (8) that enables route planning; and a real time information acquiring arrangement, that acquires at least one of real time traffic information (9a) and real time weather information (9b). The time available is calculated based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated information, real time traffic information (9a) and real time weather information (9b), for the planned route. The calculated time is output to a human machine interface (11) arranged in a vehicle (2).|1. 
An apparatus (1) for prediction of time available for autonomous driving, in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * remote sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * the apparatus further comprising: \n * at least one of: a positioning arrangement (7) arranged to provide map data with associated speed limit and road infrastructure information; a route planning arrangement (8); and an arrangement for acquiring real time information (9), including at least one of real time traffic information (9a) and real time weather information (9b), and further\n * a processor (10) arranged to calculate a time available for autonomous driving based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated speed limit and infrastructure information, real time traffic information (9a) and real time weather information (9b),\n * associated with the planned route; and\n * a human machine interface (11) arranged to output to a vehicle (2) passenger compartment (12) the calculated time available for autonomous driving along the planned route characterized in that the processor (10) further is arranged to calculate a hand over time, required for hand over from autonomous driving to manual driving, and to include this calculated hand over time in the calculation of time available for autonomous driving.\n | 2. An apparatus (1) according to claim 1, characterized in that the processor (10) is arranged to calculate the time available for autonomous driving based on at least road infrastructure information, real time traffic information (9a) and real time weather information (9b).\n | 3. An apparatus (1) according to any one of claims 1 to 2, characterized in that the processor (10) further is arranged to calculate the time available for autonomous driving based on certified road sections allowed to drive autonomously on.\n | 4. An apparatus (1) according to any one of claims 1 to 3, characterized in that the arrangement for acquiring real time information (9), when present, comprises an interface for communication via one or more portable communication devices of vehicle occupants for acquiring the real time information.\n | 5. An apparatus (1) according to any one of claims 1 to 3, characterized in that the arrangement for acquiring real time information (9), when present, comprises an interface for performing at least one of vehicle-to-vehicle and vehicle-to-infrastructure communication for acquiring the real time information.\n | 6. 
A method for prediction of time available for autonomous driving, in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * remote sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * the method comprising at least one of the steps of: \n * providing map data with associated speed limit and road infrastructure information using a positioning arrangement (7);\n * performing route planning using a route planning arrangement (8); and\n * acquiring real time information, including at least one of real time traffic information (9a) and real time weather information (9b), and the steps of: \n * calculating, using a processor (10), a time available for autonomous driving based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated speed limit and infrastructure information,\n * real time traffic information (9a) and real time weather information (9b), associated with the planned route, and calculating a hand over time, required for hand over from autonomous driving to manual driving, and including this calculated hand over time in the calculation of time available for autonomous driving; and\n * outputting, to a human machine interface (11) arranged in a vehicle (2) passenger compartment (12), the calculated time available for autonomous driving along the planned route.\n | 7. An automotive vehicle (2) having autonomous driving capabilities characterized in that it comprises an apparatus (1) for prediction of time available for autonomous driving according to any one of claims 1 to 5.", + "output": "The apparatus (1) has a processor (10) arranged to calculate time available for autonomous driving based on a planned route and one of vehicle surrounding information (4), vehicle dynamics parameters (6), map data with associated speed limit and infrastructure information, real time traffic information (9a) and real time weather information (9b) associated with the planned route. A human machine interface (11) is arranged to output calculated time available for autonomous driving along the planned route to a vehicle and passenger compartment (12). Apparatus for prediction of time available for autonomous driving in an automotive vehicle (claimed). The apparatus calculates hand over time in calculation of time available for autonomous driving, thus ensuring that time available for autonomous driving is not less than time required for hand over from autonomous driving to manual driving so as to ensure that a vehicle driver does not suffer stressful and potentially dangerous transition to manual driving. The apparatus ensures that provision of an interface for communication through portable communication devices of vehicle occupants for acquiring real time information enables realization of less complex and cost effective apparatus or alternatively provision of a redundant back-up channel for acquiring real time information. The drawing shows a schematic view of an apparatus for prediction of time available for autonomous driving in a vehicle with autonomous driving capabilities. 
1Apparatus for prediction of time available for autonomous driving in automotive vehicle4Vehicle surrounding information6Vehicle dynamics parameters9aReal time traffic information9bReal time weather information10Processor11Human machine interface12Vehicle and passenger compartment", + "instruction": "Please summarize the input" + }, + { + "input": "DEVICE AND METHOD FOR SAFETY STOPPAGE OF AN AUTONOMOUS ROAD VEHICLEDevice and method for safety stoppage of an autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode thereof. Processing means (7) continuously: predict where a drivable space (8) exists; calculate and store a safe trajectory (10) to a stop within the drivable space (8); determine a current traffic situation; determine any disturbances in sensor data, vehicle systems or components enabling the autonomous drive mode. If an incapacitating disturbance is determined, a request for a driver to take over control is signaled and determined if a driver has assumed control within a pre-determined time. If not, the vehicle (2) is controlled to follow the most recent safe trajectory (10) to a stop in a safe stoppage maneuver during which, or after stopping, one or more risk mitigation actions adapted to the determined current traffic situation are performed.|1. A safety stoppage device (1) of an autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the autonomous road vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode of the autonomous road vehicle (2) by an autonomous drive control unit (6) thereof,\ncharacterized in that it comprises processing means (7) arranged to continuously: \n * predict where a drivable space (8) exists, based on data from the sensors (4);\n * calculate and store to memory means (9) of the autonomous drive control unit (6) a safe trajectory (10) to a stop within the drivable space (8);\n * determine from at least the localization system (3) and the sensors (4) a current traffic situation;\n * determine any disturbances in sensor data, vehicle systems or components enabling the autonomous drive mode of the autonomous road vehicle (2); and\n * if a disturbance is determined, such that the autonomous drive mode is incapacitated, signal to a driver environment of the autonomous road vehicle (2) a request for a driver to take over control of the autonomous road vehicle (2) and, determine if control of the autonomous road vehicle (2) has been assumed by a driver thereof within a pre-determined time, and ,upon a negative determination to control the autonomous vehicle (2) by the autonomous drive control unit (6) to follow the most recently calculated safe trajectory (10) to a stop within the drivable space (8) in a safe stoppage maneuver, wherein, during performance of such a safe stoppage maneuver or after the autonomous road vehicle (2) has stopped, the safety stoppage device (1) further is arranged to perform one or more risk mitigation actions adapted to the determined current traffic situation.\n | 2. 
The safety stoppage device (1) according to claim 1, characterized in that the processing means (7) further are arranged to continuously estimate a risk associated with performing the safe stoppage maneuver in the determined current traffic situation and to adapt the one or more risk mitigation actions to the estimated risk.\n | 3. The safety stoppage device (1) according to claims 2, characterized in that the processing means (7) further are arranged to adapt at least one of timing and intensity of the one or more risk mitigation actions to the estimated risk.\n | 4. The safety stoppage device (1) according to any one of claims 1 to 3, characterized in that the processing means (7) further are arranged to signal the request to take over control of the autonomous road vehicle (2) to a driver environment of the autonomous road vehicle (2) using means (11, 12, 13) for visual, audible or haptic communication, or any combination thereof.\n | 5. The safety stoppage device (1) according to any one of claims 1 to 4, characterized in that the one or more risk mitigation actions comprises at least one of: increasing the magnitude of the request for a driver to take over control of the autonomous road vehicle (2); activating hazard lights (14) of the autonomous road vehicle (2); activating a horn (15) of the autonomous road vehicle (2); warning or informing other traffic participants trough vehicle-to-vehicle communication (16); notifying a traffic control center (17) that a safe stoppage maneuver is in progress or completed; warning trailing vehicles (18) by blinking tail or brake lights (19) of the autonomous road vehicle (2).\n | 6. The safety stoppage device (1) according to any one of claims 1 to 5, characterized in that the safety stoppage device (1) further is arranged to activate the one or more risk mitigation actions a predetermined time period after the autonomous road vehicle (2) has come to a stop.\n | 7. The safety stoppage device (1) according to any one of claims 1 to 5, characterized in that the safety stoppage device (1) further is arranged to activate the one or more risk mitigation actions during performance of the safe stoppage maneuver.\n | 8. The safety stoppage device (1) according to any one of claims 1 to 5, characterized in that the safety stoppage device (1) further is arranged to activate the one or more risk mitigation actions after the autonomous vehicle (2) has stopped.\n | 9. The safety stoppage device (1) according to any one of claims 1 to 8, characterized in that it further comprises driver monitoring means (20) for determining a physical state of a driver of the autonomous road vehicle (2) and that the safety stoppage device (1) further is arranged to adapt the one or more risk mitigation actions to the monitored physical state of a driver of the autonomous road vehicle (2).\n | 10. The safety stoppage device (1) according to claim 9, characterized in that the safety stoppage device (1) further is arranged to adapt the one or more risk mitigation actions to be performed earlier when the monitored physical state of a driver of the autonomous road vehicle (2) indicates an incapacitated driver.\n | 11. 
The safety stoppage device (1) according to any one of claims 9 to 10, characterized in that it further is arranged to monitor and store to the memory means (9) data related to safe stoppage maneuver incidents where a monitored physical state of a driver of the autonomous road vehicle (2) indicates these safe stoppage maneuver incidents to be caused by a reckless driver and to deactivate the autonomous drive mode of the autonomous road vehicle (2) after a predetermined number (n) of such incidents.\n | 12. The safety stoppage device (1) according to any one of claims 1 to 11, characterized in that it further comprises communication means (21) for communicating with a traffic control center (17), such that the traffic control center (17) is allowed to monitor the position of the autonomous road vehicle (2) and trigger the safety stoppage device (1) to perform the one or more risk mitigation actions when the monitored the position of the autonomous road vehicle (2) indicates that it is stationary in a potentially unsafe position.\n | 13. A method for safety stoppage of an autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the autonomous road vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode of the autonomous road vehicle (2) by an autonomous drive control unit (6) thereof,\ncharacterized in that it comprises using processing means (7) for continuously: \n * predicting where a drivable space (8) exists, based on data from the sensors (4);\n * calculating and storing to memory means (9) of the autonomous drive control unit (6) a safe trajectory (10) to a stop within the drivable space (8);\n * determining from at least the localization system (3) and the sensors (4) a current traffic situation;\n * determining any disturbances in sensor data, vehicle systems or components enabling the autonomous drive mode of the autonomous road vehicle (2); and\n * if a disturbance is determined, such that the autonomous drive mode is incapacitated, signaling to a driver environment of the autonomous road vehicle (2) a request for a driver to take over control of the autonomous road vehicle (2) and, determining if control of the autonomous road vehicle (2) has been assumed by a driver thereof within a pre-determined time, and ,upon a negative determination controlling the autonomous vehicle (2) by the autonomous drive control unit (6) to follow the most recently calculated safe trajectory (10) to a stop within the drivable space (8) in a safe stoppage maneuver,\n * and, during performance of such a safe stoppage maneuver or after the autonomous road vehicle (2) has stopped, performing one or more risk mitigation actions adapted to the determined current traffic situation.\n | 14. 
An autonomous road vehicle (2) having a localization system (3) and sensors (4) for monitoring the autonomous road vehicle (2) surroundings and motion, and a signal processing system (5) for processing sensor signals enabling an autonomous drive mode of the autonomous road vehicle (2) by an autonomous drive control unit (6) of the autonomous road vehicle (2), characterized in that it comprises a safety stoppage device (1) according to any one of claims 1 to 12.", + "output": "The device (1) has a processing unit (7) for determining whether control of an autonomous road vehicle (2) has been assumed by a driver within pre-determined time and following calculated safe trajectory to a stop within drivable space in a safe stoppage maneuver based on negative determination to control the autonomous road vehicle by an autonomous drive control unit (6), where the device performs risk mitigation actions adapted to determined current traffic situation during performance of the safe stoppage maneuver or after stopping the autonomous road vehicle. An INDEPENDENT CLAIM is also included for a method for facilitating safety stoppage of an autonomous road vehicle. Safety stoppage device for an autonomous road vehicle. The device continuously estimates risk associated with performing the safe stoppage maneuver in the determined current traffic situation, and adapts the risk mitigation actions to the estimated risk for providing reduced risk of accident when the autonomous vehicle is to be stopped, and the driver is not capable of taking control over the vehicle. The drawing shows a schematic view of an autonomous road vehicle comprising a safety stoppage device. 1Safety stoppage device2Autonomous road vehicle4Sensors6Autonomous drive control unit7Processing unit", + "instruction": "Please summarize the input" + }, + { + "input": "Apparatus and method for continuously establishing a boundary for autonomous driving availability and an automotive vehicle comprising such an apparatusProvided are a method and an apparatus (1) for continuously establishing a boundary for autonomous driving availability, in a vehicle (2) having autonomous driving capabilities and comprising remote sensors (3) for acquiring vehicle surrounding information (4) and vehicle dynamics sensors (5) for determining vehicle dynamics parameters (6), as well as a vehicle (2) comprising such an apparatus (1). At least one of a positioning arrangement (7) that provides map data with associated information; a route planning arrangement (8) that enables route planning; a vehicle driver monitoring arrangement (9) that provides driver monitoring information (10); and a real time information acquiring arrangement, that acquires at least one of traffic information (11 a) and weather information (11 b). The boundary is calculated based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), driver monitoring information (10), map data, traffic information (11a) and weather information (11b), for the planned route. Changes in the calculated boundary are output to a human machine interface (13) arranged in the vehicle (2).|1. 
An apparatus (1) for continuously establishing a boundary for autonomous driving availability in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * further comprising: \n * at least one of: a positioning arrangement (7) arranged to provide map data with associated speed limit and road infrastructure information; a route planning arrangement (8); a vehicle driver monitoring arrangement (9) arranged to provide vehicle driver monitoring information (10); and an arrangement for acquiring real time information (11), including at least one of real time traffic information (11a) and real time weather information (11b), and\n * a processor (12) arranged to continuously calculate a boundary for autonomous driving availability based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), vehicle driver monitoring information (10), map data with associated speed limit and infrastructure information, real time traffic information (11a) and real time weather information (11b), associated with the planned route, and\n * characterized in that it further comprises: \n * a human machine interface (13) arranged to output to a vehicle (2) passenger compartment (14) information on any changes in the calculated boundary for autonomous driving availability along the planned route,\n * wherein the human machine interface (13) is arranged to present the information graphically to a display indicating a distance to the calculated boundary for autonomous driving availability.\n | 2. An apparatus (1) according to claim 1, characterized in that it further comprises an interface for communicating the information on any changes in the calculated boundary for autonomous driving availability along the planned route to an autonomous drive control unit of the vehicle.\n | 3. An apparatus (1) according to any one of claims 1 to 2, characterized in that the human machine interface (13) further is arranged to output to the vehicle (2) passenger compartment (14) information relating to changes in automation level available with the current calculated boundary for autonomous driving availability.\n | 4. An apparatus (1) according to any one of claims 1 to 3, characterized in that the arrangement for acquiring real time information (11), when present, comprises an interface for communication via one or more portable communication devices of vehicle (2) occupants for acquiring the real time information.\n | 5. An apparatus (1) according to any one of claims 1 to 4, characterized in that the arrangement for acquiring real time information (11), when present, comprises an interface for performing at least one of vehicle-to-vehicle and vehicle-to-infrastructure communication for acquiring the real time information.\n | 6. An apparatus (1) according to any one of claims 1 to 5, characterized in that it further comprises an interface for communicating the information on any changes in the calculated boundary for autonomous driving availability along the planned route externally of the vehicle (2) using at least one or more portable communication devices of vehicle (2) occupants, vehicle-to-vehicle communication and vehicle-to-infrastructure communication.\n | 7. 
A method for continuously establishing a boundary for autonomous driving availability in a vehicle (2) having autonomous driving capabilities, the vehicle (2) comprising: \n * sensors (3) arranged to acquire vehicle surrounding information (4);\n * vehicle dynamics sensors (5) arranged to determine vehicle dynamics parameters (6);\n * the method comprising at least one of the steps of: \n * providing map data with associated speed limit and road infrastructure information using a positioning arrangement (7);\n * performing route planning using a route planning arrangement (8);\n * monitoring a vehicle driver and providing vehicle driver monitoring information (10) using a vehicle driver monitoring arrangement (9); and\n * acquiring real time information, including at least one of real time traffic information (11a) and real time weather information (11b), and the step of:\n continuously calculating, using a processor (12), a boundary for autonomous driving availability based on based on a planned route and at least one of vehicle surrounding information (4), vehicle dynamics parameters (6), vehicle driver monitoring information (10), map data with associated speed limit and infrastructure information, real time traffic information (11a) and real time weather information (11b), associated with the planned route, \n * characterized in that the method further comprises:\n outputting, to a human machine interface (13) arranged in a vehicle (2) passenger compartment (14), information on any changes in the calculated boundary for autonomous driving availability along the planned route, \n * wherein the human machine interface (13) is arranged to present the information graphically to a display indicating a distance to the calculated boundary for autonomous driving availability.\n | 8. An automotive vehicle (2) having autonomous driving capabilities characterized in that it comprises an apparatus (1) for continuously establishing a boundary for autonomous driving availability according to any one of claims 1 to 6.", + "output": "The apparatus (1) has a processor (12) continuously calculating a boundary for autonomous driving availability based on a planned route and one of vehicle surrounding information (4), vehicle dynamics parameters (6), vehicle driver monitoring information (10), map data with associated speed limit and infrastructure information, real time traffic information (11a), and real time weather information (11b) associated with the planned route. A human-machine interface (13) outputs information on any changes in the calculated boundary along the planned route to the passenger compartment (14). An INDEPENDENT CLAIM is also included for a method for continuously establishing a boundary for autonomous driving availability in a vehicle having autonomous driving capabilities. Apparatus for continuously establishing boundary for autonomous driving availability in vehicle having autonomous driving capabilities. The provision of a continuously calculated boundary for autonomous driving availability and a human machine interface arranged to output to a vehicle passenger compartment information on any changes in the calculated boundary for autonomous driving availability along the planned route promotes the driver's trust in the autonomous driving capabilities of the vehicle as well as increases the driver's readiness to assume manual control of the vehicle if so required. 
The provision of a request for hand over from autonomous driving to manual driving provides sufficient time for safe hand over from autonomous driving to manual driving, thus ensuring that the vehicle driver does not suffer a stressful and potentially dangerous transition to manual driving. The provision of communicating the information on any changes in the calculated boundary for autonomous driving availability along the planned route to an autonomous drive control unit of the vehicle enables vehicle systems performs adaptations in dependence upon the available degree of automation indicated by the calculated boundary for autonomous driving availability. The provision of a human machine interface arranged to output to the vehicle passenger compartment information relating to changes in automation level available with the current calculated boundary for autonomous driving availability enables a driver of the vehicle to become aware of why adaptations in in the autonomous drive, e.g. towards a higher or lower degree of automation, is made and also enables the driver to continuously monitor the autonomous drive while retaining a feeling of control. The provision of outputting the information to the vehicle passenger compartment through one of a graphical, an audio or a tactile output arrangement provides options for ensuring that the information reaches the vehicle driver, irrespective of his/her current focus. The provision of an interface for communication via one or more portable communication devices of vehicle occupants for acquiring the real time information enables either the realization of a less complex and more cost effective apparatus or alternatively the provision of a redundant back-up channel for acquiring the real time information. The provision of an interface for performing one of vehicle-to-vehicle and vehicle-to-infrastructure communication for acquiring the real time information enables the realization of an effective apparatus for acquiring real time information which is highly relevant for the current surroundings. The drawing shows the schematic diagram of an apparatus for continuously establishing a boundary for autonomous driving availability, in a vehicle having autonomous driving capabilities. 1Apparatus4Vehicle surrounding information6Vehicle dynamics parameters10Vehicle driver monitoring information11aReal time traffic information11bReal time weather information12Processor13Human-machine interface14Passenger compartment", + "instruction": "Please summarize the input" + }, + { + "input": "CONCEPT OF COORDINATING AN EMERGENCY BRAKING OF A PLATOON OF COMMUNICATIVELY COUPLED VEHICLESThe present invention provides a concept of coordinating emergency braking of a platoon 100 of communicatively connected vehicles 110 . In response to the emergency situation 120 , individual braking control settings are centrally determined for one or more vehicles 110 of the platoon 100 by the management entity 110 - 3 managing the platoon 100 ( 230 ). ). The individual braking control settings are communicated from the management entity 110 - 3 to the one or more vehicles 110 of the platoon 100 . One or more vehicles 110 of the platoon 100 brake 250 according to each individual brake control setting received from the management entity 110 - 3 .|1. 
A method (200) for adjusting emergency braking of a platoon (100) of communicatively connected vehicles (110), wherein an emergency (120) is detected by the vehicle (110-1) of the platoon a step 210 of;\nbroadcasting (220) an emergency message from the vehicle (110-1) to other vehicles (110-2, 110-3) of the platoon in response to the detection of the emergency situation (120); \nin response to receiving the emergency message from the vehicle (110-1), forming a braking pressure of the other vehicles (110-2, 110-3) of the platoon; \nIndividual braking for one or more vehicles 110 of the platoon 100 by the management entity 110 - 3 managing the platoon 100 after generating the braking pressure in response to the emergency situation 120 . determining (230) control settings; \ntransmitting (240) the individual braking control settings from the management entity (110-3) to one or more vehicles (110) of the platoon (100); \nA method of adjusting emergency braking, comprising the step of braking (250) one or more vehicles (110) of the platoon (100) according to respective individual braking control settings received from the management entity (110-3) (200). \n | 2. The method (200) of claim 1, wherein the individual braking control settings indicate respective braking forces to be applied.\n | 3. The emergency braking according to claim 1 or 2, further comprising the step of notifying the management body (110-3) about at least one of individual characteristics and current conditions of each vehicle of the platoon (100). How to adjust (200).\n | 4. A method (200) according to claim 3, wherein determining the individual braking control settings of the vehicle (110) is based on at least one of an individual characteristic and a current condition of each vehicle.\n | 4. The method (200) according to claim 3, wherein at least one of the individual characteristics and the current state comprises at least one of weight, braking force, inter-vehicle distance, speed, and tire condition.\n | 6. Method (200) according to claim 1 or 2, wherein the plurality of vehicles (110) of the platoon (100) are each an autonomous vehicle or an at least partially autonomous vehicle. .\n | 3. A method (200) according to claim 1 or 2, wherein the plurality of vehicles (110) in the platoon communicate via a vehicle-to-vehicle communication system.\n | 8. The emergency braking according to claim 1 or 2, wherein the management body (110-3) is a vehicle of the platoon (100) acting as a master vehicle with respect to other vehicles of the platoon acting as a slave vehicle. How to adjust (200).\n | 9. 
In a platoon 100 of a plurality of communicatively connected vehicles 110, a first vehicle configured to detect an emergency situation and broadcast an emergency message to other vehicles in the platoon in response to the detected emergency situation ( 110-1), wherein in response to receiving the emergency message from the first vehicle (110-1), a braking pressure of another vehicle in the platoon is established;\na second vehicle (110-3) configured to determine a respective brake control parameter for each vehicle of the platoon in response to the emergency message and send the respective individual brake control parameter to each vehicle of the platoon; \na third vehicle (110-2) configured to adjust its braking setting according to its respective individual braking control parameter, wherein prior to reception of the individual braking control parameter by the second vehicle (110-3), the a plurality of communications, wherein in response to receiving the emergency message from the first vehicle 110 - 1 , a braking pressure is established in the second vehicle 110 - 3 and the third vehicle 110 - 2 . Platoon 100 of vehicles 110 connected to each other.", + "output": "The method (200) involves determining (230) individual braking control settings for one or more vehicles of the platoon by a managing entity managing the platoon in response to an emergency situation. The individual braking control settings from the managing entity are communicated (240) to one or more vehicles of the platoon. One or more vehicles of the platoon are braked (250) in accordance with the respective individual braking control settings received from the managing entity. INDEPENDENT CLAIMS are included for the following:a system of several communicatively coupled vehicle; anda vehicle. Method for coordinating emergency braking of platoon of communicatively coupled vehicle (claimed). The managing entity can be kept up to date with respect to current vehicle parameters, leading to more accurate individual braking control settings. The accurate prediction and coordination of the individual emergency braking maneuvers can be allowed. The drawing shows a flowchart illustrating the process for coordinating emergency braking of platoon of communicatively coupled vehicle. 200Method for coordinating emergency braking of platoon of communicatively coupled vehicle210Step for detecting emergency situation by vehicle of platoon230Step for determining individual braking control settings for one or more vehicles of platoon240Step for communicating individual braking control settings from managing entity to one or more vehicles of platoon250Step for braking one or more vehicles of platoon", + "instruction": "Please summarize the input" + }, + { + "input": "Method for providing travel route presettingThe invention relates to a method for providing travel route presetting (100) for a travel route system of a vehicle (300), the method comprises the following steps: providing a plurality of detected tracks (101) of other vehicles (320) in the route section (150) to be travelled, determining track preset (102) from the detected tracks (101), determining a deviation area (110) according to the detected tracks (101), wherein the deviation area (110) is determined according to the deviation between at least each detected track (101) and the track preset (102), and the travel route preset (100) is determined at least according to the track preset (102) and the deviation area (110). 
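A rough, editorial sketch of the centrally determined braking settings described in the platoon claims above: the managing entity turns the mass and braking capability reported by each member into one brake-force setting per vehicle. The uniform-deceleration rule and all names and numbers are assumptions, not the patent's own logic.

```python
# Rough illustration only: the managing entity derives one brake-force setting per
# platoon member from the mass and braking capability each member reported earlier.
from dataclasses import dataclass


@dataclass
class MemberState:
    vehicle_id: str
    mass_kg: float
    max_brake_force_n: float


def plan_emergency_braking(members: list[MemberState]) -> dict[str, float]:
    """A common deceleration keeps the inter-vehicle gaps constant, so the manager
    picks the largest deceleration every member can actually deliver and converts
    it back into an individual brake-force setting per vehicle."""
    feasible_decel = min(m.max_brake_force_n / m.mass_kg for m in members)
    return {m.vehicle_id: m.mass_kg * feasible_decel for m in members}


if __name__ == "__main__":
    platoon = [MemberState("truck_1", 18000, 130000),
               MemberState("truck_2", 24000, 150000),
               MemberState("truck_3", 20000, 120000)]
    for vid, force_n in plan_emergency_braking(platoon).items():
        print(f"{vid}: apply {force_n:.0f} N")   # truck_3 limits the platoon to 6 m/s^2
```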
& #10; In addition, the present invention relates to a travel route system for a vehicle (300), comprising: a receiving module (301), for receiving the track (101) detected in the route section (150) to be travelled, a calculating unit (302), the receiving module (301) sends the detected track (101) to the calculating unit, -the computing unit is adapted to determine a trajectory preset (101) from the detected trajectory (101) and to determine a deviation area (110) based on at least the deviation of each detected trajectory (101) from the trajectory preset (102), and determining a travel route preset (100) based at least on the trajectory preset (102) and the deviation area (110).|1. A method for providing a travel route presetting (100) for a travel route system of a vehicle (300), the method comprising the steps of: providing a plurality of detected tracks (101) of other vehicles (320) in the route section (150) to be travelled, determining track presets (102) from the detected tracks (101), determining a deviation area (110) according to the detected trajectory (101), wherein the deviation area (110) is determined according to the deviation between at least each detected trajectory (101) and the trajectory preset (102), wherein the deviation area (110) is set as The deviation area (110) surrounds the track presetting (102), and thus forms the following area in the route section (150) to be travelled: when the vehicle (300) should move from the first route point (30) to the second route point (40), the vehicle (300) can preferentially stay in the area, wherein the detected track (101) and the track preset (102) extend between the first route point (30) and the second route point (40), determining the travel route preset (100) at least according to the track preset (102) and the deviation area (110).\n | 2. The method according to claim 1, wherein the detected trajectory (101) is transmitted by moving data and/or by vehicle-to-vehicle communication and/or between the vehicle (300) and/or infrastructure (400).\n | 3. The method according to claim 1 or 2, wherein the track presetting (102) is the average track of the detected track (101), wherein the deviation area is calculated by the standard deviation of the detected track (101).\n | 4. The method according to claim 1 or 2, wherein the following steps are further executed: detecting the sensor information about the environment of the vehicle (300) in the driving route, determining the track preset (103) based on the sensor according to the sensor information, and adapting the travel route presetting (100) according to the sensor-based trajectory presetting (103) and the deviation area (110).\n | 5. The method according to claim 1 or 2, wherein the weighting factor is considered when determining the track presetting (102) and/or the deviation area (110). wherein the weighting factor takes into account the time between determining the detected trajectory (101) and the time at which the vehicle (300) plans to travel through the route section (150), and/or the type of the other vehicle (320) of the detected trajectory (101).\n | 6. The method according to claim 1 or 2, wherein determining the interval of the travel route presetting (100) depends on the travel speed of the vehicle (300).\n | 7. The method according to claim 1 or 2, wherein the vehicle (300) is pre-set (100) by the at least partially autonomous vehicle controller using the travel route so as to guide the vehicle (300) in the transverse and/or longitudinal direction.\n | 8. 
The method according to claim 3, wherein the mean trajectory is determined by averaging through an arithmetic mean value of the detected trajectory (101).\n | 9. The method according to claim 4, wherein the travel route presetting (100) is adapted when the sensor information is specific to an obstacle (50) in a route section (150) to be travelled by the vehicle (300).\n | 10. A travel route system for a vehicle (300), the travel route system comprising: a receiving module (301), for receiving the track (101) detected in the route section (150) to be travelled, a calculating unit (302), the receiving module (301) sends the detected track (101) to the calculating unit, the calculation unit is adapted to determine a trajectory preset (102) from the detected trajectory (101) and determine a deviation area (110) based on at least a deviation of each detected trajectory (101) from the trajectory preset (102), and at least according to the track preset (102) and the deviation area (110) determining the driving route preset (100), wherein the deviation area (110) is set to, the deviation area (110) surrounds the track preset (102), and thereby forming the following areas in the route section (150) to be travelled: when the vehicle (300) should move from the first route point (30) to the second route point (40), the vehicle (300) can preferentially stay in the area, wherein the detected trajectory (101) and the trajectory presetting (102) extend between the first route point (30) and the second route point (40).\n | 11. The travel route system according to claim 10, wherein the sensor (303) suitable for detecting the environment around the vehicle (300) is connected with the calculating unit (302) so as to send the sensor information to the calculating unit (302). wherein the calculating unit (302) is adapted to determine a sensor-based trajectory presetting (103) based on the sensor information so as to additionally determine a travel route presetting (100) based on the sensor-based trajectory presetting (103).\n | 12. The travel route system according to claim 10 or 11, wherein the travel route system is adapted to perform the method for providing travel route presetting (100) according to any one of claims 1 to 8.", + "output": "The method involves providing detected trajectories of vehicles in a path section to be traveled. A trajectory specification is determined from the detected trajectories. A deviation zone is determined from the detected trajectories. The deviation zone is determined on the basis of deviation of individual detected trajectories from the trajectory specification. The route specification is determined based on the trajectory specification and the deviation zone. An INDEPENDENT CLAIM is included for route system of vehicle. Method for providing route specification for route system (claimed) of vehicle e.g. car. The significance and the benefit of the trajectory specification is improved. The provision of a particularly safe and precise route specification is enabled. The quality of the route specification and the safety is increased. The unnecessary arithmetic operations are avoided in sections in which few changes are required. The unnecessary adaptation of the route is avoided. The drawing shows a schematic view of the route section to be traveled with different trajectories. 
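The trajectory preset and deviation area of claims 3 and 8 (arithmetic mean of the detected trajectories, spread taken from their standard deviation) can be sketched with numpy as follows; the array layout, the factor k and all names are assumptions made purely for illustration.

```python
# Sketch of claims 3/8: trajectory preset = arithmetic mean of the detected
# trajectories, deviation area = a band of k standard deviations around it.
import numpy as np


def trajectory_preset_and_zone(detected: np.ndarray, k: float = 2.0):
    """detected has shape (n_vehicles, n_points, 2): x/y samples of each detected
    trajectory at matching positions between the first and second waypoint.
    Returns (mean trajectory, lower band edge, upper band edge)."""
    mean_traj = detected.mean(axis=0)      # the trajectory preset
    spread = detected.std(axis=0)          # per-point scatter of the detected tracks
    return mean_traj, mean_traj - k * spread, mean_traj + k * spread


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 100.0, 11)
    # five detected trajectories: a straight line with small lateral noise
    detected = np.stack([np.column_stack([x, 0.3 * rng.standard_normal(x.size)])
                         for _ in range(5)])
    preset, lower, upper = trajectory_preset_and_zone(detected)
    print(preset[5], lower[5], upper[5])   # preset point and deviation-area edges
```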
10,20First and second axes30First waypoint", + "instruction": "Please summarize the input" + }, + { + "input": "Method for the autonomous or partly autonomous execution of a cooperative driving maneuverA method for autonomously or semi-autonomously carrying out a cooperative driving maneuver and a vehicle. Provision is made for a maneuvering vehicle which plans the execution of a driving maneuver to determine a maneuvering area of a road in which the driving maneuver is potentially executed, to communicate with one or more vehicles via vehicle-to-vehicle communication to detect one or more cooperation vehicles which will presumably be inside the maneuvering area during the execution of the driving maneuver, and to adapt its own driving behavior to the presumable driving behavior of the one or more cooperation vehicles to execute the planned driving maneuver. The disclosure provides a possibility which, by vehicle-to-vehicle communication, allows vehicles for jointly carrying out a cooperative driving maneuver to be identified and then allows the cooperative driving maneuver to be executed.The invention claimed is: \n | 1. A method for autonomously or semi-autonomously carrying out a cooperative driving maneuver, wherein a maneuvering vehicle plans execution of a driving maneuver, wherein during the method the maneuvering vehicle executes operations for planning and execution of a single driving maneuver comprising: \ndetermining a maneuvering area of a road in which the driving maneuver is potentially executed; \ncommunicating with other vehicles via vehicle-to-vehicle communication during the planning of the execution of the cooperative driving maneuver; \nfiltering communications received by the maneuvering vehicle via vehicle-to-vehicle communication from the vehicles, wherein the filtering is performed to determine which vehicles are relevant to carrying out the maneuvering vehicle's planned driving maneuver to detect cooperation vehicles which are presumed to be inside the maneuvering area during the execution of the driving maneuver; \ndetermining message formats of the messages received by the maneuvering vehicle via vehicle-to-vehicle communication; \ndetermining potential cooperation vehicles based on the message formats transmitted by the vehicles, and \nadapting the maneuvering vehicle's own driving behavior to presumable driving behavior of the one or more cooperation vehicles of the potential cooperation vehicles to execute the planned driving maneuver, \nwherein, in response to a determination that transmitted message formats are Environmental Perception Messages from the potential cooperation vehicles, wherein the Environmental Perception Messages contain information about free areas between the potential cooperation vehicles from the potential cooperation vehicles received by the maneuvering vehicle, the potential cooperation vehicles are determined to be one or more cooperation vehicles. \n | 2. The method of claim 1, wherein the maneuvering vehicle determines an approach period in which the maneuvering vehicle is presumed to reach the maneuvering area.\n | 3. 
The method of claim 1, wherein the maneuvering vehicle executes at least one of the following operations to detect the cooperation vehicles: \ndetermining approach areas of the road, from which the maneuvering area is theoretically reached within the approach period; \ndetermining potential cooperation vehicles which are in the approach areas based on the data received via vehicle-to-vehicle communication, predicting the driving behavior of each potential cooperation vehicle based on the data received via vehicle-to-vehicle communication; \npredicting its own driving behavior; and \ncomparing the predicted driving behavior of the one or more potential cooperation vehicles with the maneuvering vehicle's own predicted driving behavior to determine the plurality of cooperation vehicles. \n | 4. The method of claim 1, wherein the maneuvering vehicle continuously detects and evaluates the development of free areas between vehicles and, for this purpose, executes the following operations: \ndetermining the minimum size of a free area for carrying out the driving maneuver, the vehicle dimensions of the maneuvering vehicle and/or a safety distance; \ncomparing the size of newly detected free areas with the determined minimum size; and \nselecting a suitable free area presumed to be inside the maneuvering area during the planned execution of the maneuver and to have at least the minimum size for executing the driving maneuver. \n | 5. The method of claim 4, wherein the adaptation of the driving behavior of the maneuvering vehicle to the presumable driving behavior of the cooperation vehicles comprises adapting the trajectory of the maneuvering vehicle to reach the selected free area in response to the selected free area being inside the maneuvering area, wherein executability being cyclically checked as the free area is approached.\n | 6. The method of claim 1, further comprising: \ndetermining a maneuvering area of a road in which a driving maneuver of a vehicle is expected; \ncommunicating between the cooperation vehicle and one or more vehicles via vehicle-to-vehicle communication to detect a maneuvering vehicle which plans the execution of a driving maneuver and is presumed to be inside the maneuvering area during the execution of the driving maneuver; and \nadapting the driving behavior of the cooperation vehicle to the presumable driving behavior of the maneuvering vehicle to assist with the planned driving maneuver of the maneuvering vehicle. \n | 7. The method of claim 6, wherein the cooperation vehicle determines the maneuvering area and/or an approach period in which it will presumably reach the maneuvering area.\n | 8. 
The method of claim 6, wherein the cooperation vehicle executes at least one of the following operations to detect the maneuvering vehicle: \ndetermining the message formats of the messages received via vehicle-to-vehicle communication; \ndetermining one or more potential maneuvering vehicles based on the message format transmitted by these vehicles; \ndetermining approach areas of the road, from which the maneuvering area is theoretically reached within the approach period; \ndetermining one or more potential maneuvering vehicles which are in the approach areas by the data received via vehicle-to-vehicle communication; \npredicting the driving behavior of each potential maneuvering vehicle by the data received via vehicle-to-vehicle communication; \npredicting its own driving behavior; and \ncomparing the predicted driving behavior of the one or more potential maneuvering vehicles with the maneuvering vehicle's own predicted driving behavior to determine one or more maneuvering vehicles. \n | 9. The method of claim 6, wherein the adaptation of the driving behavior of the cooperation vehicle to the presumable driving behavior of the maneuvering vehicle comprises the following operations: \nadapting the trajectory of the cooperation vehicle to enlarge a free area in which the driving maneuver of the maneuvering vehicle is executed inside the maneuvering area. \n | 10. A transportation vehicle comprising: \na communication device for communicating with other transportation vehicles by vehicle-to-vehicle communication; and \nthe vehicle being configured to, operate as a maneuvering vehicle and/or as a cooperation vehicle, autonomously or semi-autonomously carrying out a cooperative driving maneuver, wherein in operating as a maneuvering vehicle, which plans the execution of a driving maneuver, the vehicle executes the following operations: \ndetermining a maneuvering area of a road in which the driving maneuver is potentially executed; \ncommunicating with vehicles via vehicle-to-vehicle communication with each of a plurality of other vehicles that are presumed to be inside the maneuvering area of the road during execution of the driving maneuver, filtering the communications received via vehicle-to-vehicle communication from the other vehicles according to vehicles which are relevant to carrying out the planned driving maneuver to detect cooperation vehicles; \ndetermining message formats of the messages received via vehicle-to-vehicle communication from these vehicles; \ndetermining potential cooperation vehicles based on the message formats transmitted by these vehicles; and \nadapting the maneuvering vehicle's own driving behavior to the presumable driving behavior of the cooperation vehicles to execute the planned driving maneuver, \nwherein in response to a determination that transmitted message formats are Environmental Perception Messages from the potential cooperation vehicles containing information about free areas between the potential cooperation vehicles, the potential cooperation vehicles are determined to be cooperation vehicles for the maneuvering vehicle. \n | 11. The method of claim 1, wherein information of each Environmental Perception Message is determined by sensors in each potential cooperation vehicle.\n | 12. The method of claim 11, wherein the sensors comprise radar sensors and the Environmental Perception Messages are transmitted several times a second.\n | 13. 
The vehicle of claim 10, wherein information of each Environmental Perception Message is determined by sensors in each potential cooperation vehicle.\n | 14. The vehicle of claim 13, wherein the sensors comprise radar sensors and the Environmental Perception Messages are transmitted several times a second.\n | 15. The method of claim 1, wherein the driving maneuver comprises merging into a flow of vehicles, wherein the communicating with the other vehicles includes communicating with a plurality of vehicles presumed to be in a maneuvering area during the execution of the merging.\n | 16. The transportation vehicle of claim 10, wherein the driving maneuver comprises merging into a flow of vehicles, wherein the communicating with the other vehicles includes communicating with a plurality of vehicles presumed to be in a merging area during the execution of the merging.", + "output": "The method involves determining a maneuvering area (28) of a road, in which the driving maneuver is potentially executed, and communicating with the vehicles by vehicle-to-vehicle communication. The filtering is carried out according to vehicles, which are relevant to carrying out the planned driving maneuver to detect the cooperation vehicles (26). The maneuvering vehicle own driving behavior is adapted to the presumable driving behavior of the cooperation vehicles to execute the planned driving maneuver. Method for autonomously or semi-autonomously carrying out a cooperative driving maneuver for vehicle (Claimed). The method involves determining a maneuvering area of a road, in which the driving maneuver is potentially executed, and communicating with the vehicles by vehicle-to-vehicle communication, and hence ensures safe and reliable driving maneuver carrying out method. The drawing shows a schematic representation of traffic situation. 26Cooperation vehicles28Maneuvering area30Road34Maneuvering vehicle36Approach areas", + "instruction": "Please summarize the input" + }, + { + "input": "Method and control system for determining a traffic gap between two vehicles for changing lanes of a vehicleThe present invention relates to a vehicle-to-vehicle communication system and a method for determining a traffic gap between two vehicles for changing lanes of a vehicle. The method includes identifying 110 a traffic gap based on a first detection and based on a second detection. The first detection is based on at least one vehicle-to-vehicle status message of at least one other vehicle 200. The second detection is based on the on-board sensor system of the vehicle 100. \n\n|1. A method for determining a traffic gap between two vehicles for lane change of a vehicle 100, the method comprising: identifying (110) a traffic gap based on a first detection and based on a second detection, the second detection 1 detection is based on at least one vehicle-to-vehicle status message of at least one other vehicle 200, and the second detection is based on an on-board sensor system of the vehicle 100, the step of identifying ( 110);\nDetecting (155) that the identifying (110) did not identify a traffic gap; And a driving intention message based on the step 155 of detecting that the identifying step 110 does not identify a traffic gap.- The driving intention message includes information on a request for a future lane change of the vehicle 100. 
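A minimal sketch of the message filtering and free-area selection in the cooperative-maneuver patent above: keep only Environmental Perception Messages, require a free area of at least the vehicle length plus twice the safety distance, and choose one lying inside the maneuvering area. The field names and numbers are assumptions.

```python
# Sketch only; message fields and thresholds are assumed, not taken from a standard.
from dataclasses import dataclass


@dataclass
class V2XMessage:
    sender_id: str
    msg_format: str                  # e.g. "CAM" or "EPM"
    free_area_length_m: float = 0.0
    free_area_start_m: float = 0.0   # position of the reported gap along the road


def select_free_area(messages, ego_length_m, safety_margin_m, maneuvering_area):
    """Return (sender_id, position) of a suitable free area, or None."""
    min_size = ego_length_m + 2 * safety_margin_m
    candidates = [m for m in messages
                  if m.msg_format == "EPM"
                  and m.free_area_length_m >= min_size
                  and maneuvering_area[0] <= m.free_area_start_m <= maneuvering_area[1]]
    if not candidates:
        return None
    best = max(candidates, key=lambda m: m.free_area_length_m)
    return best.sender_id, best.free_area_start_m


if __name__ == "__main__":
    msgs = [V2XMessage("carA", "CAM"),
            V2XMessage("carB", "EPM", free_area_length_m=9.0, free_area_start_m=140.0),
            V2XMessage("carC", "EPM", free_area_length_m=4.0, free_area_start_m=160.0)]
    print(select_free_area(msgs, ego_length_m=4.5, safety_margin_m=1.5,
                           maneuvering_area=(100.0, 200.0)))   # ('carB', 140.0)
```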
Including the step of transmitting 160, wherein the rough detection of the traffic gap is performed in the first detection, and the precise detection of the traffic gap detected in the first detection in the second detection is performed. , A method for determining a traffic gap between two vehicles. \n | 2. The vehicle-to-vehicle status message according to claim 1, wherein the at least one vehicle-to-vehicle status message includes information on at least one of a location and a trajectory of the at least one other vehicle (200), and the first detection is performed on the at least one other vehicle ( 200) based on information about at least one of the location and trajectory.\n | 3. The method according to claim 1 or 2, wherein the identifying step (110) is further based on a third detection based on a vehicle-to-vehicle message comprising environmental information of the at least one other vehicle (200), and the environment The information is based on a sensor record of the environment of the at least one other vehicle 200 by at least one on-board sensor of the at least one other vehicle 200.\n | 3. The method of claim 1 or 2, further comprising: longitudinally adjusting (120) the vehicle parallel to the identified traffic gap; And transversely adjusting (130) the vehicle by changing lanes parallel to the identified traffic gap.\n | 5. The method of claim 4, wherein the longitudinal adjustment step (120) corresponds to the adjustment of the speed or position of the vehicle (100) in the driving direction, or the longitudinal adjustment step (120) is a speed for an adaptive cruise control system. -Including the step of providing a time curve, or the longitudinal adjustment step 120 includes displaying a longitudinal adjustment aid for the driver of the vehicle 100, or the vehicle 100 is automatically The method corresponds to the driving vehicle 100, wherein the longitudinal adjustment step 120 corresponds to the longitudinal control of the autonomous driving vehicle 100 based on the identified traffic gap.\n | 6. The method of claim 4, wherein the transverse adjustment step (130) corresponds to the adjustment of the position of the vehicle (100) in the horizontal direction with respect to the driving direction, or the longitudinal adjustment step (120) is performed by the vehicle (100). When the position is set parallel to the identified traffic gap, the lateral adjustment step 130 is performed, or the lateral adjustment step 130 includes a driver-led automatic lane change, or the lateral adjustment step ( 130) includes the step of displaying a lateral direction adjustment assistance means for a driver of the vehicle 100, or the vehicle 100 corresponds to an autonomous vehicle 100, and the lateral direction adjustment step 130 Is corresponding to the lateral control of the self-driving vehicle 100.\n | 3. A method according to any of the preceding claims, further comprising the step (150) of determining the driving intention of a driver of the vehicle with respect to a lane change.\n | 8. The method of claim 7, further comprising transmitting (160) a driving intention message based on the step of determining the driving intention (150).\n | 9. 
A method for a vehicle (205), the method comprising: receiving (210) a driving intention message including a lane change request from an inquiry vehicle (100);\nAs a step 220 of detecting information on cooperation during cooperative driving control with the inquiry vehicle 100, the information on cooperation includes cooperation in consideration of whether the vehicle 205 is possible as a cooperation partner and traffic conditions. The step of detecting 220 is to present whether the movement is possible based on the driving intention message; \nTo make it possible to calculate whether the interruption request can be met within the range of possible cooperation, information about at least one gap for at least one of the front vehicle and the rear vehicle is detected (232), and the driving control is performed. Based on the information, the information on the at least one interval, the speed of the vehicle 205 and the distance to the possible cooperation range, the execution of driving control is detected (234), and whether driving control is possible in consideration of the traffic situation By calculating whether or not (236), determining (230) information about the driving control; And providing (240) a driving assistance for executing driving control, wherein the method includes exchanging a vehicle-to-vehicle adjustment message for coordinating cooperative driving control with at least one other vehicle 200 Further comprising, upon receipt of a message regarding acceptance of a lane change request from the at least one other vehicle 200, the detecting step 220, the determining step 230 and the providing step 240 The method for vehicle 205, at least one of which is interrupted. \n | 10. The vehicle according to claim 9, wherein the providing (240) of the driving assistance corresponds to an automatic or semi-automatic execution of driving control, or the providing (240) of the driving assistance is performed by means of a human-machine interface. The method corresponding to the step of providing guidance for the driver of 205 regarding the implementation of the driving maneuver.\n | 11. The method of claim 9 or 10, wherein the providing step further comprises providing a message regarding acceptance of a lane change request between the inquiry vehicle 100 and the at least one other vehicle 200. Way.\n | 12. In the control system 10 for vehicle 100, identifying a traffic gap does not identify a traffic gap, so as to identify a traffic gap between two vehicles based on a first detection and a second detection. In order to detect, if identifying the traffic gap does not identify the traffic gap, the vehicle in parallel to the identified traffic gap to transmit a driving intent message containing information regarding a future lane change request of the vehicle 100. Is formed to adjust longitudinally and to adjust the vehicle laterally by changing lanes parallel to the identified traffic gap, wherein the first detection is at least one vehicle-to-vehicle status message of at least one other vehicle 200 Based on, and the second detection is based on the on-board sensor system of the vehicle 100, the rough detection of the traffic gap is performed in the first detection, and the detection in the first detection in the second detection A control system for a vehicle, in which precise detection is performed on the resulting traffic gap.\n | 13. 
In the control system 20 for the vehicle 205, the vehicle 205 when cooperative driving control with the inquiry vehicle 100 is received to receive a driving intention message including a lane change request from the inquiry vehicle 100 Calculate whether the interruption request can be met within the scope of possible cooperation, to detect information on cooperation that suggests based on the driving intention message whether it is possible as this cooperation partner and whether cooperation behavior is possible taking into account traffic conditions. In order to enable the detection of information on at least one interval for at least one of a front vehicle and a rear vehicle, information on driving control, information on the at least one interval, and speed of the vehicle 205 And a driving assistant for executing driving maneuvering to determine information about driving maneuvering by detecting the execution of driving maneuvering based on the distance for the possible cooperation range, and calculating whether driving maneuvering is possible in consideration of the traffic condition. To provide stance, And it is formed to exchange a vehicle-to-vehicle adjustment message for coordinating cooperative driving control with at least one other vehicle 205, upon receiving a message regarding acceptance of a lane change request from the at least one other vehicle 200 , At least one of the detecting (220), the determining (230) and the providing (240) is interrupted.", + "output": "The method involves identifying (110) the traffic gap based on a first detection and based on a second detection. The first detection is based on a vehicle-to-vehicle status message of a vehicle (200). The second detection is based on a board sensor system of a vehicle (100). The vehicle-to-vehicle status message comprises information about a position and/or a trajectory of the vehicle (200). The first detection is based on the information about the position and/or the trajectory of the vehicle (200). An INDEPENDENT CLAIM is included for a control system for determining traffic gap between vehicles for lane change for vehicle. Method for determining traffic gap between vehicles for lane change for vehicle e.g. car. The traffic gap between vehicles for lane change for vehicle is determined effectively. The cooperative driving functions of the vehicle are supported efficiently. The drawings show the flow diagrams illustrating the process for determining traffic gap between vehicles for lane change for vehicle, and block diagram of the control system for determining traffic gap between vehicles for lane change for vehicle. (Drawing includes non-English language text) 100,200Vehicles110Step for identifying traffic gap120Step for performing longitudinal regulation corresponding to regulation of speed of vehicle130Step for performing transverse regulation for threading into selected gap150Step for determining driving intention of driver of vehicle", + "instruction": "Please summarize the input" + }, + { + "input": "METHOD FOR RESOURCE ALLOCATION IN A MOBILE COMMUNICATION SYSTEM AND BASE STATION, AND PARTICIPANT COMMUNICATION MODULE FOR THE USE IN THE METHODFor the scenario of vehicles (30) equipped with wireless communication modules (31) that communicate directly with each other on public roads, either for a cooperative or autonomous driving scenario, a very high reliability is very important. 
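The two-stage gap identification in the traffic-gap claims above (a coarse pass over V2V status messages, precise confirmation by the on-board sensors, and a driving-intention message when nothing suitable is found) could look roughly like the following; the 12 m threshold, the simple sensor model and all names are assumptions.

```python
# Sketch only: coarse detection from V2V positions, fine confirmation by a sensor check.
def coarse_gaps(neighbour_positions, min_gap_m):
    """Gaps between consecutive vehicles on the target lane, positions taken from
    their V2V status messages (longitudinal coordinates in metres)."""
    pos = sorted(neighbour_positions)
    return [(a, b) for a, b in zip(pos, pos[1:]) if (b - a) >= min_gap_m]


def identify_gap(neighbour_positions, sensor_check, min_gap_m=12.0):
    for rear, front in coarse_gaps(neighbour_positions, min_gap_m):
        if sensor_check(rear, front) >= min_gap_m:   # precise on-board confirmation
            return rear, front
    return None


if __name__ == "__main__":
    positions_from_v2v = [30.0, 55.0, 63.0]
    sensor_check = lambda rear, front: (front - rear) - 1.0   # sensors see slightly less
    gap = identify_gap(positions_from_v2v, sensor_check)
    if gap is None:
        print("no gap found -> send driving-intention message requesting a lane change")
    else:
        print("adjust longitudinally, then change lanes into the gap between", gap)
```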
With LTE-V, the 3GPP standardization organization has specified a technique called sidelink communication with which the direct communication between cars is possible in the LTE frequency bands. The resources are scheduled in a base station (20) of a mobile communication cell. Since different mobile communication providers are available, there is the problem how to make it possible that participants from different providers can communicate with each other for a cooperative awareness traffic scenario with LTE-V communication. The solution proposed is that each provider will assign a dedicated spectrum (V, T, E, O) that is controlled by each provider itself for resource allocation for its own participants and the participants of other providers. The resource allocation management functionality for the direct communication among the participants from the plurality of providers is shifted from provider to provider from time slice (t_0, t_1, t_2, t_3) to time slice (t_0, t_1, t_2, t_3). This provides for a fair distribution of the resource management functionality among the different providers. At the same time, it avoids the provision of multiple transceiver chains in the communication modules with which the vehicles are equipped.|1. Method for resource allocation in a mobile communication system, comprising a plurality of base stations (20) from a plurality of mobile communication providers and a plurality of participants from the plurality of mobile communication providers, wherein each provider has assigned a dedicated spectrum (V, T, E, O) for resource allocation for its own participants, wherein the participants from the plurality of providers communicate directly among each other, wherein a given provider allocates a part (V2V) of its dedicated spectrum for the direct communication among the participants from the plurality of providers, \n * ? wherein either said given provider will schedule the resources in the part (V2V) of the dedicated spectrum (V, T, E, O) for its own participants and the participants of the other providers by means of a scheduler (225) in a provider owned base station (20), or\n * ? wherein the part (V2V) of a dedicated spectrum (V, T, E, O) of said given provider for the direct communication among the participants from the plurality of providers is divided into sections (V2V_V, V2V_T, V2V_E, V2V_O), with each provider of the plurality of providers having been assigned at least one section (V2V_V, V2V_T, V2V_E, V2V_O) of said part (V2V) of the dedicated spectrum (V, T, E, O) of the given provider, and where a base station (20T, 20V) of each of the plurality of providers other than said given provider will schedule the resources in its assigned section (V2V_V, V2V_T, V2V_E, V2V_O) of said dedicated spectrum (V,T, E, O) for the direct communications of its own participants, wherein the resource allocation management functionality for allocating a part of its dedicated spectrum for the direct communication among the participants from the plurality of providers is shifted from provider to provider from time slice (t_0, t_1, t_2, t_3) to time slice (t_0, t_1, t_2, t_3).\n | 2. Method according to claim 1, wherein the resource allocation functionality is shifted from provider to provider from time slice (t_0, t_1, t_2, t_3) to time slice (t_0, t_1, t_2, t_3) in a round robin fashion, maximum rate queuing fashion or proportionally fair queuing fashion.\n | 3. 
Method according to claim 1 or 2, wherein each provider announces to all other providers which part (V2V) of its dedicated spectrum (V, T, E, O) is reserved for the direct communication among the participants from the plurality of providers.\n | 4. Method according to claim 3, wherein each provider announces to its own participants which section of the announced part (V2V) of the dedicated spectrum (V, T, E, O) is reserved for the direct communication among its own participants.\n | 5. Method according to claim 3 or 4, wherein each provider will schedule resources in its section (V2V_V, V2V_T, V2V_E, V2V_O) of the part of (V2V) the dedicated spectrum (V, T, E, O) for its own participants by means of a scheduler in said provider owned base station (20).", + "output": "The method involves providing base stations from multiple mobile communication providers and multiple participants from the multiple mobile communication providers in which each provider has assigned a dedicated spectrum (V,T,E,O) for resource allocation for its own participants and participants from the providers communicate directly among each other in particular with cooperative awareness messages. The resource allocation management functionality for the direct communication among the participants from the multiple providers is shifted from provider to provider from time slice (t-0-t-3) to time slice. INDEPENDENT CLAIMS are included for the following:a participant communication module; anda base station. Method for resource allocation in mobile communication system. The wireless vehicle communication network can help to reduce the weight of the vehicle by eliminating the need to install cables between the components which communicate. The drawing shows a schematic view illustrating how a portion of a dedicated spectrum in the LTE frequency bands which is allocated for communication is shifted from provider spectrum to provider spectrum per time slice. V,T,E,OSpectrumt-0-t-3Time slice", + "instruction": "Please summarize the input" + }, + { + "input": "METHOD FOR PLANNING A COOPERATIVE DRIVING MANEUVER, CORRESPONDING CONTROL UNIT AND VEHICLE EQUIPPED WITH A CONTROL UNIT AS WELL AS COMPUTER PROGRAMThe proposal concerns a method for planning a cooperative driving maneuver which may be used in the scenario of cooperative driving or autonomous driving. The method comprises the steps of observing the surroundings of a vehicle (10A), determining a planned trajectory (PT) the vehicle (10A) drives on for a certain amount of time, determining a desired trajectory (DT) different from the planned trajectory (PT) requiring a cooperative driving maneuver with at least one of the surrounding vehicles (10B, 10C). The solution according to the invention comprises the steps of determining a timeout value for the cooperative driving maneuver, starting a negotiation phase with the vehicles (10B, 10C) involved in the cooperative driving maneuver by sending a cooperative driving maneuver request message (MCM), waiting for the response messages from the involved vehicles (10B, 10C) and changing to the desired trajectory (DT) if the involved vehicles (10B, 10C) have accepted the desired trajectory (DT) before the negotiation phase has expired according to the timeout value.|1. 
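The time-sliced hand-over of the resource-allocation role between providers in the spectrum-sharing claims above reduces, in the round-robin option named in claim 2, to a simple rotation; the slice length and all names below are assumptions.

```python
# Sketch only: rotate the scheduling role among providers from time slice to time slice.
PROVIDERS = ["V", "T", "E", "O"]     # the dedicated spectra named in the claims
SLICE_DURATION_S = 10.0              # assumed duration of one time slice


def managing_provider(t_seconds: float) -> str:
    """Which provider holds the V2V resource-allocation role at time t."""
    slice_index = int(t_seconds // SLICE_DURATION_S)
    return PROVIDERS[slice_index % len(PROVIDERS)]


if __name__ == "__main__":
    for t in (0.0, 12.0, 25.0, 38.0, 41.0):
        print(f"t={t:5.1f} s: scheduling handled by provider {managing_provider(t)}")
```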
Method for planning a cooperative driving maneuver, comprising the steps of observing the surroundings of a vehicle (10A), determining a planned trajectory (PT) the vehicle (10A) drives on for a certain amount of time, determining a desired trajectory (DT) different from the planned trajectory (PT) requiring a cooperative driving maneuver with at least one of the surrounding vehicles (10B, 10C), characterized by the steps of determining a timeout value (TO) for the cooperative driving maneuver, starting a negotiation phase with the vehicles (10B, 10C) involved in the cooperative driving maneuver by sending a maneuver coordination message (MCM), waiting for the response messages from the involved vehicles (10B, 10C) and changing to the desired trajectory (DT) if the involved vehicles (10B, 10C) have accepted the desired trajectory (DT) before the negotiation phase has expired according to the timeout value (TO).\n | 2. Method according to claim 1, further comprising a step of determining a branch point (BP) corresponding to a point lying on the planned trajectory (PT) and the desired trajectory (DT) at which the planned (PT) and the desired trajectory (DT) separate and checking if the vehicle (10A) will reach the branch point (BP) before the negotiation phase is over according to the determined timeout value (TO) and if yes, terminating the planning of the cooperative driving maneuver and not sending out said maneuver coordination message (MCM).\n | 3. Method according to claim 1 or 2, wherein for the step of determining a timeout value (TO) a step of determining the number of vehicles involved in the cooperative driving maneuver is performed and wherein the typical one-way trip time required for sending a message from one vehicle to another multiplied by the number of vehicles involved in the cooperative driving maneuver is added to the typical time for deciding on the acceptance or rejection of the cooperative driving maneuver to calculate the negotiation time (NT) for the cooperative driving maneuver.\n | 4. Method according to claim 3, wherein the typical round trip time for the internal network transfer in the vehicle (10A) having sent out the maneuver coordination message (MCM) is added to the negotiation time (NT) in order to determine the total negotiation time.\n | 5. Method according to claim 3 or 4, wherein in the vehicle (10A) having sent out the maneuver coordination message (MCM) the typical one-way trip time required for sending a message from one vehicle to another is adapted to the current estimation of the quality of service of the vehicle-to-vehicle radio communication system.\n | 6. Method according to one of claims 3 to 5, wherein in the vehicle (10A) having sent out the cooperative driving maneuver request message the timeout value (TO) is set to the negotiation time (NT) when it is found that the requesting vehicle (10A) will reach the branch point (BP) before the negotiation time (NT) is over.\n | 7. Method according to one of the previous claims, wherein the timeout value (TO) is entered into the payload field of the maneuver coordination message (MCM) to inform the involved vehicles (10B, 10C) about the timeout value for the negotiation phase of the cooperative driving maneuver.\n | 8. 
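The timeout arithmetic of claims 2 to 4 (negotiation time equals the one-way trip time multiplied by the number of involved vehicles, plus the typical decision time and the requester's internal round-trip time, compared against the time needed to reach the branch point) can be written out as a short sketch; the example constants and names are assumptions.

```python
# Sketch only: the maneuver coordination message is sent only if the negotiation
# can finish before the branch point is reached.
def negotiation_time_s(n_vehicles: int,
                       one_way_trip_s: float = 0.05,
                       decision_time_s: float = 0.3,
                       internal_round_trip_s: float = 0.02) -> float:
    return one_way_trip_s * n_vehicles + decision_time_s + internal_round_trip_s


def can_start_negotiation(dist_to_branch_point_m: float, speed_mps: float,
                          n_vehicles: int) -> bool:
    time_to_branch_s = dist_to_branch_point_m / max(speed_mps, 0.1)
    return time_to_branch_s > negotiation_time_s(n_vehicles)


if __name__ == "__main__":
    print(negotiation_time_s(3))                   # roughly 0.47 s for three vehicles
    print(can_start_negotiation(40.0, 30.0, 3))    # ~1.33 s to the branch point -> True
    print(can_start_negotiation(10.0, 30.0, 3))    # ~0.33 s to the branch point -> False
```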
Method according to one of the previous claims, wherein the planned trajectory (PT) and the desired trajectory (DT) is entered into the payload field of the maneuver coordination message (MCM) to inform the involved vehicles (10B, 10C) about the planned cooperative driving maneuver.\n | 9. Method according to one of claims 6 to 8, wherein an involved vehicle (10B, 10C) performs a step of checking the timeout value (TO) in the received maneuver coordination message (MCM) and when it finds that the typical time for deciding on the acceptance or rejection of the cooperative driving maneuver is longer than the reported timeout value (TO), the involved vehicle (10B, 10C) will stop negotiating about the cooperative driving maneuver and transmit back to the requesting vehicle (10A) a message in which the cooperative driving maneuver is rejected.\n | 10. Computing unit, characterized in that, the computing unit (180) is adapted to perform the steps of one of the previous claims.\n | 11. Vehicle, characterized in that, the vehicle (30) is equipped with a computing unit (180) according to claim 10.\n | 12. Computer program, characterized in that, the computer program comprises program steps, which when the program is processed by a computing unit (180), cause it to carry out the method according to one of claims 1 to 9.", + "output": "The method involves observing the surroundings of a vehicle (10A). A planned trajectory (PT) is determined that the vehicle drives on for a certain amount of time. A desired trajectory (DT) id different from the planned trajectory requiring a cooperative driving maneuver with one of the surrounding vehicles (10B, 10C, 10D). A negotiation phase is started with the vehicles involved in the cooperative driving maneuver in response to the steps of determining a timeout value for the cooperative driving maneuver by sending a maneuver coordination message. Waiting is done for the response messages from the involved vehicles and changes are made to the desired trajectory if the involved vehicles have accepted the desired trajectory before the negotiation phase has expired according to the timeout value. INDEPENDENT CLAIMS are included for the following:a computing unit;a vehicle; anda computer program for planning a cooperative driving maneuver. Method for planning a cooperative driving maneuver in a vehicle e.g. mobile robot and driverless transport system, that is utilized in a motorway. Improves efficiency and comfort of automated driving. Ensures simple, reliable and efficient solution for cooperative driving maneuvers supported by vehicle-to-vehicle communication. The drawing shows a schematic view of the cooperative driving scenario. 10AVehicle10B, 10C, 10DSurrounding vehiclesDTDesired trajectoryPTPlanned trajectory", + "instruction": "Please summarize the input" + }, + { + "input": "System and method for using global electorate using regional certificate trust listThe invention claims a system, method and component for managing trust of a plurality of root certificate authority (CA) using both a voter and a regional certificate trust list (CTL). Accordingly, providing a system and method, the system and method is used for managing trust of multiple root CA, and in a more effective manner than the traditional known or can be used for the management. More specifically, the invention claims a system and a method for realizing V2I and/or V2X PKI technology.|1. 
A system for managing trust of a plurality of root certificate issuing mechanisms, which is used for using in the communication between at least two vehicles and vehicle (V2V) of a plurality of transport vehicles in the form of continuous broadcast of the basic safety (BSM), the system comprising: a transport vehicle device located on a transport vehicle of the plurality of transport vehicles; the transport vehicle device comprises a transceiver and at least one processor controlling the transceiver, wherein the at least one processor is configured to control the transceiver; so as to provide V2V communication through at least one communication link between the transport vehicle device of the transport vehicle and the transport vehicle device of other transport vehicles in the plurality of transport vehicles, wherein the communication link is provided by direct radio link, or through the communication of the mobile radio network; wherein the method uses the root certificate associated with the transport vehicle sending the BSM to perform digital signature to each BSM; the root certificate is used for protecting the transmission of the BSM on the communication link; and at least one area root certificate issuing mechanism in a plurality of area root certificate issuing mechanism; the area root certificate issuing mechanism determines whether the identity of the root certificate associated with the corresponding transport vehicle in the plurality of transport vehicles is legal for the jurisdiction of at least one area issuing mechanism.\n | 2. The system according to claim 1, wherein at least two of the plurality of region awarding mechanisms use a certificate trust list that includes at least one common root certificate in each of at least two of a plurality of respective jurisdictions identified as a legitimate root certificate.\n | 3. The system according to claim 2, wherein each region awarding mechanism is configured for modifying a certificate trust list listing legal certificates in their jurisdictions using an elector-based root management.\n | 4. The system according to claim 3, wherein the voter-based root management is performed using a ballot having a endorsement, wherein the majority of the voters identified by the area awarding mechanism of adding or deleting the root certificate in the certificate trust list are sought to sign the ballot; so as to carry out endorsement or revocation to the root certificate.\n | 5. The system according to claim 1, wherein the plurality of region awarding mechanisms are associated with respective jurisdictions that share a common border.\n | 6. The system according to claim 1, wherein each of the root certificate issuing mechanisms issues a digital certificate including a root certificate, wherein the digital certificate certiates the ownership of the public key through the named body of the digital certificate.\n | 7. The system according to claim 1, wherein each of the BSM comprises data specific to a transport vehicle, and the data includes a time, a position, a speed, and a forward direction of a transport vehicle to which the BSM is transmitted.\n | 8. The system according to claim 1, wherein the digital signature of the BSM is used as an authentication of the correctness and reliability of the data contained in the BSM.\n | 9. 
The system according to claim 1, wherein a digital signature of the BSM is analyzed by the transport vehicle safety application, prior to data prior to accessing the BSM by one or more transport vehicles safety on a transport vehicle in the plurality of transport vehicles that receive the BSM through V2V communication.\n | 10. The system according to claim 9, wherein the one or more transport vehicle safety applications are autonomous or auxiliary driving applications.\n | 11. The system according to claim 1, wherein a digital signature of the BSM is analyzed by the transport vehicle safety application, prior to receiving data of the BSM by one or more of the transport vehicles safety on the transport vehicle in the plurality of transport vehicles that are received through the V2V communication. .\n | 12. The system according to claim 11, wherein the one or more transport vehicle safety applications are autonomous driving or auxiliary driving applications.\n | 13. A method for managing trust of a plurality of root certificate issuing mechanisms, which is used for using in the form of continuous broadcast of the basic safety message (BSM) between at least two of the plurality of transport vehicles and the vehicle (V2V) communication, the method comprising: controlling the transmission of the V2V communication from the transport vehicle device located on the transport vehicle of the plurality of transport vehicles; the transport vehicle device comprises a transceiver and at least one processor controlling the transceiver, wherein the at least one processor controls the transceiver; so as to provide V2V communication through at least one communication link between the transport vehicle device of the transport vehicle and the transport vehicle device of other transport vehicles in the plurality of transport vehicles, wherein the communication link is provided by direct radio link, or through the communication of the mobile radio network; wherein the method uses the root certificate associated with the transport vehicle sending the BSM to perform digital signature to each BSM; the root certificate is used for protecting the transmission of the BSM on the communication link; wherein at least one of the plurality of area root certificate issuing mechanism determines whether the identity of the root certificate associated with the corresponding transport vehicle in the plurality of transport vehicles is legal to the jurisdiction of the at least one area issuing authority.\n | 14. The method according to claim 13, wherein at least two of the plurality of region awarding mechanisms use a certificate trust list that includes at least one common root certificate in each of at least two of a plurality of respective jurisdictions identified as a legitimate root certificate.\n | 15. The method according to claim 14, wherein each region awarding mechanism uses the root management based on the electorate to modify the certificate trust list listing the legal certificate in the jurisdiction.\n | 16. The method according to claim 15, wherein the voter-based root management is performed using a ballot having a endorsement, wherein the majority of the voters identified by the area awarding mechanism of the root certificate are sought to be added or deleted in the certificate trust list. to sign the vote, so as to carry out endorsement or revocation to the root certificate.\n | 17. 
The method according to claim 13, wherein the plurality of region awarding mechanisms are associated with respective jurisdictions that share a common border.\n | 18. The method according to claim 13, wherein each of the root certificate issuing mechanisms issues a digital certificate including a root certificate, wherein the digital certificate certiates the ownership of the public key through the named body of the digital certificate.\n | 19. The method according to claim 13, wherein each of the BSM comprises data specific to a transport vehicle, and the data includes a time, a position, a speed, and a forward direction of a transport vehicle transmitting the BSM.\n | 20. The method according to claim 13, wherein the digital signature of the BSM is used as an authentication of the correctness and reliability of the data contained in the BSM.\n | 21. The method according to claim 13, wherein the digital signature of the BSM is analyzed by the transport vehicle safety application, before one or more transport vehicles safety the transport vehicle of the plurality of transport vehicles receiving the BSM through the V2V communication are used to access the data of the BSM. .\n | 22. The method according to claim 21, wherein the one or more transport vehicle safety applications are autonomous or auxiliary driving applications.\n | 23. The method according to claim 13, wherein prior to receiving data of the BSM, one or more transport vehicles safety on a transport vehicle in the plurality of transport vehicles through V2V communication receiving the BSM; A digital signature of the BSM is analyzed by the transport vehicle safety\n | 24. The method according to claim 23, wherein the one or more transport vehicle safety applications are autonomous driving or auxiliary driving applications.\n | 25. 
A non-transitory computer readable medium, comprising a computer program with computer software code instruction, when the at least one computer processor to realize the code instruction computer software the computer software code instruction executes a method for managing trust of a plurality of root certificate issuing mechanism, which is used for using in the form of continuous broadcast of basic safety message (BSM) between at least two vehicles in a plurality of transport vehicles and vehicle (V2V) communication; the method comprises: controlling the transmission of the V2V communication from the transport vehicle device located on the transport vehicle of the plurality of transport vehicles; the transport vehicle device comprises a transceiver and at least one processor controlling the transceiver, wherein the at least one processor controls the transceiver; so as to provide V2V communication through at least one communication link between the transport vehicle device of the transport vehicle and the transport vehicle device of other transport vehicles in the plurality of transport vehicles, wherein the communication link is provided by direct radio link, or through the communication of the mobile radio network; wherein the method uses the root certificate associated with the transport vehicle sending the BSM to perform digital signature to each BSM; the root certificate is used for protecting the transmission of the BSM on the communication link; wherein at least one of the plurality of area root certificate issuing mechanism determines whether the identity of the root certificate associated with the corresponding transport vehicle in the plurality of transport vehicles is legal to the jurisdiction of the at least one area issuing authority.\n | 26. The non-transitory computer-readable medium according to claim 25, wherein at least two of the plurality of area issuing mechanisms use a certificate trust list; The certificate trust list includes at least one common root certificate that is identified as a legal root certificate in each of at least two of a plurality of corresponding jurisdictions.\n | 27. The non-transitory computer-readable medium according to claim 26, wherein each region awarding mechanism modifies a certificate trust list listing legal certificates in their jurisdictions using an elector-based root management.\n | 28. The non-transitory computer-readable medium according to claim 27, wherein the voter-based root management is performed using a ballot having a endorsement, wherein the voter-based root management is performed by using a rewritten vote. seeking to add or delete the area issuing mechanism of the root certificate in the certificate trust list, inquiring the majority of the voters identified by the area issuing mechanism, to sign the vote, so as to carry out endorsement or revocation to the root certificate.\n | 29. The non-transitory computer-readable medium according to claim 25, wherein the plurality of region awarding mechanisms are associated with respective jurisdictions that share a common border.\n | 30. The non-transitory computer-readable medium according to claim 25, wherein the root certificate authority issues a digital certificate including a root certificate, wherein the digital certificate certiates the ownership of the public key through the named body of the digital certificate.\n | 31. 
The non-transitory computer-readable medium according to claim 25, wherein the BSM includes data specific to a transport vehicle, and the data includes a time, a position, a speed, and a forward direction of a transport vehicle transmitting the BSM.\n | 32. The non-transitory computer-readable medium according to claim 25, wherein the digital signature of the BSM is used as an authentication of the correctness and reliability of the data contained in the BSM.\n | 33. The non-transitory computer-readable medium according to claim 25, wherein, prior to the data of the BSM being accessed by one or more transport vehicle safety applications on the transport vehicle of the plurality of transport vehicles receiving the BSM through the V2V communication, a digital signature of the BSM is analyzed by the one or more transport vehicle safety applications.\n | 34. The non-transitory computer-readable medium according to claim 25, wherein the one or more transport vehicle safety applications are autonomous or auxiliary driving applications.\n | 35. The non-transitory computer-readable medium according to claim 25, wherein, prior to data of the BSM being accessed by one or more transport vehicle safety applications on a transport vehicle in the plurality of transport vehicles receiving the BSM through the V2V communication, a digital signature of the BSM is analyzed by the one or more transport vehicle safety applications.\n | 36. The non-transitory computer-readable medium according to claim 25, wherein the one or more transport vehicle safety applications are autonomous driving or auxiliary driving applications.", "output": "The system has a transportation vehicle equipment that is located on a transportation vehicle of multiple transportation vehicles. The transportation vehicle equipment includes a transceiver and a processor controlling the transceiver. The processor is configured to control the transceiver to provide V2V communication through a communication link between the transportation vehicle equipment of the transportation vehicle and transportation vehicle equipment of other transportation vehicles of the multiple transportation vehicles. The communication link is provided through either a direct radio link or communication of a mobile radio network. A regional root CA of multiple regional root CAs dictates whether identities that associate root certificates with respective transportation vehicles of multiple transportation vehicles are legitimate for the jurisdiction of the regional authority. INDEPENDENT CLAIMS are included for the following:a method for managing trust across multiple root CA in V2V communication; anda non-transitory computer readable medium storing program for managing trust across multiple root CA in V2V communication. System for managing trust across multiple root certificate authorities (CA) in vehicle-to-vehicle (V2V) communication between transportation vehicles in the form of continuous broadcast of basic safety message (BSM). The system manages trust across multiple root CA using both electors and regional certificate trust lists (CTL) in an inventive way. The drawing shows the schematic diagram of the system for managing trust across multiple root CA in V2V communication. 
110,120Jurisdictions115Credential management system117,127CTL125Security credential management system manager", + "instruction": "Please summarize the input" + }, + { + "input": "Method for planning the track of the vehicleA method for operating a navigation system with the method the destination of the autonomous vehicle of the driver guide to a desired path along a selected route, comprising: a step of obtaining information about the vehicle in the area around the autonomous vehicle, step to determine the trajectory of the vehicle based on the obtained information, and the vehicle of the track with the selected route path to the step of comparing. If it is determined that one vehicle in the vehicle once travel along the selected route path, currently travelling or will be travelling along the selected route path along the selected route path, then generates and outputs to the autonomous vehicle of the driver of the following instruction of the one vehicle.|1. A method for operating a navigation system with the method the destination for the autonomous vehicle of the driver guide to a desired path along a selected route, the method comprising: the information of the vehicle obtained in the region around the autonomous vehicle; based on the obtained information to determine the track of the vehicle around the autonomous vehicle of the; comparing the trajectory of the vehicle around the autonomous vehicle with the selected route path, and if it is determined that one vehicle in the vehicle around the autonomous vehicle: -been traveling along the selected route path, - currently running along the selected route path, or-will travel along the selected route path, then generates and outputs to the autonomous vehicle of the driver of the following instruction of the one vehicle.\n | 2. The method according to claim 1, wherein the step of obtaining information about the other vehicle comprises: using sensor data generated by at least one sensor for detecting at least one position in the vehicle around the autonomous vehicle, speed, heading, steering signal and/or lane allocation, and based on at least one of in the vehicle of the autonomous vehicle around the at least one position of at least one of the speed, the heading, the turn signal and/or the lane in the assignment to determine the trajectory of the vehicle around the autonomous vehicle.\n | 3. The method according to claim 1 or 2, wherein said step of obtaining the information about the vehicle around the autonomous vehicle comprises detecting, from the vehicle of the autonomous vehicle at the periphery of at least one of color and/or brand and/or manufacturer and and/or a turn signal and/or at least one of the type, and generating the instruction, said instruction comprises the detection of the vehicle in the autonomous vehicle to be followed to the color, the brand, the company, at least one of the turn signal and the type.\n | 4. The process according to any one of said claims, wherein said step of obtaining the information about the vehicle around the autonomous vehicle includes using the vehicle-to-vehicle (V2V) interface in the step of data receiving to the autonomous vehicle from the vehicle.\n | 5. The process according to any one of said claims, wherein using an audible signal and/or optical signal outputs said instruction to the driver.\n | 6. 
The process according to any one of said claims, further comprising the step of transmitting the obtained information about the vehicles around the autonomous vehicle to the server, wherein the server performs the step of determining the track of the vehicle around the autonomous vehicle.\n | 7. The method according to claim 6, further comprising the step of transmitting to the server the selected route path, wherein the server performs the step of comparing the trajectory of the vehicle around the autonomous vehicle with the selected route path.\n | 8. A program implementing the method according to at least one of claims 1 to 7, wherein the program is executed on a computer.\n | 9. The program according to claim 8, wherein the program is stored on a non-transitory computer readable medium accessible by the server.\n | 10. A navigation system (1) for an autonomous vehicle, the system comprising: a routing unit (11) for selecting a route path of the autonomous vehicle to a destination; a sensor unit (12) for obtaining sensor data of the vehicles in the region around the autonomous vehicle from a plurality of sensors; a processing unit (13) for determining the track of the vehicle around the autonomous vehicle and for comparing the determined track with the selected route path; an instruction unit (14) for generating instructions for following the route path, wherein, if the track of a vehicle around the autonomous vehicle matches the selected route path, the instruction unit is configured to generate an instruction to follow the one vehicle; and an output unit (15) for outputting said instruction to the driver of the autonomous vehicle to follow the one vehicle.\n | 11. The system (1) according to claim 10, wherein the plurality of sensors comprises at least one of a camera, a radar, a laser radar, an inertial measurement unit (IMU) and a GNSS receiver, said GNSS receiver being used for receiving position coordinates of the autonomous vehicle from a global navigation satellite system (GNSS).\n | 12. The system (1) according to claim 10 or 11, wherein the output unit (15) includes a head-up display.\n | 13. The system (1) according to any one of claims 10 to 12, further comprising a receiving unit (16) for receiving, from other vehicles, information about the vehicles in the area around the autonomous vehicle, wherein the processing unit is configured to determine the trajectory based on the received information and/or said instruction unit is configured to generate the instruction based on the received information.\n | 14. The system (1) according to any one of claims 10 to 13, further comprising a server, the server being configured to communicate and exchange data with at least one of the routing unit (11), the sensor unit (12), the processing unit (13), said instruction unit (14) and the output unit (15).\n | 15. The system (1) according to any one of claims 10 to 14, wherein the processing unit (13) is located at the server.", "output": "The method involves obtaining information about vehicles in a region around the ego vehicle. Determine trajectories of the vehicles around the ego vehicle based on the obtained information. Compare the trajectories of the vehicles around the ego vehicle with the selected route path of the ego vehicle. 
If it is determined that one of the vehicles around the ego vehicle was driving, is currently driving, or will be driving along the selected route path, then generate and output an instruction to the driver of the ego vehicle to follow the one vehicle. Detect one of a position, a velocity, a heading, a turn signal and a lane assignment of one of the vehicles around the ego vehicle using sensor data generated by one sensor. INDEPENDENT CLAIMS are included for the following:a program implementing a method; anda navigation system for an ego vehicle. Method for operating a navigation system for guiding a driver of an ego vehicle to a desired destination along a selected route path. The server does not need to be a single centrally managed piece of hardware but may be implemented as a cloud computing network with the advantage of redundant components and simplified maintenance. The drawing shows a block representation of a navigation system. 11Routing unit12Sensor unit13Processing unit14Instruction unit16Reception unit", "instruction": "Please summarize the input" }, { "input": "Dynamically placing an internet protocol anchor point based on a user device and/or an applicationA device determines whether an application, utilized by a user device and associated with a network, is a low latency application or a best effort application. The device designates a first network device or a second network device as a designated network device to be an IP anchor point for the application based on a set of rules. The first network device is designated as the designated network device when the application is the low latency application, or the second network device is designated as the designated network device when the application is the best effort application. The device provides, to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application, and provides, to the network, information instructing the network to utilize the designated network device as the IP anchor point for the application.What is claimed is: \n | 1. 
A device, comprising: \none or more memories; and \none or more processors, communicatively coupled to the one or more memories, to: \nreceive information indicating that a user device is utilizing an application associated with a network; \ndetermine whether the application is a low latency application or a best effort application; \ndesignate a first network device of the network or a second network device of the network as a designated network device to be an Internet protocol (IP) anchor point for the application based on a set of rules and based on determining whether the application is the low latency application or the best effort application, \nwherein the first network device is designated as the designated network device to be the IP anchor point for the application when the application is the low latency application, or \nwherein the second network device is designated as the designated network device to be the IP anchor point for the application when the application is the best effort application, \nwherein the set of rules includes two one or more of: \na rule indicating that the IP anchor point is to be as close to the user device as possible, \na rule indicating that the IP anchor point is to include a threshold amount of processing resources and memory resources, \na rule indicating that the IP anchor point is to be associated with a serving base station, \na rule indicating a timing advance distance between the IP anchor point and the user device, or \na rule indicating an operational pathloss between the IP anchor point and the user device, and \nthe one or more processors, when designating the first network device as the designated network device, are to: \napply a weight to each rule, of the set of rules, to generate a weighted set of rules, \ndetermine scores for a plurality of first network devices based on the weighted set of rules, and \nselect the first network device, from the plurality of first network devices, based on a score for the first network device being greater than scores associated with one or more other network devices from the scores for the plurality of first network devices; \nprovide, to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application; and \nprovide, to the network, information instructing the network to utilize the designated network device as the IP anchor point for the application to permit the user device to utilize the designated network device as the IP anchor point for the application. \n | 2. The device of claim 1, wherein the one or more processors are further to: \nprovide, to the network, information instructing the network to utilize a third network device of the network as a control plane anchor point for the application of the user device. \n | 3. The device of claim 1, wherein: \nthe first network device is a user plane function (UPF) device provided at an edge of the network, and \nthe second network device is a UPF device provided at a central location of the network. \n | 4. The device of claim 1, wherein: \nthe low latency application includes one or more of: \nan autonomous driving application, \na real-time vehicle-to-vehicle (V2V) communication application, or \nan application that delivers video; and \nthe best effort application includes one or more of: \nan application to enable a web download, or \nan application to access the Internet. \n | 5. 
The device of claim 1, wherein the one or more processors are further to: \nreceive, from the user device, information indicating that a user stopped utilizing the application; and \nprovide, to the network, information instructing the network to stop utilizing the designated network device as the IP anchor point for the application. \n | 6. A non-transitory computer-readable medium storing instructions, the instructions comprising: \none or more instructions that, when executed by one or more processors of a device, cause the one or more processors to: \nreceive information indicating that a user device is utilizing an application associated with a network; \ndetermine whether the application is a low latency application or a best effort application; \ndesignate a first network device of the network or a second network device of the network as a designated network device to be an Internet protocol (IP) anchor point for the application based on a set of rules and based on determining whether the application is the low latency application or the best effort application, \nthe first network device being designated as the designated network device to be the IP anchor point for the application when the application is the low latency application, or \nthe second network device being designated as the designated network device to be the IP anchor point for the application when the application is the best effort application, \nwherein the set of rules includes two one or more of: \na rule indicating that the IP anchor point is to be as close to the user device as possible, \na rule indicating that the IP anchor point is to include a threshold amount of processing resources and memory resources, \na rule indicating that the IP anchor point is to be associated with a serving base station, \na rule indicating a timing advance distance between the IP anchor point and the user device, or \na rule indicating an operational pathloss between the IP anchor point and the user device, and \nthe one or more instructions, that cause the one or more processors to designate the first network device as the designated network device, cause the one or more processors to: \napply a weight to each rule, of the set of rules, to generate a weighted set of rules, \ndetermine scores for a plurality of first network devices based on the weighted set of rules, and \nselect the first network device, from the plurality of first network devices, based on a score for the first network device being greater than scores associated with one or more other network devices from the scores for the plurality of first network devices; \nprovide, to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application; and \nprovide, to the network, information instructing the network to utilize the designated network device as the IP anchor point for the application to permit the user device to utilize the designated network device as the IP anchor point for the application. \n | 7. The non-transitory computer-readable medium of claim 6, wherein the instructions further comprise: \none or more instructions that, when executed by the one or more processors, cause the one or more processors to: \nprovide, to the network, information instructing the network to utilize a third network device of the network as a control plane anchor point for the application of the user device. \n | 8. 
The non-transitory computer-readable medium of claim 6, wherein: \nthe first network device is a user plane function (UPF) device provided at an edge of the network, and \nthe second network device is a UPF device provided at a central location of the network. \n | 9. The non-transitory computer-readable medium of claim 6, wherein: \nthe low latency application includes one or more of: \nan autonomous driving application, \na real-time vehicle-to-vehicle (V2V) communication application, or \nan application that delivers video; and \nthe best effort application includes one or more of: \nan application to enable a web download, or \nan application to access the Internet. \n | 10. The non-transitory computer-readable medium of claim 6, wherein the instructions further comprise: \none or more instructions that, when executed by the one or more processors, cause the one or more processors to: \nreceive, from the user device, information indicating that a user stopped utilizing the application; and \nprovide, to the network, information instructing the network to stop utilizing the designated network device as the IP anchor point for the application. \n | 11. A method, comprising: \nreceiving, by a device, information indicating that a user device is utilizing an application associated with a network; \ndetermining, by the device, whether the application is a low latency application or a best effort application; \ndesignating, by the device, a first network device of the network or a second network device of the network as a designated network device to be an Internet protocol (IP) anchor point for the application based on a set of rules and based on determining whether the application is the low latency application or the best effort application, \nthe first network device being designated as the designated network device to be the IP anchor point for the application when the application is the low latency application, or \nthe second network device being designated as the designated network device to be the IP anchor point for the application when the application is the best effort application, \nwherein the set of rules includes two one or more of: \na rule indicating that the IP anchor point is to be as close to the user device as possible, \na rule indicating that the IP anchor point is to include a threshold amount of processing resources and memory resources, \na rule indicating that the IP anchor point is to be associated with a serving base station, \na rule indicating a timing advance distance between the IP anchor point and the user device, or \na rule indicating an operational pathloss between the IP anchor point and the user device, and \ndesignating the first network device as the designated network device comprising: \napplying a weight to each rule, of the set of rules, to generate a weighted set of rules, \ndetermining scores for a plurality of first network devices based on the weighted set of rules, and \nselecting the first network device, from the plurality of first network devices, based on a score for the first network device being greater than scores associated with one or more other network devices from the scores for the plurality of first network devices; \nproviding, by the device and to the user device, information informing the user device that the designated network device is to be the IP anchor point for the application; and \nproviding, by the device and to the network, information instructing the network to utilize the designated network device as the IP anchor 
point for the application to permit the user device to utilize the designated network device as the IP anchor point for the application. \n | 12. The method of claim 11, further comprising: \nproviding, to the network, information instructing the network to utilize a third network device of the network as a control plane anchor point for the application of the user device. \n | 13. The method of claim 11, wherein: \nthe first network device is a user plane function (UPF) device provided at an edge of the network, and \nthe second network device is a UPF device provided at a central location of the network. \n | 14. The method of claim 11, wherein: \nthe low latency application includes one or more of: \nan autonomous driving application, \na real-time vehicle-to-vehicle (V2V) communication application, or \nan application that delivers video; and \nthe best effort application includes one or more of: \nan application to enable a web download, or \nan application to access the Internet. \n | 15. The method of claim 11, further comprising: \nreceiving, from the user device, information indicating that the user stopped utilizing the application; and \nproviding, to the network, information instructing the network to stop utilizing the designated network device as the IP anchor point for the application. \n | 16. The device of claim 1, wherein the one or more processors, when determining whether the application is the low latency application or the best effort application, are to: \ndetermine that the application is the low latency application based on a maximum allowed latency for the application. \n | 17. The device of claim 1, wherein the one or more processors, when applying the weight to each rule of the set of rules, are to: \napply different weights to different rules based on one or more of: \ninformation associated with the user device, \ninformation associated with the application, or \ninformation associated with the network. \n | 18. The non-transitory computer-readable medium of claim 6, wherein the one or more instructions, that cause the one or more processors to determine whether the application is the low latency application or the best effort application, cause the one or more processors to: \ndetermine that the application is the low latency application based on a maximum allowed latency for the application. \n | 19. The non-transitory computer-readable medium of claim 6, wherein the one or more instructions, that cause the one or more processors to apply the weight to each rule of the set of rules, cause the one or more processors to: \napply different weights to different rules based on one or more of: \ninformation associated with the user device, \ninformation associated with the application, or \ninformation associated with the network. \n | 20. The method of claim 11, wherein determining whether the application is the low latency application or the best effort application comprises: \ndetermining that the application is the low latency application based on a maximum allowed latency for the application.", + "output": "The device has a memories and a processors is coupled to the memories to receive information (510) indicating that a user device is utilizing an application. The application is a low latency application or a best effort application is determined (520). 
A first network device of the network or a second network device of the network as a designated network device to be an internet protocol (IP) anchor point for the application is designated (530) based on determining whether the application is the low latency application or the best effort application. The information informing the user device that the designated network device is to be the IP anchor point is provided (540) to the user device for the application. The information instructing the network to utilize the designated network device as the IP anchor point is provided (550) to the network for the application which permits the user device to utilize the designated network device as the IP anchor point for the application. INDEPENDENT CLAIMS are included for the following:a non-transitory computer-readable medium storing instructions for dynamically placing an IP anchor point based on a user device and an application; anda method for dynamically placing an IP anchor point based on a user device and an application. Device such as user equipment, mobile phone e.g. smart phone and radiotelephone, laptop computer, tablet computer, desktop computer, handheld computer, gaming device, wearable communication device e.g. smart watch and pair of smart glasses, mobile hotspot device, fixed wireless access device, or customer premises equipment. The anchor point platform can apply a greater weight to rule R2 than rules R3-R5 since rule R2 can ensure that the IP anchor point is an edge user plane function (UPF) with sufficient resources to handle the low latency application. The different stages of the process for dynamically placing an IP anchor point based on a user device and an application are automated, which can remove human subjectivity and waste from the process, and which can improve speed and efficiency of the process and conserve computing resources. The drawing shows a flow chart illustrating a process for dynamically placing an IP anchor point based on a user device and an application. 510Step for receiving information520Step for determining application530Step for designating a first or second network device540Step for providing information informing the user device550Step for providing information instructing the network", + "instruction": "Please summarize the input" + }, + { + "input": "Systems and methods for transforming high-definition geographical map data into messages for vehicle communicationsA device may receive three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle and may process the three-dimensional geographical map data, with a data model, to transform the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard. The device may generate a message based on the transformed geographical map data and may cause the message to be provided to the vehicle device. The device may perform one or more actions based on the message.What is claimed is: \n | 1. 
A method, comprising: \nreceiving, by a device, three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle; \nprocessing, by the device, the three-dimensional geographical map data, with a data model, to: \nanalyze different layers of the three-dimensional geographical map data, \nidentify a portion of the three-dimensional geographical map data based on analyzing the different layers of the three-dimensional geographical map data, and \ntransform the portion of the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard; \ngenerating, by the device, a message based on the transformed geographical map data; \ncausing, by the device, the message to be provided to the vehicle device; and \nperforming, by the device, one or more actions based on the message. \n | 2. The method of claim 1, wherein the particular standard includes a Society of Automotive Engineers J2735 standard.\n | 3. The method of claim 1, wherein causing the message to be provided to the vehicle device comprises one or more of: \ncausing the message to be provided to the vehicle device via a multi-access edge computing device associated with the vehicle device; \ncausing the message to be provided to the vehicle device via a registration representational state transfer application program interface; or \ncausing the message to be provided to the vehicle device via a cellular vehicle-to-everything message broker. \n | 4. The method of claim 1, wherein performing the one or more actions comprises: \nreceiving new three-dimensional geographical map data associated with the geographical region; \nupdating the message based on the new three-dimensional geographical map data and to generate an updated message; and \ncausing the updated message to be provided to the vehicle device. \n | 5. The method of claim 1, wherein performing the one or more actions comprises: \ngenerating an alert based on the message; and \nproviding the alert to the vehicle device. \n | 6. The method of claim 1, wherein performing the one or more actions comprises: \ndetermining a location of the vehicle based on the message; and \ncausing an emergency service to be dispatched to the location of the vehicle. \n | 7. The method of claim 1, wherein the transformed geographical map data includes map data identifying one or more of: \none or more traffic lanes associated with the geographical region; \none or more intersections associated with the geographical region; \none or more traffic signals associated with the geographical region; \none or more sidewalks associated with the geographical region; or \none or more pedestrian lanes associated with the geographical region. \n | 8. 
A device, comprising: \none or more processors configured to: \nreceive three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle; \nprocess the three-dimensional geographical map data, with a data model, to: \nanalyze different layers of the three-dimensional geographical map data, \nidentify a portion of the three-dimensional geographical map data based on analyzing the different layers of the three-dimensional geographical map data, and \ntransform the portion of the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard; \ngenerate a message based on the transformed geographical map data; \ncause the message to be provided to the vehicle device via a multi-access edge computing device associated with the vehicle device, via a registration representational state transfer application program interface, or via a cellular vehicle-to-everything message broker; and \nperform one or more actions based on the message. \n | 9. The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to: \nreceive feedback from the vehicle device based on the message; and \nretrain the data model based on the feedback. \n | 10. The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to one or more of: \ncause the vehicle device to determine an actual location of the vehicle based on the message; \ncause the vehicle device to provide an actual location of the vehicle to one or more other vehicles based on the message; \ncause the vehicle device to position the vehicle in a traffic lane based on the message; or \ncause traffic analytics for the geographical region to be generated based on the message. \n | 11. The device of claim 8, wherein the transformed geographical map data includes map data identifying one or more of: \none or more traffic lanes associated with the geographical region; \none or more intersections associated with the geographical region; \none or more traffic signals associated with the geographical region; \none or more sidewalks associated with the geographical region; or \none or more pedestrian lanes associated with the geographical region. \n | 12. The device of claim 8, wherein the vehicle includes one or more of: \nan autonomous robot, \na semi-autonomous vehicle, \nan autonomous vehicle, or \na non-autonomous vehicle. \n | 13. The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to: \ncalculate traffic analytics for the geographical region based on the message; and \nprovide the traffic analytics to an entity associated with managing traffic for the geographical region. \n | 14. The device of claim 8, wherein the one or more processors, to perform the one or more actions, are configured to: \nreceive new three-dimensional geographical map data associated with the geographical region; and \nupdate the message based on the new three-dimensional geographical map data. \n | 15. 
A non-transitory computer-readable medium storing a set of instructions, the set of instructions comprising: \none or more instructions that, when executed by one or more processors of a device, cause the device to: \nreceive three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle; \nprocess the three-dimensional geographical map data, with a data model, to: \nanalyze different layers of the three-dimensional geographical map data, \nidentify a portion of the three-dimensional geographical map data based on analyzing the different layers of the three-dimensional geographical map data, and \ntransform the portion of the three-dimensional geographical map data into transformed geographical map data with a format that corresponds to a particular standard; \ngenerate a message based on the transformed geographical map data; \ncause the message to be provided to the vehicle device; and \nperform one or more actions based on the message. \n | 16. The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \nreceive new three-dimensional geographical map data associated with the geographical region; \nupdate the message based on the new three-dimensional geographical map data and to generate an updated message; \ncause the updated message to be provided to the vehicle device; and \nperform one or more additional actions based on the updated message. \n | 17. The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \ngenerate an alert based on the message; and \nprovide the alert to a vehicle located in the geographical region. \n | 18. The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \ndetermine a location of a vehicle in the geographical region based on the message; and \ncause an emergency service to be dispatched to the location of the vehicle. \n | 19. The non-transitory computer-readable medium of claim 15, wherein the one or more instructions, that cause the device to perform the one or more actions, cause the device to: \ncalculate traffic analytics for the geographical region based on the message; and \nprovide the traffic analytics to an entity associated with managing traffic for the geographical region. \n | 20. The non-transitory computer-readable medium of claim 15, wherein the transformed geographical map data includes map data identifying one or more of: \none or more traffic lanes associated with the geographical region; \none or more intersections associated with the geographical region; \none or more traffic signals associated with the geographical region; \none or more sidewalks associated with the geographical region; or \none or more pedestrian lanes associated with the geographical region.", + "output": "The method involves receiving three-dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle (410). The 3D map data is processed with a data model to transform the map data (420) into transformed map data with a format that corresponds to a particular standard e.g. Society of Automotive Engineers J2735 standard. A message is generated based on the transformed data (430). 
The message is caused to be provided to the vehicle device (440). A set of actions is performed by the device based on message (450). INDEPENDENT CLAIMS are included for: (1) a device, comprising: one or more processors configured to: receive three dimensional geographical map data for a geographical region associated with a vehicle device of a vehicle\\; (2) a non-transitory computer-readable medium storing a set of instructions. Method for generating a message based on the transformed geographical map data. The method utilizes a vehicle-to-vehicle (V2V) communication system to allow a user to communicate with the vehicle efficiently, and allows the V2V communications system to enable the user to receive information from the vehicle and the vehicle to communicate information to the user in a reliable manner. The drawing shows a flow diagram of the method for generating a message based on the transformed geographical map data.410Receiving Three Dimensional Geographical Map Data for a Geographical Region 420Processing the Three Dimensional Geographical Map Data with a Data Model 430Generating a Message based on the transformed geographical map data 440Causing the message to be provided to the vehicle device 450Performing one or more actions based on the message ", + "instruction": "Please summarize the input" + }, + { + "input": "Autonomous vehicles and systems thereforAbstract Title: Seatless vehicle with mobile charging \n\nA seatless autonomous vehicle which has the means to receive data, a battery 6 to power the vehicle and a charger arranged to charge a battery of another electrically powered vehicle. The vehicle may have a display for sending messages to other road users, the display may be mounted on the front or rear of the vehicle. The vehicle may have means to monitor its surroundings and send the information to other vehicles and/or back to a central hub. The vehicle may be a part of a fleet of vehicles which can communicate with each other (V2V) and a central hub. The hub may be cloud based. The battery of the vehicle may have apices at the hubs of the wheels. |\nCLAIMS \n\n1. A seatless road vehicle having an autonomous mode of operation, the vehicle including a wireless receiver for receiving data, a battery arranged to power the vehicle and a charger arranged to charge a battery of another, electrically powered, vehicle. \n | \n\n2. A vehicle according to claim 1, including an electronic illuminated display for providing instructions or information to drivers of other vehicles. \n | \n\n3. A vehicle according to claim 2, wherein the illuminated display is arranged on a rear surface and/or a front surface of the vehicle. \n | \n\n4. A vehicle according to claim 1, 2 or 3, including monitoring means for monitoring traffic conditions, road conditions and/or environmental conditions, and a transmitter for transmitting information gathered by the monitoring means. \n | \n\n5. A vehicle according to any preceding claim, including four wheels, wherein, viewed in plan, the battery occupies at least 60% of a rectangle having apices at hubs of the wheels. \n | \n\n6, A system comprising a fleet of vehicles, each according to any preceding claim, wherein each vehicle is arranged to communicate data including at least a location of the vehicle to at least one other vehicle in the fleet. \n | \n\n7. A system according to claim 6, including a control station arranged to coordinate control of the vehicles. \n | \n\n8. 
A system according to claim 7, wherein the control station is cloud based.", + "output": "The seatless road vehicle comprises a wireless receiver for receiving data, where a battery (6) is arranged to power the vehicle, and a charger is arranged to charge a battery of another electrically powered vehicle. An electronic illuminated display is used for providing instructions or information to drivers of other vehicles. The illuminated display is arranged on a rear surface and a front surface of the vehicle. A monitoring unit is provided for monitoring traffic conditions, road conditions or environmental conditions. A transmitter is provided for transmitting information gathered by the monitoring unit. An INDEPENDENT CLAIM is included for a system comprises a fleet of vehicles. Seatless autonomous road vehicle for use with autonomous vehicle system (claimed). The seatless road vehicle has an autonomous mode of operation, and can rescue an electric vehicle with a flat battery, and electronic illuminated display provides instructions or information to drivers of other vehicles, and reduces the traffic congestion on highways. The drawing shows a schematic view of a battery and a motor of the vehicle.4Wheels 6Battery 8Motor ", + "instruction": "Please summarize the input" + }, + { + "input": "For example, the system and method for dynamic management and control of several WI-FI radio in the network of the moving object containing an autonomous vehicleA system for communication is provided, where the system comprises a context broker configured to gather context information for use in managing a plurality of radios, a Wi-Fi radio manager configured to manage the plurality of radio managers using the context information, and a plurality of radios, where each of the plurality of radio managers is configured to manage one of the plurality of radios for communication with another electronic device.|1. In a communication system, a context broker configured to collect context information to be used when managing multiple radios; The context information from the context broker is used; a Wi-Fi radio manager configured to manage a plurality of radio managers is provided; and a plurality of radio stations are provided. Each of the plurality of radio managers is configured to use the radio configuration information received from the Wi-Fi radio manager; to communicate with other electronic devices; and to manage each of the plurality of radio stations; and to provide a method for managing the radio communication system. The context information includes both the mobility information about the mobile vehicle associated with the context broker and the necessary items and/or requirements associated with the application or service; A communication system that manages a plurality of radio managers includes the configuration of at least one radio based on both the mobility information and the necessary items and/or the requirements associated with the application or the service.\n | 2. In the system described in claim 1, the context broker, the Wi-Fi radio manager, the plurality of radio managers, and the plurality of radio stations are located in the moving vehicle.\n | 3. In the system described in claim 2, the context information includes: the position of the mobile vehicle; the speed of the moving vehicle; the moving direction of the moving vehicle; the processing capability of the moving vehicle; and at least one of the resources for at least one vehicle outside the moving vehicle.\n | 4. 
In the system described in Claim 2, the context information includes infrastructure information about one or more infrastructures.\n | 5. In the system described in claim 4, the infrastructure information includes information about a neighbor access point (AP), a current path of the mobile vehicle, and one or more of the neighboring vehicles.\n | 6. In the system described in Claim 2, at least one of the plurality of radios is configured to connect an electronic device in the mobile vehicle to a network outside of the mobile vehicle.\n | 7. In the system described in Claim 2, the Wi-Fi radio manager is configured to power one of the plurality of radios on or off based on the service and application needs of the mobile vehicle.\n | 8. In the system described in Claim 2, the Wi-Fi radio manager is configured to determine whether or not to turn on a particular power supply that has been powered off because of a context trigger that requires the use of one or more Wi-Fi radios among the plurality of radio stations. The context trigger is based on context information from the inside of the moving vehicle, the neighborhood of the moving vehicle, one or more APs, or the cloud server.\n | 9. In the system described in Claim 8, when one of the radio stations is powered up, the specific radio is set, based on the context information about the particular radio, to a vehicle-to-infrastructure (V2I) connection mode, a vehicle-to-vehicle (V2V) connection mode, a V2I scanning mode, a V2V scanning mode, or an access point (AP) mode.\n | 10. In the system described in claim 9, the system is configured to use at least one threshold value to determine when the Wi-Fi radio manager changes the configuration of the particular radio.\n | 11. In the system described in Claim 8, respective weights are applied to the context information from each of the moving vehicle, the neighborhood of the moving vehicle, the one or more APs, and the cloud server.\n | 12. In the system described in Claim 1, at least a portion of the contextual information is received from the cloud server.\n | 13. In the system described in claim 1, the Wi-Fi radio manager is configured to turn on or power off a specific one of the two or more radio stations.\n | 14. A communication method comprising: a step of collecting, by a context broker, context information to be used when managing a plurality of radios; a step of determining, by a Wi-Fi radio manager configured to manage a plurality of radio managers and using the context information from the context broker, how one of the plurality of radios should be configured; a step of providing, by the Wi-Fi radio manager and based on the context information from the context broker, the configuration of the specific radio to a radio manager; and a step of configuring, using the radio configuration information received from the Wi-Fi radio manager, the particular radio for communication with other electronic devices. The context information includes both the mobility information about the mobile vehicle associated with the context broker and the necessary items and/or requirements associated with the application or service; the method for communication includes managing the plurality of radio managers, including the configuration of at least one radio based on both the mobility information and the necessary items and/or the requirements associated with the application or the service.\n | 15. 
In the method described in the claim 14, the configuration takes into consideration one or more of a signal power, a received signal strength indication (RSSI), an interference, a channel, and a frequency.\n | 16. In the method described in the claim 14, the context broker, the Wi-Fi radio manager, the plurality of radio managers, and the plurality of radio stations are provided in the moving vehicle.\n | 17. In the method described in the claim 16, the context information includes: the position of the moving vehicle; the speed of the moving vehicle; the moving direction of the moving vehicle; the processing capability of the moving vehicle; and one or more of the resources for at least one vehicle outside the moving vehicle.\n | 18. In the method described in the claim 16, the context information includes infrastructure information for one or more infrastructures.\n | 19. In the method described in the claim 16, the Wi-Fi radio manager is configured to determine whether or not a power source is to be turned on for a context trigger that requires the use of one or more Wi-Fi radios within the plurality of radio stations. The context trigger is based on context information from the mobile vehicle, one or more neighboring areas around the mobile vehicle, or a cloud server.\n | 20. In the method described in the claim 16, respective weights are applied to the context information from each of the moving vehicle, the neighborhood of the moving vehicle, the one or more APs, and the cloud servers.", "output": "The system has a context broker that is configured to gather context information for use in managing multiple radios. A WiFi radio manager is configured to manage multiple radio managers using the context information from the context broker. Each of multiple radio managers is configured to manage a respective one of multiple radios for communication with another electronic device using radio configuration information received from the WiFi radio manager. The context broker, the WiFi radio manager, multiple radio managers, and multiple radios are in a mobile vehicle (700). The context information comprises a location of the mobile vehicle, a speed of the mobile vehicle, a direction of travel of the mobile vehicle, processing capabilities of the mobile vehicle, and resources for a vehicle external to the mobile vehicle. An INDEPENDENT CLAIM is included for a method for dynamic management and control of WiFi radio in network of mobile vehicle. System for dynamic management and control of WiFi radio in network of mobile vehicle. Uses include but are not limited to bus, truck, boat, forklift, human-operated vehicle, autonomous and/or remote controlled vehicles, boat, ship, speedboat, tugboat, barges, submarine, drone, airplane, and satellite, used in port, harbor, airport, factory, plantation, and mine. The communication network allows a port operator to improve the coordination of the ship loading processes and increase the throughput of the harbor by gathering real-time information on the position, speed, fuel consumption and carbon dioxide emissions of the vehicles. The communication network can operate in multiple modalities comprising various fixed nodes and mobile nodes, provide connectivity in dead zones or zones with difficult access, and reduce the costs for maintenance and accessing the equipment for updating/upgrading. The overall cost consumption per distance, time and vehicle/fleet is reduced. 
The data from the vehicle is offloaded in a faster and/or cheaper manner and the overall quality experienced per application, service, or user is increased. The vehicle increases the data offloaded and reduces the costs or time of sending data over expensive or slow technologies. The time to first byte (TTFB) where the next available WiFi network is detected is reduced. The drawing shows a block diagram of the communication devices in a vehicle. 700Mobile vehicle702,704,710,712,714Communication devices", "instruction": "Please summarize the input" }, { "input": "Systems and methods for vehicular positioning based on wireless fingerprinting data in a network of moving things including, for example, autonomous vehiclesCommunication network architectures, systems and methods for supporting a network of mobile nodes. As a non-limiting example, various aspects of this disclosure provide communication network architectures, systems, and methods for supporting a dynamically configurable communication network comprising a complex array of both static and moving communication nodes (e.g., the Internet of moving things). For example, systems and methods for vehicular positioning based on wireless fingerprinting data in a network of moving things including, for example, autonomous vehicles.What is claimed is: \n | 1. A method of vehicular positioning of nodes of a radio frequency (RF) wireless network comprising a plurality of nodes disposed at respective fixed locations and a plurality of mobile nodes that reside in respective vehicles that move within a service area of the wireless network, and wherein each node of the plurality of nodes comprises one or more communication interfaces configured for scanning an RF wireless environment of the respective node, the method comprising: \nperiodically receiving respective wireless fingerprint sample data generated by each mobile node of the plurality of mobile nodes, the wireless fingerprint sample data comprising data elements characterizing RF signals received by the mobile node from RF signal sources during scanning of the RF wireless environment of the mobile node and a corresponding geographic location within the service area at which the RF signals were received; \nforming a collection of the wireless fingerprint sample data received from the plurality of mobile nodes; \nreceiving a request for an estimated geographic location of a particular mobile node of the plurality of mobile nodes; \nsearching the collection using a wireless snapshot comprising data elements characterizing RF signals received in a current RF wireless environment of the particular mobile node, to identify wireless fingerprint samples of the collection that match the data elements of the wireless snapshot; and \ncalculating an estimated location of the particular mobile node using the identified wireless fingerprint sample data. \n | 2. The method according to claim 1, wherein each mobile node of the plurality of mobile nodes comprises a wireless access point configured to provide wireless Internet access to end-user devices.\n | 3. The method according to claim 1, wherein each node of the plurality of nodes periodically wirelessly broadcasts its current geographic location to other nodes of the network.\n | 4. The method according to claim 1, wherein the scanning of RF signals within the service area of the wireless network is without regard to a route of travel of a vehicle in which the mobile node resides.\n | 5. 
The method according to claim 1, the method further comprising: \nadding the wireless snapshot and a respective estimated location of the particular mobile node to the collection as a wireless fingerprint sample, if the search fails to identify at least one wireless fingerprint sample that matches the wireless snapshot. \n | 6. The method according to claim 1, wherein the collection is indexed according to one or more of the data elements of each wireless fingerprint sample that characterize a signal source.\n | 7. The method according to claim 1, wherein the one or more communication interfaces are configured to scan and characterize RF signal sources comprising an RF signal of an IEEE 802.11p compliant vehicle to vehicle wireless communication standard and an RF signal compliant with a commercial cellular communication standard.\n | 8. A non-transitory computer-readable medium on which is stored instructions executable by one or more processors, the executable instructions causing the one or more processors to perform a method of vehicular positioning of nodes of a radio frequency (RF) wireless network comprising a plurality of nodes disposed at respective fixed locations and a plurality of mobile nodes that reside in respective vehicles that move within a service area of the wireless network, and wherein each node of the plurality of nodes comprises one or more communication interfaces configured for scanning an RF wireless environment of the respective node, the method comprising: \nperiodically receiving respective wireless fingerprint sample data generated by each mobile node of the plurality of mobile nodes, the wireless fingerprint sample data comprising data elements characterizing RF signals received by the mobile node from RF signal sources during scanning of the RF wireless environment of the mobile node and a corresponding geographic location within the service area at which the RF signals were received; \nforming a collection of the wireless fingerprint sample data received from the plurality of mobile nodes; \nreceiving a request for an estimated geographic location of a particular mobile node of the plurality of mobile nodes; \nsearching the collection using a wireless snapshot comprising data elements characterizing RF signals received in a current RF wireless environment of the particular mobile node, to identify wireless fingerprint samples of the collection that match the data elements of the wireless snapshot; and \ncalculating an estimated location of the particular mobile node using the identified wireless fingerprint sample data. \n | 9. The non-transitory computer-readable medium according to claim 8, wherein each mobile node of the plurality of mobile nodes comprises a wireless access point configured to provide wireless Internet access to end-user devices.\n | 10. The non-transitory computer-readable medium according to claim 8, wherein each node of the plurality of nodes periodically wirelessly broadcasts its current geographic location to other nodes of the network.\n | 11. The non-transitory computer-readable medium according to claim 8, wherein the scanning of RF signals within the service area of the wireless network is without regard to a route of travel of a vehicle in which the mobile node resides.\n | 12. 
The non-transitory computer-readable medium according to claim 8, the method further comprising: \nadding the wireless snapshot and a respective estimated location of the particular mobile node to the collection as a wireless fingerprint sample, if the search fails to identify at least one wireless fingerprint sample that matches the wireless snapshot. \n | 13. The non-transitory computer-readable medium according to claim 8, wherein the collection is indexed according to one or more of the data elements of each wireless fingerprint sample that characterize a signal source.\n | 14. The non-transitory computer-readable medium according to claim 8, wherein the one or more communication interfaces are configured to scan and characterize RF signal sources comprising an RF signal of an IEEE 802.11p compliant vehicle to vehicle wireless communication standard and an RF signal compliant with a commercial cellular communication standard.\n | 15. A system for vehicular positioning of nodes of a radio frequency (RF) wireless network comprising a plurality of nodes disposed at respective fixed locations and a plurality of mobile nodes that reside in respective vehicles that move within a service area of the wireless network, and wherein each node of the plurality of nodes comprises one or more communication interfaces configured for scanning an RF wireless environment of the respective node, the system comprising: \none or more processors operably coupled to storage and communicatively coupled to the plurality of nodes, the one or more processors operable to, at least: \nperiodically receive respective wireless fingerprint sample data generated by each mobile node of the plurality of mobile nodes, the wireless fingerprint sample data comprising data elements characterizing RF signals received by the mobile node from RF signal sources during scanning of the RF wireless environment of the mobile node and a corresponding geographic location within the service area at which the RF signals were received; \nform a collection of the wireless fingerprint sample data received from the plurality of mobile nodes; \nreceive a request for an estimated geographic location of a particular mobile node of the plurality of mobile nodes; \nsearch the collection using a wireless snapshot comprising data elements characterizing RF signals received in a current RF wireless environment of the particular mobile node, to identify wireless fingerprint samples of the collection that match the data elements of the wireless snapshot; and \ncalculate an estimated location of the particular mobile node using the identified wireless fingerprint sample data. \n | 16. The system according to claim 15, wherein each mobile node of the plurality of mobile nodes comprises a wireless access point configured to provide wireless Internet access to end-user devices.\n | 17. The system according to claim 15, wherein each node of the plurality of nodes periodically wirelessly broadcasts its current geographic location to other nodes of the network.\n | 18. The system according to claim 15, wherein the scanning of RF signals within the service area of the wireless network is without regard to a route of travel of a vehicle in which the mobile node resides.\n | 19. 
The system according to claim 15, wherein the one or more processors are further operable to: \nadd the wireless snapshot and a respective estimated location of the particular mobile node to the collection as a wireless fingerprint sample, if the search fails to identify at least one wireless fingerprint sample that matches the wireless snapshot. \n | 20. The system according to claim 15, wherein the collection is indexed according to one or more of the data elements of each wireless fingerprint sample that characterize a signal source.\n | 21. The system according to claim 15, wherein the one or more communication interfaces are configured to scan and characterize RF signal sources comprising an RF signal of an IEEE 802.11p compliant vehicle to vehicle wireless communication standard and an RF signal compliant with a commercial cellular communication standard.", + "output": "The method involves receiving a request for an estimated geographic location of a particular mobile node of a set of mobile nodes. Collection of wireless fingerprint sample data is searched using a wireless snapshot that comprises data elements characterizing radio frequency (RF) signals received in a current RF wireless environment of the mobile node, to identify wireless fingerprint samples of the collection that match data elements of the wireless snapshot. An estimated location of the particular mobile node is calculated using identified wireless fingerprint sample data. INDEPENDENT CLAIMS are also included for the following:a non-transitory computer-readable medium comprising a set of instructions for vehicular positioning of nodes of an RF wireless networka system for vehicular positioning of nodes of an RF wireless network. Method for vehicular positioning of nodes i.e. internet of things nodes, of an RF wireless network e.g. city-wide vehicular network, shipping port-sized vehicular network and campus-wide vehicular network, associated with vehicles. Uses include but are not limited to a smartphone, tablet, smart watch, laptop computer, webcam, personal gaming device, personal navigation device, personal media device, personal camera and a health-monitoring device associated with automobiles, buses, lorries, boats, forklifts, human-operated vehicles and autonomous and/or remote controlled vehicles. The method enables the platform to be flexibly optimized at design/installation time and/or in real-time for different purposes so as to reduce latency, increase throughput, reduce power consumption and increase reliability with regard to failures based on the content, service or data. The method enables utilizing multiple connections or pathways that exist between distinct sub-systems or elements within the same sub-system to increase robustness and/or load-balancing of the network. The method enables gathering real-time information on position, speed, fuel consumption and carbon dioxide emissions of the vehicles, so that the communication network allows a port operator to improve the coordination of ship loading processes, increase throughput of the harbor and enhance performance of the positioning systems. The communication interfaces scan and characterize the RF signal sources with IEEE 802.11p compliant RF signals. The drawing shows a schematic block diagram of a communication network. 
400Communication network", + "instruction": "Please summarize the input" + }, + { + "input": "System and method for telematics for tracking equipment usageSystems and methods are described for tracking information of an equipment including a telematics device configured to receive data from the equipment to determine a telematics information. The telematics information includes at least two of an equipment type, a location, a duration in the location, and miles travelled. A transmission device is configured to transmit the vehicle telematics information to at least one of a third party entity device, a government device and a mobile device.We claim: \n | 1. A system for tracking local information of an equipment on a vehicle, comprising: \na telematics device in the vehicle configured to receive at variable data sampling rate, raw data of vehicle telematics information comprising two or more of: energy usage, rate of energy consumption, equipment type, vehicle owner's information, a vehicle location, a duration of vehicle in the location, parking and moving violation, vehicle fines, distance travelled on the vehicle, and weight and size of equipment; and \na transmission device configured to compress the raw data of the vehicle telematics information and directly transmit through a network, the compressed raw data of the vehicle telematics information to at least one of a third party entity device, a government device and a mobile device to determine a usage charge based on the vehicle telematics information, and wherein the telematics device is configured to receive one or a combination of: public emergency alert announcement, captured images and associated data for matching to an object of interest in the public emergency alert announcement, optical sensors data, on-board laser and sonar pulsed sensor and imaging camera data to render the captured images and associated data for remote analysis by the at least one of the third party entity device, the government device and the mobile device. \n | 2. The system of claim 1, wherein the energy usage comprises total energy consumed by one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline, and the rate of energy consumption comprises per unit time measurement of one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline.\n | 3. The system of claim 1, wherein the vehicle comprises one of: transportation vehicles, recreation vehicles, industrial or home equipment, autonomous vehicles, flying vehicles.\n | 4. The system of claim 3, wherein the transportation vehicles comprise anyone of: a hybrid vehicle, an electric powered vehicle, a rental or a leased vehicle, a fleet managed vehicle, a car, a bus, a truck, wherein the recreation vehicles comprise anyone of: an all-terrain vehicle (ATV), an off-road vehicle, a drone, a boat, and the industrial/home equipment comprise anyone of: a power generator, a mining equipment, an agriculture equipment, a construction equipment.\n | 5. The system of claim 3, wherein for autonomous self-driving vehicles, vehicle telematics information may be communicated to an infrastructure network communication on distance driven in autonomous mode to levy a usage tax on vehicle to infrastructure; and for flying cars, a flight tax may be levied per trip and based on amount of fuel consumed and distance flown.\n | 6. 
The system of claim 1, wherein the telematics device, the government device and the mobile device associates a credit card, a debit card bank account, or through connection with a mobile device.\n | 7. The system of claim 1, wherein the government device charges vehicle owner based on received vehicle telematics information, comprising: usage charges, parking metering, moving violations, vehicle fines, state lines, specified highways, crossing determined bridges and car sharing charges.\n | 8. The system of claim 1, where the vehicle telematics information further includes information of a duration the equipment spends in determined geo-fenced locations.\n | 9. The system of claim 8, further including an electronic control unit configured to restrict a use of a fuel source or switch to an alternate fuel source based on a determined geo-fenced area.\n | 10. A method for tracking local information of an equipment in a vehicle, comprising: \nreceiving by a server, compressed raw data of vehicle telematics information which are compressed before being transmitted from a transmission device of a vehicle, the raw data of vehicle telematics information indicates energy and equipment use in the vehicle over a period of time, wherein the raw data of vehicle telematics information are received at variable data sampling rate by a telematics device, and the raw data of vehicle telematics information includes two or more of: energy usage, rate of energy consumption, equipment type, vehicle owner's information, a vehicle location, a duration of vehicle in the location, parking and moving violation, vehicle fines, distance travelled on the vehicle, and weight and size of equipment; and \nprocessing the raw data of the vehicle telematics information to determine a usage charge or a tax; and \ndirectly transmitting through a network, the usage charge or the tax to at least one of a third party entity device, a government device and a mobile device in order to determine a usage charge based on the vehicle telematics information, and wherein the telematics device is configured to receive one or a combination of: public emergency alert announcement, captured images and associated data for matching to an object of interest in the public emergency alert announcement, optical sensors data, on-board laser and sonar pulsed sensor and imaging camera data to render the captured images and associated data for remote analysis by the at least one of the third party entity device, the government device and the mobile device. \n | 11. The method of claim 10, wherein the energy usage comprises total energy consumed by one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline, and the rate of energy consumption comprises per unit time measurement of one or a combination of battery electric power, hydrogen fuel, natural gas, diesel fuel, solar power and gasoline.\n | 12. The method of claim 10, wherein the vehicle comprises one of: transportation vehicles, recreation vehicles, industrial or home equipment, autonomous vehicles, flying vehicles.\n | 13. 
The method of claim 12, wherein the transportation vehicles comprise anyone of: a hybrid vehicle, an electric powered vehicle, a rental or a leased vehicle, a fleet managed vehicle, a car, a bus, a truck, wherein the recreation vehicles comprise anyone of: an all-terrain vehicle (ATV), an off-road vehicle, a drone, a boat, and the industrial/home equipment comprise anyone of: a power generator, a mining equipment, an agriculture equipment, a construction equipment.\n | 14. The method of claim 12, wherein for autonomous self-driving vehicles, vehicle telematics information may be communicated to an infrastructure network communication on distance driven in autonomous mode to levy a usage tax on vehicle to infrastructure; and for flying cars, a flight tax may be levied per trip and based on amount of fuel consumed and distance flown.\n | 15. The method of claim 10, wherein the telematics device, the government device and the mobile device associates a credit card, a debit card bank account, or through connection with a mobile device.\n | 16. The method of claim 10, wherein the government device charges vehicle owner based on received vehicle telematics information, comprising: usage charges, parking metering, moving violations, vehicle fines, state or federal taxes, state lines, specified highways, crossing determined bridges and car sharing charges.\n | 17. The method of claim 10, where the telematics information further includes information of a duration the equipment spends in determined geo-fenced locations.\n | 18. The method of claim 10, further comprising restricting by an electronic control unit, a use of the fuel source or switching to an alternate fuel source based on a determined geo-fenced area.", + "output": "The system comprises a telematics device (114) in the vehicle configured to receive at variable data sampling rate. The transmission device (115) configured to compress the raw data of the vehicle telematics information and directly transmit through a network. The compressed raw data of the vehicle telematics information to at least one of a third party entity device (104). A government device (150) and a mobile device (160) to determine a usage charge based on the vehicle telematics information. An INDEPENDENT CLAIM is included for a method for tracking local information of an equipment in a vehicle. System for tracking local information of an equipment on a vehicle. Minimizes cost of data transmission to the entity devices and/or other remote data locations. The drawing shows a block representation of a environment for tracking information. 
104Third party entity device114Telematics device115Transmission device150Government device160Mobile device", + "instruction": "Please summarize the input" + }, + { + "input": "VEHICLE SYSTEM OF A VEHICLE FOR DETECTING AND VALIDATING AN EVENT USING A DEEP LEARNING MODELThe invention relates to a vehicle system (1) of a vehicle (2) configured to detect an event (E) and to broadcast said event (E) using a decentralized environmental notification message (DENM), wherein said vehicle system (1) comprises: \n - at least one camera sensor (10) configured to capture images (I1) of an environment of said vehicle (2), \n - an electronic control unit (11) configured to : \n - detect an event (E) using a primary deep learning model (M1) based on said images (I1), \n - apply an predictability level (A) on said event (E), said predictability level (A) being generated by said primary deep learning model (M1), \n - transmit said event (E) to a telematic control unit (12) if its predictability level (A) is above a defined level (L1), \n\n - said telematic control unit (12) configured to : \n - receive said event (E) from said electronic control unit (10) and broadcast a related decentralized environmental notification message (DENM) via a vehicle to vehicle communication (V2V) and/or a vehicle to infrastructure communication (V2I), \n - transmit at least one image (I1) and data details (D) of said event (E) to a server (3), \n - receive a primary validation information (30) of said event (E) from said server (3), said primary validation information (30) being generated by a secondary deep learning model (M2), and cancel the broadcasting of said decentralized environmental notification message (DENM) if said event (E) is not validated, \n - if said event (E) is validated, receive an updated instance (M3) of said primary deep learning model (M1) from said server (3) and transmit it to said primary electronic control unit (10) for updating said primary deep learning model (M1). \n\n|1. 
A vehicle system (1) of a vehicle (2) configured to detect an external event (E) and to broadcast said event (E) using a decentralized environmental notification message (DENM), wherein said vehicle system (1) comprises: \n * - at least one camera sensor (10) configured to capture images (11) of an environment of said vehicle (2),\n * - an electronic control unit (11) configured to : \n * - detect an event (E) using a primary deep learning model (M1) based on said images (11),\n * - determine a predictability level (A) of said event (E), said predictability level (A) being generated by said primary deep learning model (M1), (M1) for categorizing the different events (E), said events comprising accidents, road-block, animals on the road or on the pavement, pedestrians on the road or on the pavement, obstacles on the road or on the pavement and ambulance vehicles on the road,\n * - transmit said event (E) to a telematic control unit (12) of said vehicle system (1) if its predictability level (A) is above a defined level (L1),\n * - said telematic control unit (12) configured to : \n * - receive said event (E) from said electronic control unit (11) and broadcast a related decentralized environmental notification message (DENM) via a vehicle to vehicle communication (V2V) to other vehicles (6) in the environment of the vehicle (2) and/or via a vehicle to infrastructure communication (V2I), to infrastructures (7) in the environment of the vehicle (2),\n * - transmit at least one image (11) and data details (D) of said event (E) to a server (3), said data details (D) of said event (E) comprising a label (LB) of said event (E), a location (LO) of said event (E), a timestamp (TI) of said event (E), and the predictability level (A) of said event (E),\n * - receive a primary validation information (30) of said event (E) from said server (3), said primary validation information (30) being generated by a secondary deep learning model (M2), and cancel the broadcasting of said decentralized environmental notification message (DENM) if said event (E) is not validated,\n * - if said event (E) is validated, receive an updated instance (M3) of said primary deep learning model (M1) from said server (3) and transmit it to said electronic control unit (11) for updating said primary deep learning model (M1).\n | 2. A vehicle system (1) according to claim 1, wherein said electronic control unit (11) is configured to update said primary deep learning model (M1) with said updated instance (M3).\n | 3. A vehicle system (1) according to claim 1 or claim 2, wherein said telematic control unit (12) is further configured to: \n * - broadcast a periodic cooperative awareness message (CAM) based on said images (11) for stating the road conditions (R1) where said vehicle (2) is,\n * - receive a secondary validation information (31) of said road conditions (R1) from said server (3), said secondary validation information (31) being generated by said secondary deep learning model (M2),\n * - if said road conditions (R1) are validated, receive an updated instance (M3) of said primary deep learning model (M1) from said server (3) and transmit it to said electronic control unit (11) for update.\n | 4. 
A vehicle system (1) according to any of the preceding claims, wherein if the predictability level (A) of said event (E) is between a threshold (Th1) below the defined level (L1), the electronic control unit (11) is further configured to transmit a control signal (11a) to a human interface machine (20) of said vehicle (2) in order to have a confirmation of the predictability level (A) of said event (E).\n | 5. A vehicle system (1) according to any one of the preceding claims, wherein said event (E) is an accident, a road-block, an animal, a pedestrian, an obstacle, or an ambulance vehicle.\n | 6. A vehicle system (1) according to any of the preceding claims, wherein said primary deep learning model (M1) is associated to a geographical location (L3).\n | 7. A vehicle system (1) according to the preceding claim, wherein said vehicle system (1) comprises a plurality of primary deep learning models (M1) associated to different geographical locations (L3).\n | 8. A vehicle system (1) according to any of the preceding claims, wherein said at least one camera sensor (10) is a front camera.\n | 9. A vehicle system (1) according to any of the preceding claims, wherein said primary deep learning model (M1) and said secondary deep learning model (M2) are convolutional neural network (CNN) based.\n | 10. A vehicle system (1) according to any one of the preceding claims, wherein said vehicle (2) is an autonomous vehicle.\n | 11. A vehicle system (1) according to any one of the preceding claims, wherein if said electronic control unit (11) fails to detect an event (E), said telematic control unit (12) is further configured to send the images (11) captured by said at least one camera sensor (10) to said server (3).\n | 12. A method (4) comprising: \n * - a capture (E1) by at least one camera sensor (10) of a vehicle system (1) of a vehicle (2), of images (11) of the environment of said vehicle (2),\n * - a detection (E2) by an electronic control unit (11) of said vehicle system (1) of an external event (E) using a primary deep learning model (M1) based on said images (11),\n * - a determining (E3) by said electronic control unit (11) of an predictability level (A) of said event (E), said predictability level (A) being generated by said primary deep learning model (M1) for categorizing the different events (E), said events comprising accidents, road-block, animals on the road or on the pavement, pedestrians on the road or on the pavement, obstacles on the road or on the pavement and ambulance vehicles on the road,\n * - a transmission (E4) by said electronic control unit (11) of said event (E) to a telematic control unit (12) of said vehicle system (1) if its predictability level (A) is above a defined level (L1),\n * - the reception (E5) by said telematic control unit (12) of said event (E),\n * - the broadcasting (E6) by said telematic control unit (12) of a decentralized environmental notification message (DENM) related to said event (E) via a vehicle to vehicle communication (V2V) to other vehicles (6) in the environment of the vehicle (2) and/or via a vehicle to infrastructure communication (V2I), to infrastructures (7) in the environment of the vehicle (2)\n * - the transmission (E7) by said telematic control unit (12) of at least one image (11) and of data details (D) of said event (E) to a server (3), said data details (D) of said event (E) comprising a label (LB) of said event (E), a location (LO) of said event (E), a timestamp (TI) of said event (E), and the predictability level (A) of said 
event (E),\n * - the reception (E8) by said telematic control unit (12) of a primary validation information (30) from said server (3), said primary validation information (30) being generated by a secondary deep learning model (M2), and the cancellation (E9) by said telematic control unit (12) of said broadcasting if said event (E) is not validated,\n * - if said event (E) is validated, the reception (E10) by said telematic control unit (12) of an updated instance (M3) of said primary deep learning model (M1) from said server (3) and the transmission (E11) of said updated instance (M3) to an electronic control unit (11) of said vehicle system (1) for updating said primary deep learning model (M1).", + "output": "The vehicle system (1) comprises one camera sensor (10) used to capture images of an environment of the vehicle (2, 6). The electronic control unit used to detect an event using a primary deep learning model based on the images. The predictability level on the event is applied. The predictability level is generated by the primary deep learning model. The event is transmitted to a telematic control unit (12) if its predictability level is above a defined level. The telematics control unit is used to receive the event from the electronic control unit (10) and broadcast a related decentralized environmental notification message through the vehicle to vehicle communication and vehicle to infrastructure communication. The image and data details of the event are transmitted to a server. INDEPENDENT CLAIMS are included for the following:a server comprises a secondary deep learning model; anda first method; anda second method. Vehicle system of a vehicle used to detect an event and to broadcast the event using a decentralized environmental notification message. The system obtains a better accuracy of the primary deep learning model and uses less memory in the vehicle and avoids going to a service center to update a vehicle's deep learning model and enhance the training of the secondary deep learning model. The drawing shows a schematic block diagram of a vehicle system. 1Vehicle system2, 6Vehicle10Camera sensor10Electronic control unit12Telematic control unit", + "instruction": "Please summarize the input" + }, + { + "input": "Defining and delivering parking zones to vehiclesTechniques are described for defining and delivering parking area information to a vehicle. The parking area information can be sent by a parking assistant device associated with a parking area and in response to receiving one or more messages from a vehicle system of the vehicle. Messages from the vehicle system indicate a location of the vehicle and are used by the parking assistant device to track a movement of the vehicle. The parking area information is sent in one or more responses messages from the parking assistant device and can include a rule for determining whether the vehicle is permitted to park in an unoccupied parking zone within the parking area or indicate a result of applying the rule.What is claimed is: \n | 1. 
A system in a vehicle, the system comprising: \na communications interface; and \na vehicle control system including one or more processors configured to: \ntransmit, to a computer device associated with a parking area and through the communications interface, one or more messages indicating a location of the vehicle; \nreceive, through the communications interface, one or more response messages from the computer device, wherein the one or more response messages include information about the parking area, and wherein the information includes a rule comprising one or more conditions that must be satisfied in order for the vehicle to be permitted to park in an unoccupied parking zone within the parking area; \ndecode the one or more response messages to extract the information, including the one or more conditions; and \nprocess the information in connection with a parking operation, wherein to process the information, the vehicle control system is configured to: \npresent the information on an audio or visual output device of the vehicle, the information being presented prior to performance of the parking operation, during performance of the parking operation, or both; or \ndetermine, using the information, an autonomous driving maneuver performed as part of the parking operation. \n | 2. The system of claim 1, wherein the one or more messages indicating the location of the vehicle comprise a vehicle-to-everything (V2X) message broadcasted by the system, and wherein the one or more response messages comprise a V2X message broadcasted by the computer device.\n | 3. The system of claim 1, wherein the information included in the one or more response messages indicates whether the vehicle is permitted, based on a result of applying the rule, to park in the unoccupied parking zone within the parking area.\n | 4. The system of claim 1, wherein the parking area includes multiple unoccupied parking zones, and wherein the information indicates a particular unoccupied parking zone as being preferred.\n | 5. The system of claim 1, wherein the one or more conditions include a time-based restriction on parking.\n | 6. The system of claim 1, wherein the one or more conditions include a parking restriction relating to an attribute of the vehicle or relating to an identity of an owner or driver of the vehicle.\n | 7. The system of claim 1, wherein: \nto process the information, the vehicle control system is configured to determine, using the information, the autonomous driving maneuver performed as part of the parking operation; and \nthe vehicle control system is configured to perform the parking operation autonomously as a self-parking operation that parks the vehicle into the unoccupied parking zone. \n | 8. The system of claim 1, wherein: \nto process the information, the vehicle control system is configured to determine, using the information, the autonomous driving maneuver performed as part of the parking operation; and \nthe vehicle control system is configured to perform the parking operation autonomously as a self-parking operation that parks the vehicle into a different parking zone than the unoccupied parking zone. \n | 9. 
A method comprising: \ntransmitting, from a vehicle system of a vehicle to a computer device associated with a parking area, one or more messages indicating a location of the vehicle; \nreceiving, by the vehicle system, one or more response messages from the computer device, wherein the one or more response messages include information about the parking area, and wherein the information includes a rule comprising one or more conditions that must be satisfied in order for the vehicle to be permitted to park in an unoccupied parking zone within the parking area; \ndecoding, by the vehicle system, the one or more response messages to extract the information, including the one or more conditions; and \nprocessing, by the vehicle system, the information in connection with a parking operation, wherein the processing comprises: \npresenting the information on an audio or visual output device of the vehicle, the information being presented prior to performance of the parking operation, during performance of the parking operation, or both; or \ndetermining, using the information, an autonomous driving maneuver performed as part of the parking operation. \n | 10. The method of claim 9, wherein the one or more messages from the vehicle system comprise a vehicle-to-everything (V2X) message broadcasted by the vehicle system, and wherein the one or more response messages comprise a V2X message broadcasted by the computer device.\n | 11. The method of claim 9, wherein the information included in the one or more response messages indicates whether the vehicle is permitted, based on a result of applying the rule, to park in the unoccupied parking zone within the parking area.\n | 12. The method of claim 9, wherein the parking area includes multiple unoccupied parking zones, and wherein the information indicates a particular unoccupied parking zone as being preferred.\n | 13. The method of claim 9, wherein the one or more conditions include a time-based restriction on parking.\n | 14. The method of claim 9, wherein the one or more conditions include a parking restriction relating to an attribute of the vehicle or relating to an identity of an owner or driver of the vehicle.\n | 15. The method of claim 9, wherein the parking operation is an autonomously performed self-parking operation that parks the vehicle into the unoccupied parking zone, and wherein the processing of the information in connection with the parking operation comprises determining, using the information, the autonomous driving maneuver performed as part of the parking operation.\n | 16. The method of claim 9, wherein the parking operation is an autonomously performed self-parking operation that parks the vehicle into a different parking zone than the unoccupied parking zone, and wherein the processing of the information in connection with the parking operation comprises determining, using the information, the autonomous driving maneuver performed as part of the parking operation.\n | 17. The method of claim 9, wherein the information about the parking area indicates a boundary of the unoccupied parking zone.\n | 18. The method of claim 9, wherein the one or more conditions are determined based on identification, by the computer device, of a pattern in usage of the parking area.\n | 19. 
A non-transitory computer-readable storage medium containing instructions that, when executed by one or more processors in a vehicle system of a vehicle, configure the vehicle system to: \ntransmit, to a computer device associated with a parking area, one or more messages indicating a location of the vehicle; \nreceive one or more response messages from the computer device, wherein the one or more response messages include information about the parking area, and wherein the information includes a rule comprising one or more conditions that must be satisfied in order for the vehicle to be permitted to park in an unoccupied parking zone within the parking area; \ndecode the one or more response messages to extract the information, including the one or more conditions; and \nprocess the information in connection with a parking operation, wherein the processing comprises: \npresenting the information on an audio or visual output device of the vehicle, the information being presented prior to performance of the parking operation, during performance of the parking operation, or both; or \ndetermining, using the information, an autonomous driving maneuver performed as part of the parking operation.", + "output": "The method involves sending one or more messages that indicate the vehicle's location from the vehicle system 110 to a computer device connected to the parking area. One or more response messages are received from the computer device by the vehicle system. The one or more response messages has information about the parking area. The information has a rule to determine whether the vehicle is permitted to park in an unoccupied parking zone within the parking area. Also, the information indicates a result of applying the rule. The one or more response messages are decoded to extract the information. The information regarding parking operations is processed by the vehicle system. INDEPENDENT CLAIMS are included for: a computer-readable storage medium containing instructions. Method for defining parking zones and conveying information about the parking zones to a vehicle in order to assist in parking of the vehicle. The parking zones are predefined and generally comprise uniformly shaped spaces that are well-marked so as to make the parking zones easily identifiable to the driver even without the aid of the map data. The drawing shows a block diagram of a parking system. 100Parking system 110Vehicle system 130Communication network 140Computer system 142Datastore 144Parking area information ", + "instruction": "Please summarize the input" + }, + { + "input": "a network connection for automatically driving the vehicle dynamic behavior decision method in the environmentThe invention claims an automatic driving vehicle dynamic behavior decision method of network connection environment. 
The method comprises the following steps: step S1, in a V2X network environment, obtaining surrounding environment information about the surrounding road users and, taking the vehicle's mass centre as the centre, performing region division with different radii to predict the risk areas; step S2, based on the surrounding environment information of the surrounding road users and the predicted risk areas, performing the first stage of the behavior decision to determine the set of feasible actions that ensure safe driving of the vehicle; step S3, performing the second stage of the behavior decision: considering the non-safety constraint conditions, selecting from the feasible action set, by optimization, the action to be finally executed, thereby completing the driving behavior decision.|1. A dynamic behavior decision method for an automatic driving vehicle in a connected environment, the vehicle being an automatic driving vehicle, wherein the method comprises the following steps: step S1, in a V2X network environment, obtaining surrounding environment information about the surrounding road users and, taking the vehicle's mass centre as the centre, performing region division with different radii to predict the risk areas; step S2, based on the surrounding environment information of the surrounding road users and the predicted risk areas, performing the first stage of the behavior decision to determine the set of feasible actions that ensure safe driving of the vehicle; step S3, performing the second stage of the behavior decision: considering the non-safety constraint conditions, selecting from the feasible action set, by optimization, the action to be finally executed, thereby completing the driving behavior decision.\n | 2. The automatic driving vehicle dynamic behavior decision method according to claim 1, wherein, in step S1, the predicted risk regions are defined as follows: taking the vehicle's mass centre as the centre, the risk area is the circular area whose radius is the safe braking distance Lrisk, where, in the formula, vi is the own vehicle's current speed, amax is the maximum vehicle acceleration value and L is the vehicle length; if a surrounding road user is located in the risk area, that surrounding road user is defined as a risk road user; taking the vehicle's mass centre as the centre, the annular region obtained by removing the risk area from the circular area whose radius is the safety early-warning distance Lp-risk is the potential risk region, where adec is the maximum value of the own vehicle's deceleration; if a surrounding road user is located in the potential risk region, that surrounding road user is defined as a potential-risk road user; taking the vehicle's mass centre as the centre, the area outside the circular area whose radius is the safety early-warning distance Lp-risk is the safe area; if a surrounding road user is located outside the potential risk region, or is out of the vehicle's communication range, that surrounding road user is defined as a safe road user.\n | 3. The automatic driving vehicle dynamic behavior decision method according to claim 2, wherein step S2 comprises the following steps: step S21, calculating a risk degree C for the surrounding road users in the risk area and in the potential risk region, the risk degree of the surrounding road users in the risk area being calculated first and that of the surrounding road users in the potential risk region afterwards, the risk degree C representing the probability of conflict between the current state of the automatic driving vehicle and the state of the surrounding road user. 
t is the expected estimated collision time; if there are two or more surrounding road users in the risk area and the potential risk region, t is the minimum of the collision times estimated for the predicted collisions with each of those surrounding road users; if there is no surrounding road user in the risk area or the potential risk region, t is greater than the set value tc, tc being the critical time for collision avoidance, which is a set constant; when C = 0, it is determined that no traffic conflict exists in this state, the risk metric value frisk is set to zero, and the method turns to step S23; when C = 1, it is determined that a potential traffic conflict exists in this state and the method turns to step S22; step S22, calculating the risk metric value frisk; step S23, using the risk metric value frisk for action selection and determining the feasible action set.\n | 4. The automatic driving vehicle dynamic behavior decision method according to claim 3, wherein the estimated collision time t is calculated by the following formula, t = min(TTC, PET, TTB), wherein, for TTC, Xi is the own vehicle position, Xj is the position of the surrounding road user being followed, vi is the own vehicle's current speed, vj is the other vehicle's current speed and Li is the own vehicle length; PET is the difference between the time ti at which the own vehicle enters the conflict area and the time tj at which the other road user reaches the conflict area, PET = |ti - tj|; TTB is used to evaluate the forward area and is suitable for scenes in which the other vehicle is ahead of the own vehicle, where Xi is the own vehicle position, Xj is the position of the other vehicle ahead, vi is the own vehicle's current speed and Li is the own vehicle length.\n | 5. The automatic driving vehicle dynamic behavior decision method according to claim 3, wherein the estimated collision time t is calculated in the following way: when the scene can be distinguished, the collision time t is estimated only with the measure appropriate to that scene, namely, for a straight vehicle-following scene, t = TTC; for an intersection scene, t = PET; for a forward collision scene with the other vehicle ahead, t = TTB; when the scene is complex, t = min(TTC, PET, TTB); wherein, for TTC, Xi is the own vehicle position, Xj is the position of the surrounding road user being followed, vi is the own vehicle's current speed, vj is the other vehicle's current speed and Li is the own vehicle length; PET is the difference between the time ti at which the own vehicle enters the conflict area and the time tj at which the other road user reaches the conflict area, PET = |ti - tj|; TTB is used to evaluate the forward area and is suitable for scenes in which the other vehicle is ahead of the own vehicle, where Xi is the own vehicle position, Xj is the position of the other vehicle ahead, vi is the own vehicle's current speed and Li is the own vehicle length.\n | 6. The automatic driving vehicle dynamic behavior decision method according to claim 3, wherein, in step S2, the risk metric value frisk is calculated by the following formula, or \n | 7. The automatic driving vehicle dynamic behavior decision method according to any one of claims 1-6, wherein, in the second-stage action decision of step S3, no attribute that affects safety is included in the decision; instead, the efficiency soft constraint function, the comfort soft constraint function fc and the traffic flow soft constraint function ft are considered for the optimal decision.\n | 8. 
The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the efficiency soft constraint function is defined as: wherein t0 is the vehicle's initial departure time, tf is the destination arrival time and v(t) is the own-vehicle speed.\n | 9. The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the comfort soft constraint function is defined as: wherein a is the vehicle acceleration, alat is the transverse (lateral) acceleration and alon is the longitudinal acceleration.\n | 10. The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the traffic flow soft constraint function ft is defined as: min ft = α(vave - vder)² + β(dave - dder)², wherein vave is the average speed level of the surrounding traffic flow before the decision, vder is the desired average speed level of the surrounding traffic flow after the decision, dave is the average inter-vehicle distance of the surrounding traffic flow before the decision, dder is the desired average inter-vehicle distance of the surrounding traffic flow after the decision, and α and β are weight coefficients, each greater than 0 and less than 1.\n | 11. The automatic driving vehicle real-time trajectory planning method according to claim 7, wherein the cost function J of the second-stage behavior decision of step S3 is defined as follows: w1, w2 and w3 are weighting coefficients, each greater than 0 and less than 1, with w1 + w2 + w3 = 1; fe0, fc0 and ft0 respectively represent the values of the efficiency, comfort and traffic flow functions obtained on the assumption that the state existing before the decision continues to be executed.", + "output": "The method involves obtaining surrounding environment information about the surrounding road users and, with the vehicle's mass center as the center, performing region division with different radii to predict a risk area (S1). A first stage of behavior decision is performed (S2) to determine the set of feasible actions that ensure safe driving of the vehicle, based on the surrounding environment information of the surrounding road users and the estimated risk area. Non-safety constraint conditions are considered (S3) to select, from the feasible set of actions, the action to be finally executed, yielding the driving behavior decision. Dynamic behavior decision method for an automatic driving vehicle in a connected (V2X) environment. The drawing shows a flow diagram illustrating the dynamic behavior decision method for an automatic driving vehicle in a connected environment. '(Drawing includes non-English language text)' S1Step for obtaining surrounding environment information about surrounding road users and performing region division with different radii from the vehicle mass center to predict a risk areaS2Step for performing the first stage of behavior decision to determine feasible actions ensuring safe driving based on the surrounding environment information of the surrounding road users and the estimated risk areaS3Step for considering non-safety constraint conditions to select, from the feasible set of actions, the action to be finally executed for the driving behavior decision", + "instruction": "Please summarize the input" + } +]