Vehicle informatics. On-board computer systems
ISO 15118-2:2014 specifies the communication between battery electric vehicles (BEV) or plug-in hybrid electric vehicles (PHEV) and the Electric Vehicle Supply Equipment (EVSE). The application layer message set defined in ISO 15118-2:2014 is designed to support the energy transfer from an EVSE to an EV. ISO 15118-1 contains additional use case elements describing bidirectional energy transfer; implementing those use cases requires enhancements of the application layer message set defined herein. The purpose of ISO 15118-2:2014 is to detail the communication between an EV (BEV or PHEV) and an EVSE. Aspects are specified to detect a vehicle in a communication network and to enable Internet Protocol (IP) based communication between the Electric Vehicle Communication Controller (EVCC) and the Supply Equipment Communication Controller (SECC). ISO 15118-2:2014 defines the messages, the data model, the XML/EXI-based data representation format, and the usage of V2GTP, TLS, TCP and IPv6. In addition, it describes how data link layer services can be accessed from a layer 3 perspective. The data link layer and physical layer functionality is described in ISO 15118-3.
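As a concrete illustration of the transport framing the abstract mentions, the sketch below builds the 8-byte V2G Transfer Protocol (V2GTP) header used to frame EXI-encoded messages over TCP/TLS: a protocol version byte, its bitwise inverse, a 16-bit payload type, and a 32-bit big-endian payload length. The payload bytes shown are placeholders, not a real EXI stream; this is a sketch of the framing idea, not a conformant implementation.

```python
import struct

# V2GTP header sketch (ISO 15118-2 transport framing):
# version byte 0x01, inverse version 0xFE, 16-bit payload type
# (0x8001 = EXI-encoded V2G message), 32-bit big-endian payload length.
V2GTP_VERSION = 0x01
PAYLOAD_TYPE_EXI = 0x8001

def v2gtp_header(payload: bytes, payload_type: int = PAYLOAD_TYPE_EXI) -> bytes:
    """Return the 8-byte V2GTP header for the given payload."""
    return struct.pack(
        "!BBHI",
        V2GTP_VERSION,
        V2GTP_VERSION ^ 0xFF,  # inverse version byte (0xFE)
        payload_type,
        len(payload),
    )

# Frame a placeholder message body (illustrative bytes, not real EXI data).
body = b"\x80\x00"
frame = v2gtp_header(body) + body
```

A receiver validates the version pair (0x01/0xFE) before trusting the length field, which is why the inverse byte is part of the header.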
This document is applicable to road vehicles with automated driving functions. It specifies the logical interface between in-vehicle environmental perception sensors (for example radar, lidar, camera, ultrasonic) and the fusion unit that generates a surround model and interprets the scene around the vehicle based on the sensor data. The interface is described in a modular and semantic representation and provides information at the object level (for example potentially moving objects, road objects, static objects) as well as at the feature and detection levels, based on sensor-technology-specific information. Further supportive information is available.
This document does not provide electrical and mechanical interface specifications. Raw data interfaces are also excluded.
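To make the object-level interface concrete, here is a minimal sketch of what a sensor-to-fusion message might carry. All field and class names (`DetectedObject`, `SensorObjectList`, `existence_probability`, and so on) are illustrative assumptions, not the normative data model of the standard.

```python
from dataclasses import dataclass, field

# Hypothetical object-level payload from one perception sensor to the
# fusion unit. Field names are assumptions for illustration only.
@dataclass
class DetectedObject:
    object_id: int
    classification: str           # e.g. "potentially_moving", "road", "static"
    position_m: tuple             # (x, y) in the vehicle coordinate frame
    velocity_mps: tuple           # (vx, vy); zero for static objects
    existence_probability: float  # sensor's confidence the object is real

@dataclass
class SensorObjectList:
    sensor_id: str                # e.g. "front_radar"
    timestamp_us: int             # capture time in microseconds
    objects: list = field(default_factory=list)

# Usage: a radar reporting one moving object to the fusion unit.
msg = SensorObjectList(sensor_id="front_radar", timestamp_us=1_000_000)
msg.objects.append(
    DetectedObject(1, "potentially_moving", (24.0, -1.5), (-3.2, 0.0), 0.97)
)
```

Keeping the message semantic (objects with class, position, and confidence) rather than raw samples is what lets the fusion unit combine heterogeneous sensors behind one modular interface.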
This document specifies physical medium attachment (PMA) sublayers for the controller area network (CAN). This includes the high-speed (HS) PMA without and with low-power mode capability, without and with selective wake-up functionality. Additionally, this document specifies PMAs supporting the signal improvement capability (SIC) mode and the FAST mode in Annex A. The physical medium dependent (PMD) sublayer is not in the scope of this document.
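The high-speed PMA signals bits as a differential voltage between the two bus wires. The sketch below classifies a measured differential (V_CAN_H minus V_CAN_L) against the nominal ISO 11898-2 receiver thresholds (above 0.9 V dominant, below 0.5 V recessive); this is a simplified model for illustration, not a transceiver specification.

```python
# Sketch: classify a CAN bus differential voltage using the nominal
# high-speed CAN receiver thresholds: > 0.9 V dominant, < 0.5 V
# recessive, with the band in between undefined for the receiver.
def classify_bus_level(v_diff: float) -> str:
    if v_diff > 0.9:
        return "dominant"
    if v_diff < 0.5:
        return "recessive"
    return "undefined"

# Nominal driver output: dominant ~2.0 V differential (CAN_H ~3.5 V,
# CAN_L ~1.5 V); recessive ~0 V (both lines near 2.5 V).
print(classify_bus_level(3.5 - 1.5))  # dominant
print(classify_bus_level(0.0))        # recessive
```

The wide gap between the driver's ~2.0 V dominant output and the 0.9 V receiver threshold is what gives the bus its noise margin over long cable runs.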