The functions of a self-driving car depend on its ability to sense the surrounding environment and its own (ego) state; in other words, the vehicle must answer "where am I" and "who are they". The technical names for these questions are localization and perception.
This project aims to implement centimeter-level localization for ADAS and autonomous driving on an FPGA. Localization is the cornerstone for higher-level functions as well as for perception and other fundamental services such as mapping and V2X.
The system includes GNSS, an IMU, chassis information, LIDAR, an optional camera module, and a main processing unit. The ZCU104 Evaluation Kit acts as the main processing unit, on which the localization algorithm runs.
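To make the architecture more concrete, the sketch below shows one way the sensor inputs and the fused output could be represented on the processing side. This is only a minimal illustration: all type names, fields, and units are assumptions for this write-up, not the project's actual interfaces.

```cpp
// Hypothetical sensor measurement types feeding the localization core.
// All names, fields, and units are illustrative assumptions, not the
// project's actual code.
#include <vector>

struct GnssFix {          // absolute position from the GNSS receiver
    double timestamp_s;
    double latitude_deg;
    double longitude_deg;
    double altitude_m;
};

struct ImuSample {        // inertial measurement in the body frame
    double timestamp_s;
    double accel_mps2[3];
    double gyro_radps[3];
};

struct ChassisInfo {      // wheel odometry / steering from the vehicle bus
    double timestamp_s;
    double wheel_speed_mps;
    double steering_angle_rad;
};

struct LidarScan {        // raw point cloud from the LIDAR
    double timestamp_s;
    std::vector<float> points_xyz;   // flattened x, y, z triples
};

// The main processing unit (ZCU104) would consume these measurements and
// produce a fused pose estimate, for example:
struct PoseEstimate {
    double timestamp_s;
    double position_m[3];
    double orientation_quat[4];      // w, x, y, z
};
```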
Step 1:
On the build machine, use PetaLinux to compile the image and write it to an SD card.
You can also place the application source code into the file system.
Step 2:
Insert the SD card into the ZCU104 and power on the board.
Step 3:
Use a serial terminal to connect to the ZCU104 and launch the application.
- Since porting the device drivers would take substantial effort, the project uses pre-recorded raw data and feeds it into the application to produce results; a minimal replay sketch is shown below.
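As a rough sketch of what such a replay could look like, the code below reads fixed-size IMU records from a binary log and pushes them into a placeholder update function. The file name, record layout, and `processImuSample()` are assumptions for illustration only, not the project's actual implementation.

```cpp
// Minimal replay sketch: read pre-recorded raw IMU records from a binary log
// and feed them into a placeholder localization update. The file name, record
// layout, and processImuSample() are illustrative assumptions only.
#include <cstdio>
#include <fstream>
#include <iostream>

struct ImuRecord {
    double timestamp_s;
    double accel_mps2[3];
    double gyro_radps[3];
};

// Placeholder for the real localization update running on the ZCU104.
void processImuSample(const ImuRecord& r) {
    std::printf("t=%.3f accel=[%.2f %.2f %.2f]\n",
                r.timestamp_s, r.accel_mps2[0], r.accel_mps2[1], r.accel_mps2[2]);
}

int main(int argc, char** argv) {
    const char* path = (argc > 1) ? argv[1] : "imu_log.bin";  // assumed log file
    std::ifstream log(path, std::ios::binary);
    if (!log) {
        std::cerr << "Cannot open " << path << "\n";
        return 1;
    }
    ImuRecord rec{};
    // Replay each fixed-size record in recorded order.
    while (log.read(reinterpret_cast<char*>(&rec), sizeof(rec))) {
        processImuSample(rec);
    }
    return 0;
}
```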