{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,12,10]],"date-time":"2025-12-10T09:04:34Z","timestamp":1765357474088,"version":"build-2065373602"},"reference-count":39,"publisher":"MDPI AG","issue":"21","license":[{"start":{"date-parts":[[2023,11,3]],"date-time":"2023-11-03T00:00:00Z","timestamp":1698969600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"Ouster Inc.","award":["NA"],"award-info":[{"award-number":["NA"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Self-driving vehicles demand efficient and reliable depth-sensing technologies. Lidar, with its capability for long-distance, high-precision measurement, is a crucial component in this pursuit. However, conventional mechanical scanning implementations suffer from reliability, cost, and frame rate limitations. Solid-state lidar solutions have emerged as a promising alternative, but the vast amount of photon data processed and stored using conventional direct time-of-flight (dToF) prevents long-distance sensing unless power-intensive partial histogram approaches are used. In this paper, we introduce a groundbreaking \u2018guided\u2019 dToF approach, harnessing external guidance from other onboard sensors to narrow down the depth search space for a power and data-efficient solution. This approach centers around a dToF sensor in which the exposed time window of independent pixels can be dynamically adjusted. We utilize a 64-by-32 macropixel dToF sensor and a pair of vision cameras to provide the guiding depth estimates. Our demonstrator captures a dynamic outdoor scene at 3 fps with distances up to 75 m. 
Compared to a conventional full histogram approach, on-chip data is reduced by over twenty times, while the total laser cycles in each frame are reduced by at least six times compared to any partial histogram approach. The capability of guided dToF to mitigate multipath reflections is also demonstrated. For self-driving vehicles where a wealth of sensor data is already available, guided dToF opens new possibilities for efficient solid-state lidar.<\/jats:p>","DOI":"10.3390\/s23218943","type":"journal-article","created":{"date-parts":[[2023,11,3]],"date-time":"2023-11-03T02:41:56Z","timestamp":1698979316000},"page":"8943","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":4,"title":["Guided Direct Time-of-Flight Lidar Using Stereo Cameras for Enhanced Laser Power Efficiency"],"prefix":"10.3390","volume":"23","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-3073-5855","authenticated-orcid":false,"given":"Filip","family":"Taneski","sequence":"first","affiliation":[{"name":"Institute for Integrated Micro and Nano Systems, University of Edinburgh, Edinburgh EH9 3FF, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3931-7972","authenticated-orcid":false,"given":"Istvan","family":"Gyongy","sequence":"additional","affiliation":[{"name":"Institute for Integrated Micro and Nano Systems, University of Edinburgh, Edinburgh EH9 3FF, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5486-8791","authenticated-orcid":false,"given":"Tarek","family":"Al Abbas","sequence":"additional","affiliation":[{"name":"Ouster Automotive, Ouster, Inc., Edinburgh EH2 4AD, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0398-7520","authenticated-orcid":false,"given":"Robert K.","family":"Henderson","sequence":"additional","affiliation":[{"name":"Institute for Integrated Micro and Nano Systems, University of Edinburgh, Edinburgh EH9 3FF, 
UK"}]}],"member":"1968","published-online":{"date-parts":[[2023,11,3]]},"reference":[{"key":"ref_1","unstructured":"Rangwala, S. (2022). Automotive LiDAR Has Arrived, Forbes."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"2847","DOI":"10.1109\/ACCESS.2019.2962554","article-title":"Multi-Sensor Fusion in Automated Driving: A Survey","volume":"8","author":"Wang","year":"2020","journal-title":"IEEE Access"},{"key":"ref_3","unstructured":"Aptiv, A., Apollo, B., Continenta, D., FCA, H., and Infineon, I.V. (2023, October 28). Safety First For Automated Driving [White Paper]. Available online: https:\/\/group.mercedes-benz.com\/documents\/innovation\/other\/safety-first-for-automated-driving.pdf."},{"key":"ref_4","unstructured":"Ford (2018). A Matter of Trust: Ford\u2019s Approach to Developing Self-Driving Vehicles, Ford."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"131699","DOI":"10.1109\/ACCESS.2020.3009680","article-title":"Performance Analysis of 10 Models of 3D LiDARs for Automated Driving","volume":"8","author":"Lambert","year":"2020","journal-title":"IEEE Access"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Villa, F., Severini, F., Madonini, F., and Zappa, F. (2021). SPADs and SiPMs Arrays for Long-Range High-Speed Light Detection and Ranging (LiDAR). Sensors, 21.","DOI":"10.3390\/s21113839"},{"key":"ref_7","unstructured":"Rangwala, S. (2023). Lidar Miniaturization, ADAS & Autonomous Vehicle International."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Niclass, C., Soga, M., Matsubara, H., Ogawa, M., and Kagami, M. (2013, January 17\u201321). A 0.18 \u00b5m CMOS SoC for a 100 m-range 10 fps 200 \u00d7 96-pixel time-of-flight depth sensor. 
Proceedings of the 2013 IEEE International Solid-State Circuits Conference Digest of Technical Papers, San Francisco, CA, USA.","DOI":"10.1109\/ISSCC.2013.6487827"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"1137","DOI":"10.1109\/JSSC.2018.2883720","article-title":"A 30-frames\/s, 252 \u00d7 144 SPAD Flash LiDAR with 1728 Dual-Clock 48.8-ps TDCs, and Pixel-Wise Integrated Histogramming","volume":"54","author":"Zhang","year":"2019","journal-title":"IEEE J. Solid State Circuits"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Henderson, R.K., Johnston, N., Hutchings, S.W., Gyongy, I., Abbas, T.A., Dutton, N., Tyler, M., Chan, S., and Leach, J. (2019, January 17\u201321). 5.7 A 256 \u00d7 256 40 nm\/90 nm CMOS 3D-Stacked 120 dB Dynamic-Range Reconfigurable Time-Resolved SPAD Imager. Proceedings of the IEEE International Solid-State Circuits Conference\u2014(ISSCC), San Francisco, CA, USA.","DOI":"10.1109\/ISSCC.2019.8662355"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Kim, B., Park, S., Chun, J.H., Choi, J., and Kim, S.J. (2021, January 13\u201322). 7.2 A 48 \u00d7 40 13.5 mm Depth Resolution Flash LiDAR Sensor with In-Pixel Zoom Histogramming Time-to-Digital Converter. Proceedings of the 2021 IEEE International Solid- State Circuits Conference (ISSCC), San Francisco, CA, USA.","DOI":"10.1109\/ISSCC42613.2021.9366022"},{"key":"ref_12","unstructured":"Gyongy, I., Erdogan, A.T., Dutton, N.A., Mai, H., Rocca, F.M.D., and Henderson, R.K. (2021, January 20\u201323). A 200kFPS, 256 \u00d7 128 SPAD dToF sensor with peak tracking and smart readout. Proceedings of the International Image Sensor Workshop, Virtual."},{"key":"ref_13","unstructured":"Stoppa, D., Abovyan, S., Furrer, D., Gancarz, R., Jessenig, T., Kappel, R., Lueger, M., Mautner, C., Mills, I., and Perenzoni, D. (2021, January 20\u201323). 
A Reconfigurable QVGA\/Q3VGA Direct Time-of-Flight 3D Imaging System with On-chip Depth-map Computation in 45\/40 nm 3D-stacked BSI SPAD CMOS. Proceedings of the International Image Sensor Workshop, Virtual."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"3","DOI":"10.1109\/OJSSCS.2021.3118332","article-title":"A 240 \u00d7 160 3D Stacked SPAD dToF Image Sensor with Rolling Shutter and In Pixel Histogram for Mobile Devices","volume":"2","author":"Zhang","year":"2021","journal-title":"IEEE Open J. Solid State Circuits Soc."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Park, S., Kim, B., Cho, J., Chun, J., Choi, J., and Kim, S. (2022, January 20\u201326). 5.3 An 80 \u00d7 60 Flash LiDAR Sensor with In-Pixel Histogramming TDC Based on Quaternary Search and Time-Gated \u0394-Intensity Phase Detection for 45m Detectable Range and Background Light Cancellation. Proceedings of the 2022 IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, CA, USA.","DOI":"10.1109\/ISSCC42614.2022.9731112"},{"key":"ref_16","unstructured":"Taloud, P.-Y., Bernhard, S., Biber, A., Boehm, M., Chelvam, P., Cruz, A., Chele, A.D., Gancarz, R., Ishizaki, K., and Jantscher, P. (2022, January 13\u201315). A 1.2 K dots dToF 3D Imaging System in 45\/22 nm 3D-stacked BSI SPAD CMOS. Proceedings of the International SPAD Sensor Workshop, Virtual."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Ximenes, A.R., Padmanabhan, P., Lee, M.J., Yamashita, Y., Yaung, D.N., and Charbon, E. (2018, January 11\u201315). A 256 \u00d7 256 45\/65 nm 3D-stacked SPAD-based direct TOF image sensor for LiDAR applications with optical polar modulation for up to 18.6dB interference suppression. 
Proceedings of the 2018 IEEE International Solid-State Circuits Conference\u2014(ISSCC), San Francisco, CA, USA.","DOI":"10.1109\/ISSCC.2018.8310201"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Padmanabhan, P., Zhang, C., Cazzaniga, M., Efe, B., Ximenes, A.R., Lee, M.J., and Charbon, E. (2021, January 13\u201322). 7.4 A 256 \u00d7 128 3D-Stacked (45 nm) SPAD FLASH LiDAR with 7-Level Coincidence Detection and Progressive Gating for 100 m Range and 10 klux Background Light. Proceedings of the 2021 IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, CA, USA.","DOI":"10.1109\/ISSCC42613.2021.9366010"},{"key":"ref_19","unstructured":"Taneski, F., Gyongy, I., Abbas, T.A., and Henderson, R. (2023, January 21\u201325). Guided Flash Lidar: A Laser Power Efficient Approach for Long-Range Lidar. Proceedings of the International Image Sensor Workshop, Crieff, UK."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"29","DOI":"10.1109\/MM.2022.3219803","article-title":"Data Centers on Wheels: Emissions from Computing Onboard Autonomous Vehicles","volume":"43","author":"Sudhakar","year":"2023","journal-title":"IEEE Micro"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"5884","DOI":"10.1109\/JLT.2022.3187293","article-title":"Laser Power Efficiency of Partial Histogram Direct Time-of-Flight LiDAR Sensors","volume":"40","author":"Taneski","year":"2022","journal-title":"J. Light. Technol."},{"key":"ref_22","unstructured":"Fisher, R.B. (2021). Computer Vision, Springer."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Geiger, A., Lenz, P., and Urtasun, R. (2012, January 16\u201321). Are we ready for autonomous driving? The KITTI vision benchmark suite. 
Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA.","DOI":"10.1109\/CVPR.2012.6248074"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1738","DOI":"10.1109\/TPAMI.2020.3032602","article-title":"A Survey on Deep Learning Techniques for Stereo-Based Depth Estimation","volume":"44","author":"Laga","year":"2022","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"328","DOI":"10.1109\/TPAMI.2007.1166","article-title":"Stereo Processing by Semiglobal Matching and Mutual Information","volume":"30","author":"Hirschmuller","year":"2008","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1330","DOI":"10.1109\/34.888718","article-title":"A flexible new technique for camera calibration","volume":"22","author":"Zhang","year":"2000","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_27","unstructured":"(2023). MATLAB, MathWorks. R2023b."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/TIM.2021.3073684","article-title":"Models of Direct Time-of-Flight Sensor Precision That Enable Optimal Design and Dynamic Configuration","volume":"70","author":"Koerner","year":"2021","journal-title":"IEEE Trans. Instrum. Meas."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Bijelic, M., Gruber, T., and Ritter, W. (2018, January 26\u201330). A Benchmark for Lidar Sensors in Fog: Is Detection Breaking Down?. Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Suzhou, China.","DOI":"10.1109\/IVS.2018.8500543"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"2794","DOI":"10.1109\/TED.2021.3131430","article-title":"Direct Time-of-Flight Single-Photon Imaging","volume":"69","author":"Gyongy","year":"2022","journal-title":"IEEE Trans. 
Electron Devices"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Tontini, A., Gasparini, L., and Perenzoni, M. (2020). Numerical Model of SPAD-Based Direct Time-of-Flight Flash LIDAR CMOS Image Sensors. Sensors, 20.","DOI":"10.3390\/s20185203"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"7064","DOI":"10.1109\/TVT.2020.2989148","article-title":"Full Waveform LiDAR for Adverse Weather Conditions","volume":"69","author":"Wallace","year":"2020","journal-title":"IEEE Trans. Veh. Technol."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Sch\u00f6nlieb, A., Lugitsch, D., Steger, C., Holweg, G., and Druml, N. (November, January 19). Multi-Depth Sensing for Applications With Indirect Solid-State LiDAR. Proceedings of the 2020 IEEE Intelligent Vehicles Symposium (IV), Las Vegas, NV, USA.","DOI":"10.1109\/IV47402.2020.9304684"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Okino, T., Yamada, S., Sakata, Y., Kasuga, S., Takemoto, M., Nose, Y., Koshida, H., Tamaru, M., Sugiura, Y., and Saito, S. (2020, January 16\u201320). 5.2 A 1200 \u00d7 900 6 \u00b5m 450 fps Geiger-Mode Vertical Avalanche Photodiodes CMOS Image Sensor for a 250m Time-of-Flight Ranging System Using Direct-Indirect-Mixed Frame Synthesis with Configurable-Depth-Resolution Down to 10cm. Proceedings of the 2020 IEEE International Solid-State Circuits Conference (ISSCC), San Francisco, CA, USA.","DOI":"10.1109\/ISSCC19947.2020.9063045"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Kumagai, O., Ohmachi, J., Matsumura, M., Yagi, S., Tayu, K., Amagawa, K., Matsukawa, T., Ozawa, O., Hirono, D., and Shinozuka, Y. (2021, January 13\u201322). 7.3 A 189 \u00d7 600 Back-Illuminated Stacked SPAD Direct Time-of-Flight Depth Sensor for Automotive LiDAR Systems. 
Proceedings of the 2021 IEEE International Solid- State Circuits Conference (ISSCC), San Francisco, CA, USA.","DOI":"10.1109\/ISSCC42613.2021.9365961"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Badki, A., Troccoli, A., Kim, K., Kautz, J., Sen, P., and Gallo, O. (2020, January 14\u201319). Bi3D: Stereo Depth Estimation via Binary Classifications. Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA.","DOI":"10.1109\/CVPR42600.2020.00167"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"1536","DOI":"10.1109\/TIP.2009.2017824","article-title":"Continuous Stereo Self-Calibration by Camera Parameter Tracking","volume":"18","author":"Dang","year":"2009","journal-title":"IEEE Trans. Image Process."},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Warren, M.E. (2019, January 9\u201314). Automotive LIDAR Technology. Proceedings of the 2019 Symposium on VLSI Circuits, Kyoto, Japan.","DOI":"10.23919\/VLSIC.2019.8777993"},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Morimoto, K., Iwata, J., Shinohara, M., Sekine, H., Abdelghafar, A., Tsuchiya, H., Kuroda, Y., Tojima, K., Endo, W., and Maehashi, Y. (2021, January 11\u201316). 3.2 Megapixel 3D-Stacked Charge Focusing SPAD for Low-Light Imaging and Depth Sensing. 
Proceedings of the 2021 IEEE International Electron Devices Meeting (IEDM), San Francisco, CA, USA.","DOI":"10.1109\/IEDM19574.2021.9720605"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/21\/8943\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T21:16:21Z","timestamp":1760130981000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/21\/8943"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,11,3]]},"references-count":39,"journal-issue":{"issue":"21","published-online":{"date-parts":[[2023,11]]}},"alternative-id":["s23218943"],"URL":"https:\/\/doi.org\/10.3390\/s23218943","relation":{},"ISSN":["1424-8220"],"issn-type":[{"type":"electronic","value":"1424-8220"}],"subject":[],"published":{"date-parts":[[2023,11,3]]}}}