{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,10]],"date-time":"2026-01-10T19:30:54Z","timestamp":1768073454843,"version":"3.49.0"},"publisher-location":"New York, NY, USA","reference-count":35,"publisher":"ACM","license":[{"start":{"date-parts":[[2021,8,14]],"date-time":"2021-08-14T00:00:00Z","timestamp":1628899200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2021,8,14]]},"DOI":"10.1145\/3447548.3467448","type":"proceedings-article","created":{"date-parts":[[2021,8,13]],"date-time":"2021-08-13T18:21:39Z","timestamp":1628878899000},"page":"925-933","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":10,"title":["Physical Equation Discovery Using Physics-Consistent Neural Network (PCNN) Under Incomplete Observability"],"prefix":"10.1145","author":[{"given":"Haoran","family":"Li","sequence":"first","affiliation":[{"name":"Arizona State University, Tempe, AZ, USA"}]},{"given":"Yang","family":"Weng","sequence":"additional","affiliation":[{"name":"Arizona State University, Tempe, AZ, USA"}]}],"member":"320","published-online":{"date-parts":[[2021,8,14]]},"reference":[
{"key":"e_1_3_2_2_1_1","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0130140"},
{"key":"e_1_3_2_2_2_1","doi-asserted-by":"publisher","DOI":"10.1073\/pnas.1517384113"},
{"key":"e_1_3_2_2_3_1","doi-asserted-by":"publisher","DOI":"10.1073\/pnas.1906995116"},
{"key":"e_1_3_2_2_4_1","doi-asserted-by":"publisher","DOI":"10.1145\/1374376.1374441"},
{"key":"e_1_3_2_2_5_1","doi-asserted-by":"publisher","DOI":"10.1145\/2049662.2049670"},
{"key":"e_1_3_2_2_6_1","doi-asserted-by":"publisher","DOI":"10.1109\/HASE.2017.36"},
{"key":"e_1_3_2_2_7_1","volume-title":"Rewiring networks for synchronization. Chaos: An interdisciplinary journal of nonlinear science","author":"Hagberg Aric","year":"2008","unstructured":"Aric Hagberg and Daniel A Schult. 2008. Rewiring networks for synchronization. Chaos: An interdisciplinary journal of nonlinear science, Vol. 18, 3 (2008), 037105."},
{"key":"e_1_3_2_2_8_1","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR.2016.90"},
{"key":"e_1_3_2_2_9_1","volume-title":"Physics-Guided Deep Neural Networks for PowerFlow Analysis. arXiv preprint arXiv:2002.00097","author":"Hu Xinyue","year":"2020","unstructured":"Xinyue Hu, Haoji Hu, Saurabh Verma, and Zhi-Li Zhang. 2020. Physics-Guided Deep Neural Networks for PowerFlow Analysis. arXiv preprint arXiv:2002.00097 (2020)."},
{"key":"e_1_3_2_2_10_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-030-01258-8_32"},
{"key":"e_1_3_2_2_11_1","doi-asserted-by":"publisher","DOI":"10.1137\/1.9781611975673.63"},
{"key":"e_1_3_2_2_12_1","volume-title":"Physics-guided machine learning for scientific discovery: An application in simulating lake temperature profiles. arXiv preprint arXiv:2001.11086","author":"Jia Xiaowei","year":"2020","unstructured":"Xiaowei Jia, Jared Willard, Anuj Karpatne, Jordan S Read, Jacob A Zwart, Michael Steinbach, and Vipin Kumar. 2020. Physics-guided machine learning for scientific discovery: An application in simulating lake temperature profiles. arXiv preprint arXiv:2001.11086 (2020)."},
{"key":"e_1_3_2_2_13_1","doi-asserted-by":"publisher","DOI":"10.1109\/TPAMI.2017.2670560"},
{"key":"e_1_3_2_2_14_1","volume-title":"Physics-guided neural networks (pgnn): An application in lake temperature modeling. arXiv preprint arXiv:1710.11431","author":"Karpatne Anuj","year":"2017","unstructured":"Anuj Karpatne, William Watkins, Jordan Read, and Vipin Kumar. 2017. Physics-guided neural networks (pgnn): An application in lake temperature modeling. arXiv preprint arXiv:1710.11431 (2017)."},
{"key":"e_1_3_2_2_15_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijepes.2021.106794"},
{"key":"e_1_3_2_2_16_1","volume-title":"International Conference on Artificial Intelligence and Statistics. PMLR, 4313--4324","author":"Li Mingchen","year":"2020","unstructured":"Mingchen Li, Mahdi Soltanolkotabi, and Samet Oymak. 2020. Gradient descent with early stopping is provably robust to label noise for overparameterized neural networks. In International Conference on Artificial Intelligence and Statistics. PMLR, 4313--4324."},
{"key":"e_1_3_2_2_17_1","unstructured":"Yin Liu and Vincent Chen. 2018. On the Generalization Effects of DenseNet Model Structures. (2018)."},
{"key":"e_1_3_2_2_18_1","unstructured":"Scott M Lundberg and Su-In Lee. 2017. A unified approach to interpreting model predictions. In Advances in neural information processing systems. 4765--4774."},
{"key":"e_1_3_2_2_19_1","volume-title":"https:\/\/matpower.org\/","author":"MATPOWER","year":"2020","unstructured":"MATPOWER community. 2020. MATPOWER. (2020). https:\/\/matpower.org\/."},
{"key":"e_1_3_2_2_20_1","volume-title":"Data augmentation for improving deep learning in image classification problem. In 2018 international interdisciplinary PhD workshop (IIPhDW)","author":"Miko\u0142ajczyk Agnieszka","unstructured":"Agnieszka Miko\u0142ajczyk and Micha\u0142 Grochowski. 2018. Data augmentation for improving deep learning in image classification problem. In 2018 international interdisciplinary PhD workshop (IIPhDW). IEEE, 117--122."},
{"key":"e_1_3_2_2_21_1","unstructured":"PJM Interconnection LLC. 2018. Metered Load Data. (2018). https:\/\/dataminer2.pjm.com\/feed\/hrl_load_metered\/definition."},
{"key":"e_1_3_2_2_22_1","doi-asserted-by":"publisher","DOI":"10.1109\/ACC.2003.1239709"},
{"key":"e_1_3_2_2_23_1","volume-title":"Airsim: High-fidelity visual and physical simulation for autonomous vehicles. In Field and service robotics","author":"Shah Shital","year":"2018","unstructured":"Shital Shah, Debadeepta Dey, Chris Lovett, and Ashish Kapoor. 2018. Airsim: High-fidelity visual and physical simulation for autonomous vehicles. In Field and service robotics. Springer, 621--635."},
{"key":"e_1_3_2_2_24_1","volume-title":"Learning important features through propagating activation differences. arXiv preprint arXiv:1704.02685","author":"Shrikumar Avanti","year":"2017","unstructured":"Avanti Shrikumar, Peyton Greenside, and Anshul Kundaje. 2017. Learning important features through propagating activation differences. arXiv preprint arXiv:1704.02685 (2017)."},
{"key":"e_1_3_2_2_25_1","volume-title":"Deep inside convolutional networks: Visualising image classification models and saliency maps. arXiv preprint arXiv:1312.6034","author":"Simonyan Karen","year":"2013","unstructured":"Karen Simonyan, Andrea Vedaldi, and Andrew Zisserman. 2013. Deep inside convolutional networks: Visualising image classification models and saliency maps. arXiv preprint arXiv:1312.6034 (2013)."},
{"key":"e_1_3_2_2_26_1","volume-title":"Dropout: a simple way to prevent neural networks from overfitting. The journal of machine learning research","author":"Srivastava Nitish","year":"2014","unstructured":"Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov. 2014. Dropout: a simple way to prevent neural networks from overfitting. The journal of machine learning research, Vol. 15, 1 (2014), 1929--1958."},
{"key":"e_1_3_2_2_27_1","volume-title":"Science Advances","volume":"6","author":"Udrescu Silviu-Marian","year":"2020","unstructured":"Silviu-Marian Udrescu and Max Tegmark. 2020. AI Feynman: A physics-inspired method for symbolic regression. Science Advances, Vol. 6, 16 (2020), eaay2631."},
{"key":"e_1_3_2_2_28_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.sysconle.2015.08.013"},
{"key":"e_1_3_2_2_29_1","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0197704"},
{"key":"e_1_3_2_2_30_1","volume-title":"Integrating physics-based modeling with machine learning: A survey. arXiv preprint arXiv:2003.04919","author":"Willard Jared","year":"2020","unstructured":"Jared Willard, Xiaowei Jia, Shaoming Xu, Michael Steinbach, and Vipin Kumar. 2020. Integrating physics-based modeling with machine learning: A survey. arXiv preprint arXiv:2003.04919 (2020)."},
{"key":"e_1_3_2_2_31_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICASSP.2013.6639201"},
{"key":"e_1_3_2_2_32_1","volume-title":"2017 North American Power Symposium (NAPS). 1--6.","author":"Yu J.","unstructured":"J. Yu, Y. Weng, and R. Rajagopal. 2017. Robust mapping rule estimation for power flow analysis in distribution grids. In 2017 North American Power Symposium (NAPS). 1--6."},
{"key":"e_1_3_2_2_33_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-319-10590-1_53"},
{"key":"e_1_3_2_2_34_1","volume-title":"Predicting effects of noncoding variants with deep learning-based sequence model. Nature methods","author":"Zhou Jian","year":"2015","unstructured":"Jian Zhou and Olga G Troyanskaya. 2015. Predicting effects of noncoding variants with deep learning-based sequence model. Nature methods, Vol. 12, 10 (2015), 931--934."},
{"key":"e_1_3_2_2_35_1","volume-title":"Visualizing deep neural network decisions: Prediction difference analysis. arXiv preprint arXiv:1702.04595","author":"Zintgraf Luisa M","year":"2017","unstructured":"Luisa M Zintgraf, Taco S Cohen, Tameem Adel, and Max Welling. 2017. Visualizing deep neural network decisions: Prediction difference analysis. arXiv preprint arXiv:1702.04595 (2017)."}
],"event":{"name":"KDD '21: The 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining","location":"Virtual Event Singapore","acronym":"KDD '21","sponsor":["SIGMOD ACM Special Interest Group on Management of Data","SIGKDD ACM Special Interest Group on Knowledge Discovery in Data"]},"container-title":["Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery &amp; Data Mining"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3447548.3467448","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3447548.3467448","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T20:18:37Z","timestamp":1750191517000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3447548.3467448"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,8,14]]},"references-count":35,"alternative-id":["10.1145\/3447548.3467448","10.1145\/3447548"],"URL":"https:\/\/doi.org\/10.1145\/3447548.3467448","relation":{},"subject":[],"published":{"date-parts":[[2021,8,14]]},"assertion":[{"value":"2021-08-14","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}