Energy Research
Publication (Article / Preprint / Journal, 2020). Publisher: Institute of Electrical and Electronics Engineers (IEEE). Authors: Shibo Zhou, Ying Chen, Xiaohua Li, Arindam Sanyal.

Real-time, accurate detection of three-dimensional (3D) objects is a fundamental necessity for self-driving vehicles. Most existing computer vision approaches are based on convolutional neural networks (CNNs). Although CNN-based approaches can achieve high detection accuracy, their high energy consumption is a severe drawback, so novel energy-efficient approaches should be explored. The spiking neural network (SNN) is a promising candidate because it has orders-of-magnitude lower energy consumption than a CNN. Unfortunately, the study of SNNs has so far been limited to small networks, and their application to large 3D object detection networks has remained largely open. In this paper, we integrate a spiking convolutional neural network (SCNN) with temporal coding into the YOLOv2 architecture for real-time object detection. To take advantage of spiking signals, we develop a novel data preprocessing layer that translates 3D point-cloud data into spike time data. We propose an analog circuit to implement the non-leaky integrate-and-fire neuron used in our SCNN, from which the energy consumption of each spike is estimated. Moreover, we present a method to calculate the network sparsity and the energy consumption of the overall network. Extensive experiments on the KITTI dataset show that the proposed network reaches detection accuracy competitive with existing approaches, yet with much lower average energy consumption. If implemented in dedicated hardware, our network could have a mean sparsity of 56.24% and an extremely low total energy consumption of only 0.247 mJ. Implemented on an NVIDIA GTX 1080i GPU, we achieve a frame rate of 35.7 fps, high enough for real-time object detection.
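Two of the ingredients described in this abstract can be illustrated with a toy sketch: a preprocessing step that converts point-cloud values into spike times (time-to-first-spike coding) and a non-leaky integrate-and-fire neuron that fires once its membrane potential crosses a threshold. The snippet below is a minimal, hypothetical illustration of those ideas only; the function names, the linear-ramp membrane model, and all constants are assumptions made for exposition and are not the paper's actual preprocessing layer, analog neuron circuit, or YOLOv2 integration.

```python
import numpy as np

# Hypothetical sketch: names, shapes, and constants are illustrative, not from the paper.

T_MAX = 1.0  # length of the temporal coding window (arbitrary units)

def encode_spike_times(voxel_intensity, t_max=T_MAX):
    """Time-to-first-spike encoding: larger input values fire earlier.

    voxel_intensity: array of values in [0, 1], e.g. a voxelized point-cloud
    feature map. Returns spike times in [0, t_max]; zero-valued voxels map to
    t_max (effectively, they never fire).
    """
    voxel_intensity = np.clip(voxel_intensity, 0.0, 1.0)
    return t_max * (1.0 - voxel_intensity)

def nonleaky_if_spike_time(in_times, weights, threshold=1.0):
    """First spike time of a non-leaky integrate-and-fire neuron.

    Under non-leaky dynamics with ramp synaptic input, the membrane potential is
        V(t) = sum_{i: t_i <= t} w_i * (t - t_i),
    and the neuron fires when V(t) first reaches `threshold`. We sweep causal
    sets of inputs in order of spike time and solve for the crossing time.
    """
    order = np.argsort(in_times)
    w_sum, wt_sum = 0.0, 0.0
    for k, idx in enumerate(order):
        w_sum += weights[idx]
        wt_sum += weights[idx] * in_times[idx]
        if w_sum <= 0:
            continue
        t_fire = (threshold + wt_sum) / w_sum
        # The crossing is valid only if it happens before the next input spike.
        t_next = in_times[order[k + 1]] if k + 1 < len(order) else np.inf
        if in_times[idx] <= t_fire <= t_next:
            return t_fire
    return np.inf  # threshold never reached

# Toy usage: encode a few voxel intensities and drive one neuron.
spike_times = encode_spike_times(np.array([0.9, 0.2, 0.0, 0.6]))
print(nonleaky_if_spike_time(spike_times, np.array([0.8, 0.5, 0.3, 0.7])))
```

In such a scheme, sparsity (and hence energy) comes from the fact that neurons whose potential never reaches the threshold emit no spike at all, which is what makes the per-spike energy accounting described in the abstract meaningful.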
Source: IEEE Access (2020); preprint on arXiv (2019) under the arXiv Non-Exclusive Distribution license. Data source: Datacite. DOI: 10.1109/access.2020.2990416. Open access (green/gold). 34 citations; popularity, influence, and impulse in the top 10% (BIP! indicators).
Publication (Article / Preprint, 2025). Publisher: Institute for Operations Research and the Management Sciences (INFORMS). Authors: Jin Yang, Guangxin Jiang, Yinan Wang, Ying Chen.

Recent years have witnessed exponential growth in the development of deep learning models for time-series electricity forecasting in power systems. However, most of the proposed models are designed based on the designers' inherent knowledge and experience, without elaborating on the suitability of the proposed neural architectures. Moreover, these models cannot self-adjust to dynamically changing data patterns due to the inflexible design of their structures. Although several recent studies have applied the neural architecture search (NAS) technique to obtain networks with optimized structures for electricity forecasting, their training processes are computationally expensive and their search strategies are inflexible, indicating that NAS application in this area is still in its infancy. In this study, we propose an intelligent automated architecture search (IAAS) framework for the development of time-series electricity forecasting models. The proposed framework contains three primary components, namely a network function-preserving transformation operation, reinforcement learning-based network transformation control, and heuristic network screening, which together aim to improve the search quality of a network structure. After conducting comprehensive experiments on two publicly available electricity-load data sets and two wind-power data sets, we demonstrate that the proposed IAAS framework significantly outperforms the 10 existing models or methods in terms of forecasting accuracy and stability. Finally, we perform an ablation experiment to showcase the importance of critical components of the proposed IAAS framework in improving forecasting accuracy.

History: Accepted by Ram Ramesh, Area Editor for Data Science and Machine Learning. Funding: J. Yang, G. Jiang, and Y. Chen were supported by the National Natural Science Foundation of China [Grants 72293562, 72121001, 72101066, 72131005, 71801148, and 72171060]. Y. Chen was supported by the Heilongjiang Natural Science Excellent Youth Fund [YQ2022G004]. Supplemental Material: The software (Yang et al. 2023) that supports the findings of this study is available within the paper and its Supplemental Information (https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2023.0034), as well as from the IJOC GitHub software repository (https://github.com/INFORMSJoC/2023.0034). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/.
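The first of the three components, the network function-preserving transformation, is in the spirit of Net2Net-style operators that grow a network while leaving its input-output mapping unchanged, so that search can continue from the enlarged network without retraining from scratch. The sketch below is a hypothetical, minimal example of such an operator (a Net2Wider-style widening of a single dense layer in NumPy); it illustrates the general idea only and is not the specific transformation, reinforcement-learning controller, or screening heuristic used in the IAAS framework.

```python
import numpy as np

# Hypothetical sketch of a function-preserving widening of one dense layer.

def widen_dense_layer(W1, b1, W2, new_width, rng=None):
    """Widen a hidden layer from W1.shape[1] to new_width units.

    W1: (d_in, d_hidden) weights into the hidden layer, b1: (d_hidden,) bias,
    W2: (d_hidden, d_out) weights out of the hidden layer. Duplicated units are
    chosen at random; their outgoing weights are divided by the duplication
    count so the widened layer computes exactly the same function.
    """
    rng = rng or np.random.default_rng(0)
    d_hidden = W1.shape[1]
    assert new_width >= d_hidden
    # Map each new unit to the original unit it copies.
    mapping = np.concatenate([np.arange(d_hidden),
                              rng.integers(0, d_hidden, new_width - d_hidden)])
    counts = np.bincount(mapping, minlength=d_hidden)
    W1_new = W1[:, mapping]
    b1_new = b1[mapping]
    W2_new = W2[mapping, :] / counts[mapping][:, None]
    return W1_new, b1_new, W2_new

# Check that the widened two-layer ReLU network computes the same function.
rng = np.random.default_rng(42)
W1, b1, W2 = rng.normal(size=(8, 4)), rng.normal(size=4), rng.normal(size=(4, 3))
W1n, b1n, W2n = widen_dense_layer(W1, b1, W2, new_width=6, rng=rng)
x = rng.normal(size=(5, 8))
y_old = np.maximum(x @ W1 + b1, 0) @ W2
y_new = np.maximum(x @ W1n + b1n, 0) @ W2n
print(np.allclose(y_old, y_new))  # True: the transformation preserves the function
```

Because the transformation preserves the current function, a search controller can propose it freely and evaluate the widened candidate after only a short fine-tuning step, which is what makes transformation-based architecture search cheaper than training each candidate from scratch.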
Source: INFORMS (2025); preprint on arXiv (2022) under the arXiv Non-Exclusive Distribution license. Data source: Datacite. DOI: 10.1287/ijoc.2023.0034. 0 citations; average popularity, influence, and impulse (BIP! indicators).