2020 Research Outputs (50 / 270)
WoS | SCOPUS | Document Type | Document Title | Abstract | Authors | Affiliation | ResearcherID (WoS) | AuthorsID (SCOPUS) | Author Email(s) | Journal Name | JCR Abbreviation | ISSN | eISSN | Volume | Issue | WoS Edition | WoS Category | JCR Year | IF | JCR (%) | FWCI | FWCI Update Date | WoS Citation | SCOPUS Citation | Keywords (WoS) | KeywordsPlus (WoS) | Keywords (SCOPUS) | KeywordsPlus (SCOPUS) | Language | Publication Stage | Publication Year | Publication Date | DOI | JCR Link | DOI Link | WOS Link | SCOPUS Link |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
○ | ○ | Article | Validation of RELAP5 MOD3.3 code for Hybrid-SIT against SET and IET experimental data | We validated the performance of RELAP MOD3.3 code regarding the hybrid SIT with available experimental data. The concept of the hybrid SIT is to connect the pressurizer to SIT to utilize the water inside SIT in the case of SBO or SB-LOCA combined with TLOFW. We investigated how well RELAP5 code predicts the physical phenomena in terms of the equilibrium time, stratification, condensation against Separate Effect Test (SET) data. We also conducted the validation of RELAP5 code against Integrated Effect Test (IET) experimental data produced by the ATLAS facility. We followed conventional approach for code validation of IET data, which are pre-test and post-test calculation. RELAP5 code shows substantial difference with changing number of nodes. The increase of the number of nodes tends to reduce the condensation rate at the interface between liquid and vapor inside the hybrid SIT. The environmental heat loss also contributes to the large discrepancy between the simulation results of RELAP5 and the experimental data. (c) 2020 Korean Nuclear Society, Published by Elsevier Korea LLC. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). | Yoon, Ho Joon; Al Naqbi, Waleed; Al-Yahia, Omar S.; Jo, Daeseong | Khalifa Univ Sci & Technol KUST, Dept Nucl Engn, Abu Dhabi, U Arab Emirates; Emirates Nucl Technol Ctr, POB 127788, Abu Dhabi, U Arab Emirates; Kyungpook Natl Univ, Sch Mech Engn, 80 Daehak Ro, Daegu 41566, South Korea | Al-yahia, Omar/AAH-8536-2019 | 55221657300; 57215220807; 55788375900; 16424303000 | hojoon.yoon@ku.ac.ae; | NUCLEAR ENGINEERING AND TECHNOLOGY | NUCL ENG TECHNOL | 1738-5733 | 52 | 9 | SCIE | NUCLEAR SCIENCE & TECHNOLOGY | 2020 | 2.341 | 13.2 | 0.42 | 2025-06-25 | 5 | 5 | RELAP5 MOD3.3; Hybrid SIT; Separate effect test; Integrated effect test; ATLAS; Condensation; Stratification | ATLAS; Condensation; Hybrid SIT; Integrated effect test; RELAP5 MOD3.3; Separate effect test; Stratification | English | 2020 | 2020-09 | 10.1016/j.net.2020.02.007 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||||
○ | ○ | Article | 3D Point Cloud and BIM-Based Reconstruction for Evaluation of Project by As-Planned and As-Built | Progress management of a construction project can detect changes early by visualizing the progress of the project, as it is important to be capable of predicting the success or failure of future project objectives. However, to perform reliable progress management tasks, accurate measurement data is required. In this study, the basic principle of the evaluation of project progress was performed through the 3D point cloud and the 4D attributes of BIM. The evaluation of project progress proposed in this study was based on as-built data to assess the progress of the project site. The specific improvements via the proposed process for this study in the construction project-progress control area were as follows: (1) visualization of construction project progress, (2) calculation of project as-built quantity, and (3) evaluation of a project's progress. This study improved the efficiency and productivity in the management of a construction project through detection of the progress process. It provided easy monitoring of the overall project status, such as productivity analysis, progress rate and quality verifications, and easy identification of the problems created and foreseeable engineering tasks. | Kim, Seungho; Kim, Sangyong; Lee, Dong-Eun | Yeungnam Univ Coll, Dept Architecture, 170 Hyeonchung Ro, Daegu 42415, South Korea; Yeungnam Univ, Sch Architecture, 280 Daehak Ro, Gyongsan 38541, Gyeongbuk, South Korea; Kyungpook Natl Univ, Sch Architecture & Civil Engn, Daehak Ro 80, Daegu 41566, South Korea | 57191481643; 55498494400; 56605563300 | kimseungho@ync.ac.kr;sangyong@yu.ac.kr;dolee@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 9 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 1.14 | 2025-06-25 | 23 | 29 | 3D point cloud; reconstruction; as-planned; as-built; progress management | SCAN-TO-BIM; BUILDING INFORMATION; PROGRESS; MANAGEMENT; OPERATION; SYSTEM | 3D point cloud; As-built; As-planned; Progress management; Reconstruction | Architectural design; Image reconstruction; Productivity; Quality control; Accurate measurement; Basic principles; Construction projects; Engineering tasks; Productivity analysis; Progress managements; Project objectives; Quality verification; Project management | English | 2020 | 2020-05 | 10.3390/rs12091457 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | |||
○ | ○ | Article | A Case Study on Microphysical Characteristics of Mesoscale Convective System Using Generalized DSD Parameters Retrieved from Dual-Polarimetric Radar Observations | The microphysical characteristics of a mesoscale convective system (MCS) during a summer monsoon of South Korea are investigated using the generalized drop size distributions (DSD) that are derived from S-band dual-polarization radar data. The characteristics parameters of generalized DSDs (generalized number concentration, N-0 ' and generalized mean diameter, D-m) are directly calculated from DSD's two moments without any assumption on the DSD model. Relationships between Z(DR) and generalized DSD parameters normalized by Z(H) are derived in the form of the polynomial equation. Verification of the retrieved DSD parameters is conducted with the 2-D video disdrometer (2DVD) located about 23 km from the radar. The standard deviations (SD) of retrieved DSD parameters are about 0.26 for log N-0 ', and about 0.11 for D-m because of the variability of DSDs. The SD of the retrieved log N-0 ' from the dual-polarimetric measurement reaches to about 0.46 (almost double) for 11 rain events while the accuracy of retrieved D-m is quite higher (similar to 0.19). This higher error in retrieved log N-0 ' is likely attributed to the larger discrepancy in radar-observed and DSD-calculated Z(DR) when Z(H) is low. This retrieval technique is applied to a mesoscale convective system (MCS) case to investigate the Lagrangian characteristics of the microphysical process. The MCS is classified into the leading edge and trailing stratiform region by using the storm classification algorithm. The leading edge dominated by strong updraft showed the broad DSD spectra with a steady temporal increase of D-m throughout the event, likely because of the dominant drop growth by the collision-coalescence process. On the other hand, the drop growth is less significant in the trailing stratiform region as shown by the nearly constant D-m for the entire period. The DSD variation is also controlled by the new generation of drops in the leading edge and less extent in the trailing stratiform during the early period when precipitation systems grow. When the system weakens, the characteristic number concentration decreases with time, indicating the new generation of drops becomes less significant in both regions. | Kwon, Soohyun; Jung, Sung-Hwa; Lee, GyuWon | Korea Meteorol Adm, Weather Radar Ctr, Seoul 07062, South Korea; Kyungpook Natl Univ, Ctr Atmospher Remote Sensing CARE, Dept Astron & Atmospher Sci, Daegu 41566, South Korea | 56555682900; 55837204300; 7404852271 | soohyun03@korea.kr;shjung95@korea.kr;gyuwon@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 11 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 0.29 | 2025-06-25 | 8 | 8 | drop size distribution; S-band dual-polarization radar; microphysics; DSD retrieval; mesoscale convective system | RAINDROP SIZE DISTRIBUTION; VIDEO DISDROMETER; SQUALL LINE; SHAPE; MODEL; IDENTIFICATION; PRECIPITATION; REMOVAL; ECHOES | Drop size distribution; DSD retrieval; Mesoscale convective system; Microphysics; S-band dual-polarization radar | Drops; Polarimeters; Polynomials; Radar; Storms; Characteristics parameters; Drop size distribution; Dual polarization radars; Mesoscale Convective System; Microphysical process; Polarimetric measurements; Polarimetric radar observation; Precipitation systems; Distributed database systems | English | 2020 | 2020-06 | 10.3390/rs12111812 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | |||
○ | ○ | Article | A Double Epipolar Resampling Approach to Reliable Conjugate Point Extraction for Accurate Kompsat-3/3A Stereo Data Processing | Kompsat-3/3A provides along-track and across-track stereo data for accurate three-dimensional (3D) topographic mapping. Stereo data preprocessing involves conjugate point extraction and acquisition of ground control points (GCPs), rational polynomial coefficient (RPC) bias compensation, and epipolar image resampling. Applications where absolute positional accuracy is not a top priority do not require GCPs, but require precise conjugate points from stereo images for subsequent RPC bias compensation, i.e., relative orientation. Conjugate points are extracted between the original stereo data using image-matching methods by a proper outlier removal process. Inaccurate matching results and potential outliers produce geometric inconsistency in the stereo data. Hence, the reliability of conjugate point extraction must be improved. For this purpose, we proposed to apply the coarse epipolar resampling using raw RPCs before the conjugate point matching. We expect epipolar images with even inaccurate RPCs to show better stereo similarity than the original images, providing better conjugate point extraction. To this end, we carried out the quantitative analysis of the conjugate point extraction performance by comparing the proposed approach using the coarsely epipolar resampled images to the traditional approach using the original stereo images. We tested along-track Kompsat-3 stereo and across-track Kompsat-3A stereo data with four well-known image-matching methods: phase correlation (PC), mutual information (MI), speeded up robust features (SURF), and Harris detector combined with fast retina keypoint (FREAK) descriptor (i.e., Harris). These matching methods were applied to the original stereo images and coarsely resampled epipolar images, and the conjugate point extraction performance was investigated. Experimental results showed that the coarse epipolar image approach was very helpful for accurate conjugate point extraction, realizing highly accurate RPC refinement and sub-pixel y-parallax through fine epipolar image resampling, which was not achievable through the traditional approach. MI and PC provided the most stable results for both along-track and across-track test data with larger patch sizes of more than 400 pixels. | Oh, Jaehong; Han, Youkyung | Korea Maritime & Ocean Univ, Dept Civil Engn, Busan 49112, South Korea; Kyungpook Natl Univ, Sch Convergence & Fus Syst Engn, Sangju 37224, South Korea | 36140723100; 55457676600 | jhoh@kmou.ac.kr;han602@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 18 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 1.02 | 2025-06-25 | 12 | 15 | stereo; RPCs; epipolar; conjugate point; image matching; relative orientation | REGISTRATION; ORIENTATION; FEATURES; IMAGERY; SIFT | Conjugate point; Epipolar; Image matching; Relative orientation; RPCs; Stereo | Data handling; Data mining; Extraction; Geometrical optics; Image coding; Image matching; Pixels; Rock mechanics; Statistics; Ground control points; Rational polynomial coefficients; Re-sampling approach; Relative orientation; Speeded up robust features; Threedimensional (3-d); Topographic mapping; Traditional approaches; Stereo image processing | English | 2020 | 2020-09 | 10.3390/rs12182940 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | |||
○ | ○ | Article | A Framework for Unsupervised Wildfire Damage Assessment Using VHR Satellite Images with PlanetScope Data | The application of remote sensing techniques for disaster management often requires rapid damage assessment to support decision-making for post-treatment activities. As the on-demand acquisition of pre-event very high-resolution (VHR) images is typically limited, PlanetScope (PS) offers daily images of global coverage, thereby providing favorable opportunities to obtain high-resolution pre-event images. In this study, we propose an unsupervised change detection framework that uses post-fire VHR images with pre-fire PS data to facilitate the assessment of wildfire damage. To minimize the time and cost of human intervention, the entire process was executed in an unsupervised manner from image selection to change detection. First, to select clear pre-fire PS images, a blur kernel was adopted for the blind and automatic evaluation of local image quality. Subsequently, pseudo-training data were automatically generated from contextual features regardless of the statistical distribution of the data, whereas spectral and textural features were employed in the change detection procedure to fully exploit the properties of different features. The proposed method was validated in a case study of the 2019 Gangwon wildfire in South Korea, using post-fire GeoEye-1 (GE-1) and pre-fire PS images. The experimental results verified the effectiveness of the proposed change detection method, achieving an overall accuracy of over 99% with low false alarm rate (FAR), which is comparable to the accuracy level of the supervised approach. The proposed unsupervised framework accomplished efficient wildfire damage assessment without any prior information by utilizing the multiple features from multi-sensor bi-temporal images. | Chung, Minkyung; Han, Youkyung; Kim, Yongil | Seoul Natl Univ, Dept Civil & Environm Engn, 1 Gwanak Ro, Seoul 08826, South Korea; Kyungpook Natl Univ, Sch Convergence & Fus Syst Engn, Sangju 37224, South Korea | 57214257658; 55457676600; 7410213546 | mkjung4876@snu.ac.kr;han602@knu.ac.kr;yik@snu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 22 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 0.51 | 2025-06-25 | 10 | 12 | wildfire damage assessment; very high resolution (VHR); image quality assessment; unsupervised change detection; multi-sensor image application | BURN SEVERITY; LANDSAT TM; TOPOGRAPHIC NORMALIZATION; CONTEXTUAL INFORMATION; COVER CLASSIFICATION; TEXTURAL FEATURES; AREA | Image quality assessment; Multi-sensor image application; Unsupervised change detection; Very high resolution (VHR); Wildfire damage assessment | Decision making; Disaster prevention; Disasters; Fires; Remote sensing; Automatic evaluation; Automatically generated; Disaster management; Human intervention; Remote sensing techniques; Statistical distribution; Unsupervised change detection; Very high resolution (VHR) image; Damage detection | English | 2020 | 2020-11 | 10.3390/rs12223835 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | |||
○ | ○ | Article | A Hybrid Spatio-Temporal Prediction Model for Solar Photovoltaic Generation Using Numerical Weather Data and Satellite Images | Precise and accurate prediction of solar photovoltaic (PV) generation plays a major role in developing plans for the supply and demand of power grid systems. Most previous studies on the prediction of solar PV generation employed only weather data composed of numerical text data. The numerical text weather data can reflect temporal factors, however, they cannot consider the movement features related to the wind direction of the spatial characteristics, which include the amount of both clouds and particulate matter (PM) among other weather features. This study aims developing a hybrid spatio-temporal prediction model by combining general weather data and data extracted from satellite images having spatial characteristics. A model for hourly prediction of solar PV generation is proposed using data collected from a solar PV power plant in Incheon, South Korea. To evaluate the performance of the prediction model, we compared and performed ARIMAX analysis, which is a traditional statistical time-series analysis method, and SVR, ANN, and DNN, which are based on machine learning algorithms. The models that reflect the temporal and spatial characteristics exhibited better performance than those using only the general weather numerical data or the satellite image data. | Kim, Bowoo; Suh, Dongjun | Kyungpook Natl Univ, Dept Convergence & Fus Syst Engn, Sangju 37224, South Korea | 57219947521; 36613529600 | kbw5913@knu.ac.kr;dongjunsuh@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 22 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 0.65 | 2025-06-25 | 13 | 18 | solar PV generation; spatio-temporal; prediction; ARIMAX; SVR; ANN; DNN; satellite image | DUST | ANN; ARIMAX; DNN; Prediction; Satellite image; Solar PV generation; Spatio-temporal; SVR | Economics; Electric power transmission networks; Learning algorithms; Machine learning; Meteorology; Photovoltaic cells; Predictive analytics; Satellites; Solar power plants; Time series analysis; Weather forecasting; Accurate prediction; Particulate Matter; Satellite image datas; Solar photovoltaic generations; Solar photovoltaics; Spatial characteristics; Spatio-temporal prediction; Temporal and spatial; Solar power generation | English | 2020 | 2020-11 | 10.3390/rs12223706 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | |||
○ | ○ | Article | Application of Convolutional Neural Network for Spatiotemporal Bias Correction of Daily Satellite-Based Precipitation | Spatiotemporal precipitation data is one of the essential components in modeling hydrological problems. Although the estimation of these data has achieved remarkable accuracy owning to the recent advances in remote-sensing technology, gaps remain between satellite-based precipitation and observed data due to the dependence of precipitation on the spatiotemporal distribution and the specific characteristics of the area. This paper presents an efficient approach based on a combination of the convolutional neural network and the autoencoder architecture, called the convolutional autoencoder (ConvAE) neural network, to correct the pixel-by-pixel bias for satellite-based products. The two daily gridded precipitation datasets with a spatial resolution of 0.25 degrees employed are Asian Precipitation-Highly Resolved Observational Data Integration towards Evaluation (APHRODITE) as the observed data and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR) as the satellite-based data. Furthermore, the Mekong River basin was selected as a case study, because it is one of the largest river basins, spanning six countries, most of which are developing countries. In addition to the ConvAE model, another bias correction method based on the standard deviation method was also introduced. The performance of the bias correction methods was evaluated in terms of the probability distribution, temporal correlation, and spatial correlation of precipitation. Compared with the standard deviation method, the ConvAE model demonstrated superior and stable performance in most comparisons conducted. Additionally, the ConvAE model also exhibited impressive performance in capturing extreme rainfall events, distribution trends, and described spatial relationships between adjacent grid cells well. The findings of this study highlight the potential of the ConvAE model to resolve the precipitation bias correction problem. Thus, the ConvAE model could be applied to other satellite-based products, higher-resolution precipitation data, or other issues related to gridded data. | Le, Xuan-Hien; Lee, Giha; Jung, Kwansue; An, Hyun-uk; Lee, Seungsoo; Jung, Younghun | Kyungpook Natl Univ, Dept Disaster Prevent & Environm Engn, 2559 Gyeongsang Daero, Sangju 37224, South Korea; Thuyloi Univ, Fac Water Resources Engn, 175 Tay Son, Hanoi 10000, Vietnam; Chungnam Natl Univ, Dept Civil Engn, Daejeon 34134, South Korea; Chungnam Natl Univ, Dept Agr & Rural Engn, Daejeon 34134, South Korea; Korea Environm Inst KEI, Dept Integrated Water Management, 370 Sicheong Daero,Bldg B 819, Sejong 30147, South Korea | Le, Xuan-Hien/AAZ-9166-2021 | 57209735659; 35069799400; 37015343400; 36639175600; 55583318300; 55195880200 | hienlx@knu.ac.kr;leegiha@knu.ac.kr;ksjung@cnu.ac.kr;hyunuk@cnu.ac.kr;seungsoo@kei.re.kr;y.jung@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 17 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 2.47 | 2025-06-25 | 43 | 48 | precipitation bias correction; APHRODITE; PERSIANN-CDR; Mekong River basin; convolutional neural network (CNN); convolutional autoencoder (ConvAE) | DAILY RAINFALL; DENSE NETWORK; DATASET; MODEL | APHRODITE; convolutional autoencoder (ConvAE); Convolutional neural network (CNN); Mekong River basin; PERSIANN-CDR; Precipitation bias correction | Convolution; Data integration; Developing countries; Learning systems; Pixels; Precipitation (meteorology); Probability distributions; Remote sensing; Satellites; Statistics; Watersheds; Bias-correction methods; Precipitation estimation from remotely sensed information; Remote sensing technology; Spatial correlations; Spatial relationships; Spatiotemporal distributions; Standard deviation method; Temporal correlations; Convolutional neural networks | English | 2020 | 2020-09 | 10.3390/rs12172731 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Article | BIM-Based Registration and Localization of 3D Point Clouds of Indoor Scenes Using Geometric Features for Augmented Reality | Augmented reality can improve construction and facility management by visualizing an as-planned model on its corresponding surface for fast, easy, and correct information retrieval. This requires the localization registration of an as-built model in an as-planned model. However, the localization and registration of indoor environments fail, owing to self-similarity in an indoor environment, relatively large as-planned models, and the presence of additional unplanned objects. Therefore, this paper proposes a computer vision-based method to (1) homogenize indoor as-planned and as-built models, (2) reduce the search space of model matching, and (3) localize the structure (e.g., room) for registration of the scanned area in its as-planned model. This method extracts a representative horizontal cross section from the as-built and as-planned point clouds to make these models similar, restricts unnecessary transformation to reduce the search space, and corresponds the line features for the estimation of the registration transformation matrix. The performance of this method, in terms of registration accuracy, is evaluated on as-built point clouds of rooms and a hallway on a building floor. A rotational error of 0.005 rad and a translational error of 0.088 m are observed in the experiments. Hence, the geometric feature described on a representative cross section with transformation restrictions can be a computationally cost-effective solution for indoor localization and registration. | Mahmood, Bilawal; Han, SangUk; Lee, Dong-Eun | Hanyang Univ, Dept Civil & Environm Engn, 222 Wangsimni Ro, Seoul 04763, South Korea; KyungPook Natl Univ, Sch Architectural Civil Environm & Energy Engn, 1370 Sangyegk Dong, Daegu 702701, South Korea | Han, SangUk/JNS-8543-2023; mahmood, bilawal/L-8331-2017 | 57215462118; 55487857100; 56605563300 | bilawal@hanyang.ac.kr;sanguk@hanyang.ac.kr;dolee@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 14 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 2.61 | 2025-06-25 | 44 | 49 | augmented reality; localization; registration; indoor point cloud | TRACKING SYSTEM; CONGRUENT SETS; FRAMEWORK | Augmented reality; Indoor point cloud; Localization; Registration | Architectural design; Augmented reality; Cost effectiveness; Linear transformations; Mathematical transformations; Office buildings; Search engines; Cost-effective solutions; Facility management; Horizontal cross section; Indoor environment; Indoor localization; Registration accuracy; Registration transformation; Vision-based methods; Indoor positioning systems | English | 2020 | 2020-07 | 10.3390/rs12142302 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Article | Constrained Linear Deconvolution of GRACE Anomalies to Correct Spatial Leakage | Time-varying gravity observed by the Gravity Recovery and Climate Experiment (GRACE) satellites measures surface water and ice mass redistribution driven by weather and climate forcing and has emerged as one of the most important data types in measuring changes in Earth's climate. However, spatial leakage of GRACE signals, especially in coastal areas, has been a recognized limitation in quantitatively assessing mass change. It is evident that larger terrestrial signals in coastal regions spread into the oceans and vice versa and various remedies have been developed to address this problem. An especially successful one has been Forward Modeling but it requires knowledge of geographical locations of mass change to be fully effective. In this study, we develop a new method to suppress leakage effects using a linear least squares operator applied to GRACE spherical harmonic data. The method is effectively a constrained deconvolution of smoothing inherent in GRACE data. It assumes that oceanic mass changes near the coast are negligible compared to terrestrial changes, with additional spatial regularization constraints. Some calibration of constraint weighting is required. We apply the method to estimate surface mass loads over Australia using both synthetic and real GRACE data. Leakage into the oceans is effectively suppressed and when compared with mascon solutions there is better performance over interior basins. | Seo, Ki-Weon; Oh, Seokhoon; Eom, Jooyoung; Chen, Jianli; Wilson, Clark R. | Seoul Natl Univ, Dept Earth Sci Educ, Seoul 08826, South Korea; Kangwon Natl Univ, Dept Energy & Resources Engn, Kangwon 24341, South Korea; Kyungpook Natl Univ, Dept Earth Sci Educ, Daegu 41566, South Korea; Univ Texas Austin, Ctr Space Res, Austin, TX 78759 USA; Univ Texas Austin, Jackson Sch Geosci, Dept Geol Sci, Austin, TX 78712 USA | Seo, Ki-weon/AAH-7729-2021; Eom, Jooyoung/KBC-4439-2024; Chen, Jianli/KUD-8259-2024 | 8407160800; 16553124600; 36645970800; 57205523218; 7404896041 | seokiweon@snu.ac.kr;gimul@kangwon.ac.kr;eomjy@knu.ac.kr;chen@csr.utexas.edu;crwilson@jsg.utexas.edu; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 11 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 0.44 | 2025-06-25 | 8 | 8 | GRACE; leakage error; inversion; TWS | INVERSION; GREENLAND | GRACE; Inversion; Leakage error; TWS | Coastal zones; Gravitation; Least squares approximations; Surface waters; Weather satellites; Constrained deconvolution; Constraint weighting; Geographical locations; Gravity recovery and climate experiment satellites; Linear least squares; Spatial regularizations; Spherical harmonics; Surface mass loads; Geodetic satellites | English | 2020 | 2020-06 | 10.3390/rs12111798 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Article | Deep Learning-Based Drivers Emotion Classification System in Time Series Data for Remote Applications | Aggressive driving emotions is indeed one of the major causes for traffic accidents throughout the world. Real-time classification in time series data of abnormal and normal driving is a keystone to avoiding road accidents. Existing work on driving behaviors in time series data have some limitations and discomforts for the users that need to be addressed. We proposed a multimodal based method to remotely detect driver aggressiveness in order to deal these issues. The proposed method is based on change in gaze and facial emotions of drivers while driving using near-infrared (NIR) camera sensors and an illuminator installed in vehicle. Driver's aggressive and normal time series data are collected while playing car racing and truck driving computer games, respectively, while using driving game simulator. Dlib program is used to obtain driver's image data to extract face, left and right eye images for finding change in gaze based on convolutional neural network (CNN). Similarly, facial emotions that are based on CNN are also obtained through lips, left and right eye images extracted from Dlib program. Finally, the score level fusion is applied to scores that were obtained from change in gaze and facial emotions to classify aggressive and normal driving. The proposed method accuracy is measured through experiments while using a self-constructed large-scale testing database that shows the classification accuracy of the driver's change in gaze and facial emotions for aggressive and normal driving is high, and the performance is superior to that of previous methods. | Naqvi, Rizwan Ali; Arsalan, Muhammad; Rehman, Abdul; Rehman, Ateeq Ur; Loh, Woong-Kee; Paul, Anand | Sejong Univ, Dept Unmanned Vehicle Engn, 209 Neungdong Ro, Seoul 05006, South Korea; Dongguk Univ, Div Elect & Elect Engn, 30 Pildong Ro 1 Gil, Seoul 100715, South Korea; Kyungpook Natl Univ, Dept Comp Sci & Engn, Daegu 41566, South Korea; Hohai Univ, Coll Internet Things Engn, Changzhou 213022, Peoples R China; Gachon Univ, Dept Software, 1342 Seongnamdaero, Seongnam 13120, Gyeonggi Do, South Korea | ; REHMAN, ATEEQ UR/AAI-6344-2020; Paul, Anand/V-6724-2017; Naqvi, Rizwan/AAW-9242-2020; Rehman, Abdul/D-5630-2019; Arsalan, Muhammad/JXQ-7133-2024 | 55975847900; 57203178070; 57200894071; 57210246601; 7102037423; 56650522400 | rizwanali@sejong.ac.kr;arsals@dongguk.edu;a.rehman.knu@knu.ac.kr;ateeq@hhu.edu.cn;wkloh2@gachon.ac.kr;anand@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 3 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 3.92 | 2025-06-25 | 55 | 68 | emotions sensing; aggressive driving; normal driving; time series data; change in gaze; facial emotions; gaze tracking; deep learning | TRACKING; EEG | Aggressive driving; Change in gaze; Deep learning; Emotions sensing; Facial emotions; Gaze tracking; Normal driving; Time series data | Accidents; Classification (of information); Computer games; Convolutional neural networks; Eye tracking; Image processing; Infrared devices; Time series; Aggressive driving; Change in gaze; Emotions sensing; Facial emotions; Gaze tracking; Normal driving; Time-series data; Deep learning | English | 2020 | 2020-02 | 10.3390/rs12030587 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Article | Object-Based Building Change Detection by Fusing Pixel-Level Change Detection Results Generated from Morphological Building Index | Change detection (CD) is an important tool in remote sensing. CD can be categorized into pixel-based change detection (PBCD) and object-based change detection (OBCD). PBCD is traditionally used because of its simple and straightforward algorithms. However, with increasing interest in very-high-resolution (VHR) imagery and determining changes in small and complex objects such as buildings or roads, traditional methods showed limitations, for example, the large number of false alarms or noise in the results. Thus, researchers have focused on extending PBCD to OBCD. In this study, we proposed a method for detecting the newly built-up areas by extending PBCD results into an OBCD result through the Dempster-Shafer (D-S) theory. To this end, the morphological building index (MBI) was used to extract built-up areas in multitemporal VHR imagery. Then, three PBCD algorithms, change vector analysis, principal component analysis, and iteratively reweighted multivariate alteration detection, were applied to the MBI images. For the final CD result, the three binary change images were fused with the segmented image using the D-S theory. The results obtained from the proposed method were compared with those of PBCD, OBCD, and OBCD results generated by fusing the three binary change images using the major voting technique. Based on the accuracy assessment, the proposed method produced the highest F1-score and kappa values compared with other CD results. The proposed method can be used for detecting new buildings in built-up areas as well as changes related to demolished buildings with a low rate of false alarms and missed detections compared with other existing CD methods. | Javed, Aisha; Jung, Sejung; Lee, Won Hee; Han, Youkyung | Kyungpook Natl Univ, Dept Convergence & Fus Syst Engn, Sangju 37224, South Korea; Kyungpook Natl Univ, Dept Spatial Informat, Daegu 41566, South Korea; Kyungpook Natl Univ, Sch Convergence & Fus Syst Engn, Sangju 37224, South Korea | ; Jung, Sejung/NRB-6938-2025; Javed, Aisha/LQK-3075-2024 | 57215897698; 57209137546; 57190774365; 55457676600 | javedaisha123@knu.ac.kr;renai1226@knu.ac.kr;wlee33@knu.ac.kr;han602@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 18 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 0.87 | 2025-06-25 | 23 | 23 | pixel-based changed detection (PBCD); object-based change detection (OBCD); morphological building index (MBI); very-high resolution (VHR) images; segmentation; Dempster-Shafer (D-S) theory | UNSUPERVISED CHANGE DETECTION; REMOTELY-SENSED IMAGES; CHANGE VECTOR ANALYSIS; SENSING IMAGES; SEGMENTATION; FRAMEWORK; LAND | Dempster-Shafer (D-S) theory; Morphological building index (MBI); Object-based change detection (OBCD); Pixel-based changed detection (PBCD); Segmentation; Very-high resolution (VHR) images | Binary images; Buildings; Errors; Iterative methods; Pixels; Remote sensing; Accuracy assessment; Alteration detections; Building change detection; Change vector analysis; Demolished buildings; Number of false alarms; Object based change detections; Very high resolution; Object detection | English | 2020 | 2020-09 | 10.3390/rs12182952 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Article | Object-Based Change Detection of Very High Resolution Images by Fusing Pixel-Based Change Detection Results Using Weighted Dempster-Shafer Theory | Change detection (CD), one of the primary applications of multi-temporal satellite images, is the process of identifying changes in the Earth's surface occurring over a period of time using images of the same geographic area on different dates. CD is divided into pixel-based change detection (PBCD) and object-based change detection (OBCD). Although PBCD is more popular due to its simple algorithms and relatively easy quantitative analysis, applying this method in very high resolution (VHR) images often results in misdetection or noise. Because of this, researchers have focused on extending the PBCD results to the OBCD map in VHR images. In this paper, we present a proposed weighted Dempster-Shafer theory (wDST) fusion method to generate the OBCD by combining multiple PBCD results. The proposed wDST approach automatically calculates and assigns a certainty weight for each object of the PBCD result while considering the stability of the object. Moreover, the proposed wDST method can minimize the tendency of the number of changed objects to decrease or increase based on the ratio of changed pixels to the total pixels in the image when the PBCD result is extended to the OBCD result. First, we performed co-registration between the VHR multitemporal images to minimize the geometric dissimilarity. Then, we conducted the image segmentation of the co-registered pair of multitemporal VHR imagery. Three change intensity images were generated using change vector analysis (CVA), iteratively reweighted-multivariate alteration detection (IRMAD), and principal component analysis (PCA). These three intensity images were exploited to generate different binary PBCD maps, after which the maps were fused with the segmented image using the wDST to generate the OBCD map. Finally, the accuracy of the proposed CD technique was assessed by using a manually digitized map. Two VHR multitemporal datasets were used to test the proposed approach. Experimental results confirmed the superiority of the proposed method by comparing the existing PBCD methods and the OBCD method using the majority voting technique. | Han, Youkyung; Javed, Aisha; Jung, Sejung; Liu, Sicong | Kyungpook Natl Univ, Sch Convergence & Fus Syst Engn, Sangju 37224, South Korea; Kyungpook Natl Univ, Dept Geospatial Informat, Daegu 41566, South Korea; Tongji Univ, Coll Surveying & Geoinformat, Shanghai 200092, Peoples R China | Liu, Sicong/J-7094-2013; Javed, Aisha/LQK-3075-2024; Jung, Sejung/NRB-6938-2025 | 55457676600; 57215897698; 57209137546; 38662862600 | han602@knu.ac.kr;javedaisha123@knu.ac.kr;renai1226@knu.ac.kr;sicong.liu@tongji.edu.cn; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 6 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 1.89 | 2025-06-25 | 38 | 40 | weighted Dempster-Shafer Theory (wDST); very high resolution (VHR); Pixel-Based Change Detection (PBCD); object-based change detection (OBCD) | UNSUPERVISED CHANGE DETECTION; CHANGE VECTOR ANALYSIS; ALGORITHMS | Object-based change detection (OBCD); Pixel-based change detection (PBCD); Very high resolution (VHR); Weighted dempster-shafer theory (wDST) | Image segmentation; Iterative methods; Pixels; Principal component analysis; Probabilistic logics; Alteration detections; Change detection; Change vector analysis; Dempster-Shafer theory; Multi-temporal satellite images; Object based change detections; Very high resolution; Very high resolution (VHR) image; Object detection | English | 2020 | 2020-03 | 10.3390/rs12060983 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Article | Raindrop-Aware GAN: Unsupervised Learning for Raindrop-Contaminated Coastal Video Enhancement | We propose an unsupervised network with adversarial learning, the Raindrop-aware GAN, which enhances the quality of coastal video images contaminated by raindrops. Raindrop removal from coastal videos faces two main difficulties: converting the degraded image into a clean one by visually removing the raindrops, and restoring the background coastal wave information in the raindrop regions. The components of the proposed network-a generator and a discriminator for adversarial learning-are trained on unpaired images degraded by raindrops and clean images free from raindrops. By creating raindrop masks and background-restored images, the generator restores the background information in the raindrop regions alone, preserving the input as much as possible. The proposed network was trained and tested on an open-access dataset and directly collected dataset from the coastal area. It was then evaluated by three metrics: the peak signal-to-noise ratio, structural similarity, and a naturalness-quality evaluator. The indices of metrics are 8.2% (+2.012), 0.2% (+0.002), and 1.6% (-0.196) better than the state-of-the-art method, respectively. In the visual assessment of the enhanced video image quality, our method better restored the image patterns of steep wave crests and breaking than the other methods. In both quantitative and qualitative experiments, the proposed method more effectively removed the raindrops in coastal video and recovered the damaged background wave information than state-of-the-art methods. | Kim, Jinah; Huh, Dong; Kim, Taekyung; Kim, Jaeil; Yoo, Jeseon; Shim, Jae-Seol | Korea Inst Ocean Sci & Technol, 385 Haeyang Ro, Busan 49111, South Korea; Kyungpook Natl Univ, 80 Daehak Ro, Daegu 41566, South Korea | Yoo, Jeseon/AAV-6568-2021 | 55720345100; 57208129293; 59471665700; 57211615348; 16417848400; 7201856370 | jakim@kiost.ac.kr;herick@knu.ac.kr;paperrune@knu.ac.kr;jaeilkim@knu.ac.kr;jyoo@kiost.ac.kr;jsshim@kiost.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 20 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 0.15 | 2025-06-25 | 2 | 2 | coastal video enhancement; raindrop removal; background information recovery; generative adversarial network; unsupervised learning | REMOVAL; NETWORK | Background information recovery; Coastal video enhancement; Generative adversarial network; Raindrop removal; Unsupervised learning | Drops; Image enhancement; Image quality; Image segmentation; Quality control; Restoration; Signal to noise ratio; Adversarial learning; Background information; Peak signal to noise ratio; Qualitative experiments; State-of-the-art methods; Structural similarity; Unsupervised network; Video image qualities; Image reconstruction | English | 2020 | 2020-10 | 10.3390/rs12203461 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Review | Review: Cost-Effective Unmanned Aerial Vehicle (UAV) Platform for Field Plant Breeding Application | Utilization of remote sensing is a new wave of modern agriculture that accelerates plant breeding and research, and the performance of farming practices and farm management. High-throughput phenotyping is a key advanced agricultural technology and has been rapidly adopted in plant research. However, technology adoption is not easy due to cost limitations in academia. This article reviews various commercial unmanned aerial vehicle (UAV) platforms as a high-throughput phenotyping technology for plant breeding. It compares known commercial UAV platforms that are cost-effective and manageable in field settings and demonstrates a general workflow for high-throughput phenotyping, including data analysis. The authors expect this article to create opportunities for academics to access new technologies and utilize the information for their research and breeding programs in more workable ways. | Jang, GyuJin; Kim, Jaeyoung; Yu, Ju-Kyung; Kim, Hak-Jin; Kim, Yoonha; Kim, Dong-Wook; Kim, Kyung-Hwan; Lee, Chang Woo; Chung, Yong Suk | Seoul Natl Univ, Coll Agr & Life Sci, Dept Biosyst & Biomat Sci & Engn, Seoul 08826, South Korea; Jeju Natl Univ, Dept Plant Resources & Environm, Jeju 63243, South Korea; Syngenta Crop Protect LLC, Seeds Res, Res Triangle Pk, NC 27703 USA; Kyungpook Natl Univ, Sch Appl Biosci, Plant Biosci, Daegu 41566, South Korea; RDA, Natl Inst Agr Sci, Jeonju 54874, South Korea; Kunsan Natl Univ, Sch Comp Informat & Commun Engn, Kunsan 54150, South Korea | Kim, Kyung/J-5382-2012; Chung, Yong/V-6909-2019; LEE, CHANG/AAK-7567-2020; Yu, Ju-Kyung/HCH-4393-2022 | 57215897437; 58060905400; 7405526081; 57191721745; 57224866763; 59617999600; 57202965738; 55700506600; 36983850100 | wkd8025@snu.ac.kr;baron7798@jejunu.ac.kr;ju-kyung.yu@syngenta.com;kimhj69@snu.ac.kr;kyh1229@knu.ac.kr;dwk8033@snu.ac.kr;biopiakim@korea.kr;yschung@jejunu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 6 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 1.6 | 2025-06-25 | 70 | 77 | high-throughput phenotyping; remote sensing; commercial unmanned aerial vehicle (UAV) platform | CROP SURFACE MODELS; LEAF CHLOROPHYLL CONTENT; VEGETATION INDEXES; AREA INDEX; STOMATAL CONDUCTANCE; SPECTRAL REFLECTANCE; WATER-STRESS; BIOMASS; SENSORS; IMAGES | Commercial unmanned aerial vehicle (UAV) platform; High-throughput phenotyping; Remote sensing | Agriculture; Antennas; Commercial vehicles; Cost benefit analysis; Cost effectiveness; Remote sensing; Technology transfer; Agricultural technologies; Breeding program; Commercial unmanned aerial vehicle (UAV) platform; Farm management; Farming practices; High-throughput phenotyping; Modern agricultures; Technology adoption; Unmanned aerial vehicles (UAV) | English | 2020 | 2020-03 | 10.3390/rs12060998 | 바로가기 | 바로가기 | 바로가기 | 바로가기 | ||
○ | ○ | Article | Soil Moisture-Vegetation-Carbon Flux Relationship under Agricultural Drought Condition using Optical Multispectral Sensor | Agricultural drought is triggered by a depletion of moisture content in the soil, which hinders photosynthesis and thus increases carbon dioxide (CO2) concentrations in the atmosphere. The aim of this study is to analyze the relationship between soil moisture (SM) and vegetation activity toward quantifying CO2 concentration in the atmosphere. To this end, the MODerate resolution imaging spectroradiometer (MODIS), an optical multispectral sensor, was used to evaluate two regions in South Korea for validation. Vegetation activity was analyzed through MOD13A1 vegetation indices products, and MODIS gross primary productivity (GPP) product was used to calculate the CO2 flux based on its relationship with respiration. In the case of SM, it was calculated through the method of applying apparent thermal inertia (ATI) in combination with land surface temperature and albedo. To validate the SM and CO2 flux, flux tower data was used which are the observed measurement values for the extreme drought period of 2014 and 2015 in South Korea. These two variables were analyzed for temporal variation on flux tower data as daily time scale, and the relationship with vegetation index (VI) was synthesized and analyzed on a monthly scale. The highest correlation between SM and VI (correlation coefficient (r) = 0.82) was observed at a time lag of one month, and that between VI and CO2 (r = 0.81) at half month. This regional study suggests a potential capability of MODIS-based SM, VI, and CO2 flux, which can be applied to an assessment of the global view of the agricultural drought by using available satellite remote sensing products. | Sur, Chanyang; Kang, Do-Hyuk; Lim, Kyoung Jae; Yang, Jae E.; Shin, Yongchul; Jung, Younghun | Univ Maryland, Earth Syst Sci Interdisciplinary Ctr, College Pk, MD 20740 USA; NASA, Goddard Space Flight Ctr, Biospher Sci Lab, Code 916, Greenbelt, MD 20771 USA; NASA, Goddard Space Flight Ctr, Hydrol Sci Branch, Greenbelt, MD 20771 USA; Kangwon Natl Univ, Dept Reg Infrastruct Engn, Chunchon 24341, South Korea; Kangwon Natl Univ, Dept Biol Environm, Chunchon 24341, South Korea; Kyungpook Natl Univ, Sch Agr Civil & Bioind Engn, Daegu 41566, South Korea; Kyungpook Natl Univ, Dept Construct & Disaster Prevent Engn, Sangju 37224, South Korea | 55769196200; 54399809700; 35176071700; 15039618700; 55659438100; 55195880200 | cysurr@umd.edu;dk.kang@nasa.gov;kjlim@kangwon.ac.kr;yangjay@kangwon.ac.kr;ycshin@knu.ac.kr;y.jung@knu.ac.kr; | REMOTE SENSING | REMOTE SENS-BASEL | 2072-4292 | 12 | 9 | SCIE | ENVIRONMENTAL SCIENCES;GEOSCIENCES, MULTIDISCIPLINARY;IMAGING SCIENCE & PHOTOGRAPHIC TECHNOLOGY;REMOTE SENSING | 2020 | 4.848 | 13.3 | 0.36 | 2025-06-25 | 10 | 9 | agricultural drought; soil moisture; vegetation activity; carbon dioxide flux; remote sensing; optical multispectral sensor | APPARENT THERMAL INERTIA; INDEXES; PHOTOSYNTHESIS; PRODUCTIVITY; RETRIEVAL; PATTERNS; IMPACTS; MODELS; SMOS; NDVI | Agricultural drought; Carbon dioxide flux; Optical multispectral sensor; Remote sensing; Soil moisture; Vegetation activity | Agricultural robots; Drought; Forestry; Land surface temperature; Photosynthesis; Radiometers; Remote sensing; Soil moisture; Vegetation; Agricultural drought; Correlation coefficient; Gross primary productivity; Moderate resolution imaging spectroradiometer; Multispectral sensors; Potential capability; Satellite remote sensing; Vegetation activity; Carbon dioxide | English | 2020 | 2020-05 | 10.3390/rs12091359 | 바로가기 | 바로가기 | 바로가기 | 바로가기