
Issue Articles

Table of issue articles, listing title, author(s), pages, full text, and table of contents.
Title Author(s) Pages Full text Contents
운영체제 지원 기반의 GlibC Memory Allocator에 대한 Use-After-Free 공격 방지 기법 = Operating system support-based prevention mechanism for use-after-free attacks on the glibc memory allocator 박찬영, 이재휴, 김대연, 문현곤 p. 541-549
분산학습 클러스터의 동적 스케일링 중 발생하는 학습 중단 원인 분석과 이의 완화 기법 = Analysis and mitigation of training suspension during dynamic scaling of a distributed machine learning cluster 임영훈, 유준열, 서의성 p. 550-560
문장의 의미적 유사도와 정보량을 사용한 다중 문서 요약 = Multi-document summarization using semantic similarity and information quantity of sentences 임연수, 권성구, 김봉민, 박성배 p. 561-572
트랜스포머 기반 시계열 데이터 분류작업을 위한 GASF와 CNN을 사용한 CLS 토큰 추가 임베딩 방법 = CLS token additional embedding method using GASF and CNN for transformer based time series data classification tasks 서재진, 이상원, 최원익 p. 573-580
PCC 기반 기상 변수 유사도를 고려한 제로 샷 태양광 발전율 예측 기법 = Zero-shot solar power efficiency prediction method considering PCC-based climate similarity 김동준, 박성우, 문재욱, 황인준 p. 581-587
적대적 예시에 대한 향상된 견고성을 위한 심층신경망 뉴런 가지치기 = Pruning deep neural networks neurons for improved robustness against adversarial examples 임규민, 고기혁, 이수영, 손수엘 p. 588-597
이진 분류기를 이용한 안면 인식 스마트 도어의 생체인증 성능개선 = Biometrics performance improvement of face recognition smart door using binary classifier 김태성, 은창수, 박종원 p. 598-605
문서 기반 대화 시스템의 외부 지식 검색을 위한 다중 작업 학습 기반 재순위화 모델 = Multi-task learning based re-ranker for external knowledge retrieval in document-grounded dialogue systems 이홍희, 고영중 p. 606-613
Mini-batching with similar-length sentences to quickly train NMT models = 번역 모델을 빠르게 학습하도록 유사 길이 문장들로 미니배치 구성 Daniela N. Rim, Richard Kimera, Heeyoul Choi p. 614-620
주기적 링크 품질 측정에 기반을 둔 RPL 네트워크에서의 선호 부모 변경 기법 제안 = A proposal of preferred parent change technique in RPL network based on periodic link quality measurement 신형택, 하유빈, 정상화 p. 621-632

References (26 items) : provided by NAVER Academic Information

Table of references, listing number, reference, and National Assembly Library holding status.
No. Reference NAL holding
1 Vaswani, Ashish, et al., "Attention is all you need," Advances in Neural Information Processing Systems 30, 2017. Not held
2 Yue, Zhihan, et al., "TS2Vec: Towards universal representation of time series," Proceedings of the AAAI Conference on Artificial Intelligence, Vol. 36, No. 8, 2022. Not held
3 Devlin, Jacob, et al., "BERT: Pre-training of deep bidirectional transformers for language understanding," arXiv preprint arXiv:1810.04805, 2018. Not held
4 Franceschi, Jean-Yves, Aymeric Dieuleveut, and Martin Jaggi, "Unsupervised scalable representation learning for multivariate time series," Advances in Neural Information Processing Systems 32, 2019. Not held
5 Berndt, Donald J., and James Clifford, "Using dynamic time warping to find patterns in time series," KDD Workshop, Vol. 10, No. 16, 1994. Not held
6 Baydogan, Mustafa Gokce, George Runger, and Eugene Tuv, "A bag-of-features framework to classify time series," IEEE Transactions on Pattern Analysis and Machine Intelligence 35.11, pp. 2796-2802, 2013. Not held
7 Lines, Jason, and Anthony Bagnall, "Time series classification with ensembles of elastic distance measures," Data Mining and Knowledge Discovery 29, pp. 565-592, 2015. Not held
8 Hills, Jon, et al., "Classification of time series by shapelet transformation," Data Mining and Knowledge Discovery 28, pp. 851-881, 2014. Not held
9 Bagnall, Anthony, et al., "Time-series classification with COTE: the collective of transformation-based ensembles," IEEE Transactions on Knowledge and Data Engineering 27.9, pp. 2522-2535, 2015. Not held
10 Sak, Haşim, Andrew Senior, and Françoise Beaufays, "Long short-term memory based recurrent neural network architectures for large vocabulary speech recognition," arXiv preprint arXiv:1402.1128, 2014. Not held
11 Chung, Junyoung, et al., "Empirical evaluation of gated recurrent neural networks on sequence modeling," arXiv preprint arXiv:1412.3555, 2014. Not held
12 Wang, Zhiguang, Weizhong Yan, and Tim Oates, "Time series classification from scratch with deep neural networks: A strong baseline," 2017 International Joint Conference on Neural Networks (IJCNN), IEEE, 2017. Not held
13 Karim, Fazle, et al., "LSTM fully convolutional networks for time series classification," IEEE Access 6, pp. 1662-1669, 2017. Not held
14 Ismail Fawaz, Hassan, et al., "Deep learning for time series classification: a review," Data Mining and Knowledge Discovery 33.4, pp. 917-963, 2019. Not held
15 Karimi-Bidhendi, Saeed, Faramarz Munshi, and Ashfaq Munshi, "Scalable classification of univariate and multivariate time series," 2018 IEEE International Conference on Big Data (Big Data), IEEE, pp. 1598-1605, 2018. Not held
16 Tonekaboni, S., Eytan, D., and Goldenberg, A., "Unsupervised representation learning for time series with temporal neighborhood coding," arXiv preprint arXiv:2106.00750, 2021. Not held
17 Zerveas, G., Jayaraman, S., Patel, D., Bhamidipaty, A., and Eickhoff, C., "A transformer-based framework for multivariate time series representation learning," Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, pp. 2114-2124, August 2021. Not held
18 Eldele, E., Ragab, M., Chen, Z., Wu, M., Kwoh, C. K., Li, X., and Guan, C., "Time-series representation learning via temporal and contextual contrasting," arXiv preprint arXiv:2106.14112, 2021. Not held
19 Mikolov, T., Chen, K., Corrado, G., and Dean, J., "Efficient estimation of word representations in vector space," arXiv preprint arXiv:1301.3781, 2013. Not held
20 Pennington, J., Socher, R., and Manning, C. D., "GloVe: Global vectors for word representation," Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1532-1543, October 2014. Not held
21 Sarzynska-Wawer, J., Wawer, A., Pawlak, A., Szymanowska, J., Stefaniak, I., Jarkiewicz, M., and Okruszek, L., "Detecting formal thought disorder by deep contextualized word representations," Psychiatry Research 304, 114135, 2021. Not held
22 Bahdanau, D., Cho, K., and Bengio, Y., "Neural machine translation by jointly learning to align and translate," arXiv preprint arXiv:1409.0473, 2014. Not held
23 Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., ... and Stoyanov, V., "RoBERTa: A robustly optimized BERT pretraining approach," arXiv preprint arXiv:1907.11692, 2019. Not held
24 Radford, A., Narasimhan, K., Salimans, T., and Sutskever, I., "Improving language understanding by generative pre-training," 2018. Not held
25 Clark, K., Khandelwal, U., Levy, O., and Manning, C. D., "What does BERT look at? An analysis of BERT's attention," arXiv preprint arXiv:1906.04341, 2019. Not held
26 Dau, H. A., Bagnall, A., Kamgar, K., Yeh, C. C. M., Zhu, Y., Gharghabi, S., ... and Keogh, E., "The UCR time series archive," IEEE/CAA Journal of Automatica Sinica, 6(6), pp. 1293-1305, 2019. Not held