We introduce a high-performance named entity recognition (NER) model for written and spoken language. To overcome challenges related to labeled-data scarcity and domain shifts, we use transfer learning, leveraging our previously developed KorBERT as the base model. We also adopt a meta-pseudo-label method that uses a teacher/student framework with labeled and unlabeled data. Our model introduces two modifications. First, the student model is updated with the average loss over both human- and pseudo-labeled data. Second, the influence of noisy pseudo-labeled data is mitigated by considering feedback scores and updating the teacher model only when the score is below a threshold (0.0005). We achieve the target NER performance in the spoken language domain and improve performance in the written language domain by proposing a straightforward rollback method that reverts to the best model based on the scarce human-labeled data. Further improvement is achieved by adjusting the label-vector weights in the named entity dictionary.
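To make the two modifications and the rollback step concrete, the sketch below outlines one possible training loop. It is a minimal illustration under stated assumptions, not the authors' implementation: the toy linear models, random batches, optimizer settings, and the exact quantity compared against the threshold are all assumptions introduced here; only the 0.0005 value, the averaged student loss, and the overall teacher/student structure come from the abstract.

```python
# Minimal, runnable sketch of the modified meta-pseudo-label loop described above.
# Assumptions (not from the source): toy linear token classifiers stand in for the
# KorBERT-based teacher/student, random tensors stand in for NER batches, and the
# feedback score is taken as the change in student loss on human-labeled data
# (the usual Meta Pseudo Labels approximation).
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_LABELS, FEAT_DIM, THRESHOLD = 5, 16, 0.0005   # 0.0005 is the threshold from the abstract

teacher = nn.Linear(FEAT_DIM, NUM_LABELS)   # hypothetical stand-in for the teacher encoder
student = nn.Linear(FEAT_DIM, NUM_LABELS)   # hypothetical stand-in for the student encoder
opt_t = torch.optim.SGD(teacher.parameters(), lr=0.1)
opt_s = torch.optim.SGD(student.parameters(), lr=0.1)

def make_batch(n):
    """Stand-in for a batch of token features and gold NER labels."""
    return torch.randn(n, FEAT_DIM), torch.randint(0, NUM_LABELS, (n,))

best_dev_loss, best_state = float("inf"), None

for step in range(100):
    x_lab, y_lab = make_batch(8)      # scarce human-labeled data
    x_unlab, _ = make_batch(8)        # unlabeled data (labels unused)

    # Teacher produces pseudo labels for the unlabeled batch.
    with torch.no_grad():
        pseudo = teacher(x_unlab).argmax(dim=-1)
        loss_before = F.cross_entropy(student(x_lab), y_lab).item()

    # Modification 1: the student update uses the *average* of the human-labeled
    # loss and the pseudo-labeled loss.
    loss_s = 0.5 * (F.cross_entropy(student(x_lab), y_lab)
                    + F.cross_entropy(student(x_unlab), pseudo))
    opt_s.zero_grad(); loss_s.backward(); opt_s.step()

    # Feedback score: improvement of the student on human-labeled data.
    with torch.no_grad():
        loss_after = F.cross_entropy(student(x_lab), y_lab).item()
    feedback = loss_before - loss_after

    # Modification 2: the teacher is updated only when the (assumed) feedback
    # score is below the threshold, damping noisy pseudo-labeled batches.
    if abs(feedback) < THRESHOLD:
        loss_t = feedback * F.cross_entropy(teacher(x_unlab), pseudo)
        opt_t.zero_grad(); loss_t.backward(); opt_t.step()

    # Rollback bookkeeping: remember the best student on human-labeled data.
    if loss_after < best_dev_loss:
        best_dev_loss = loss_after
        best_state = {k: v.clone() for k, v in student.state_dict().items()}

# Rollback: revert to the best checkpoint selected with the scarce human-labeled data.
student.load_state_dict(best_state)
```

The rollback at the end is what lets noisy pseudo-label steps be undone cheaply: only checkpoints that improve the human-labeled loss are ever kept.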