Word Embedding for Semantically Relative Words: an Experimental Study
https://doi.org/10.18255/1818-1015-2018-6-726-733
About the Authors
Maria Sergeevna Karyaeva
Russia
PhD student
14 Sovetskaya St., Yaroslavl 150003
Pavel Isaakovich Braslavski
Russia
Cand. Sci. (Engineering), Associate Professor
19 Mira St., Yekaterinburg 620002
Valery Anatolyevich Sokolov
Russia
Dr. Sci. (Physics and Mathematics), Professor
14 Sovetskaya St., Yaroslavl 150003
For citation:
Karyaeva M.S., Braslavski P.I., Sokolov V.A. Word Embedding for Semantically Relative Words: an Experimental Study. Modeling and Analysis of Information Systems. 2018;25(6):726-733. (In Russ.) https://doi.org/10.18255/1818-1015-2018-6-726-733