

        SciNCL

        SciNCL is a pre-trained BERT language model for generating document-level embeddings of research papers.
        It uses the citation graph neighborhood to generate samples for contrastive learning.
        Prior to the contrastive training, the model is initialized with weights from scibert-scivocab-uncased.
        The underlying citation embeddings are trained on the S2ORC citation graph.
        Paper: Neighborhood Contrastive Learning for Scientific Document Representations with Citation Embeddings (EMNLP 2022).
        Code: https://github.com/malteos/scincl
        PubMedNCL: Working with biomedical papers? Try PubMedNCL.
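
        Concretely, training pulls a paper's embedding toward papers sampled from its citation neighborhood and pushes it away from sampled negatives. Below is a minimal sketch of such a contrastive objective, assuming a triplet margin loss over (query, positive, negative) embeddings; the exact loss and sampling logic live in the linked repository.

        import torch
        import torch.nn.functional as F

        def triplet_margin_loss(query, positive, negative, margin=1.0):
            # Pull the query toward a citation-neighborhood positive and push it
            # away from a negative by at least `margin` in L2 distance.
            d_pos = F.pairwise_distance(query, positive)
            d_neg = F.pairwise_distance(query, negative)
            return torch.clamp(d_pos - d_neg + margin, min=0).mean()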


        How to use the pretrained model

        from transformers import AutoTokenizer, AutoModel
        # load model and tokenizer
        tokenizer = AutoTokenizer.from_pretrained('malteos/scincl')
        model = AutoModel.from_pretrained('malteos/scincl')
        papers = [
            {'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
            {'title': 'Attention is all you need', 'abstract': 'The dominant sequence transduction models are based on complex recurrent or convolutional neural networks'},
        ]
        # concatenate title and abstract with [SEP] token
        title_abs = [d['title'] + tokenizer.sep_token + (d.get('abstract') or '') for d in papers]
        # preprocess the input
        inputs = tokenizer(title_abs, padding=True, truncation=True, return_tensors="pt", max_length=512)
        # inference
        result = model(**inputs)
        # take the first token ([CLS] token) in the batch as the embedding
        embeddings = result.last_hidden_state[:, 0, :]
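
        The resulting embeddings can be compared directly; for example, cosine similarity between the two [CLS] vectors gives a relatedness score. A usage sketch continuing from the snippet above:

        import torch.nn.functional as F

        # similarity between the BERT paper and the Transformer paper embeddings
        similarity = F.cosine_similarity(embeddings[0], embeddings[1], dim=0)
        print(similarity.item())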


        Triplet Mining Parameters

        Setting                  Value
        seed                     4
        triples_per_query        5
        easy_positives_count     5
        easy_positives_strategy  5
        easy_positives_k         20-25
        easy_negatives_count     3
        easy_negatives_strategy  random_without_knn
        hard_negatives_count     2
        hard_negatives_strategy  knn
        hard_negatives_k         3998-4000
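
        Read together, these settings say: for each query paper, sample 5 easy positives from kNN ranks 20-25 of the citation-embedding space, 2 hard negatives from ranks 3998-4000, and 3 easy negatives at random from outside the kNN list, yielding 5 triples per query. A hypothetical sketch of that sampling (function and variable names are illustrative; the actual pairing logic is in the linked repository):

        import random

        def mine_triplets(query_id, knn_ids, corpus_ids):
            # knn_ids: corpus papers ranked by distance to the query in the
            # trained citation-embedding space (nearest first).
            positives = knn_ids[20:25]            # easy_positives_k = 20-25
            hard_negatives = knn_ids[3998:4000]   # hard_negatives_k = 3998-4000
            outside_knn = [i for i in corpus_ids if i not in set(knn_ids) and i != query_id]
            easy_negatives = random.sample(outside_knn, 3)  # random_without_knn
            negatives = easy_negatives + hard_negatives     # 3 easy + 2 hard
            # triples_per_query = 5: pair each positive with one negative
            return [(query_id, p, n) for p, n in zip(positives, negatives)]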
