
SELMA: A Speech-Enabled Language Model for Virtual Assistant Interactions

A review of SELMA, a single large language model (LLM) combining audio and text to handle virtual assistant tasks such as voice trigger detection, device-directed speech detection, and automatic speech recognition (ASR).
agi-friend.com | PDF Size: 0.6 MB

1. Introduction & Overview

This article reviews the research paper "SELMA: A Speech-Enabled Language Model for Virtual Assistant Interactions." The work introduces SELMA, a novel multimodal system designed to simplify and improve the processing pipeline of voice-based virtual assistants (VAs). The traditional VA pipeline, as illustrated in Figure 1(a) of the paper, is complex, chaining multiple specialized models for sequential tasks such as Voice Trigger (VT) detection, Device-Directed Speech Detection (DDSD), and Automatic Speech Recognition (ASR). This modular approach often leads to error propagation, latency, and added computational overhead.

SELMA proposes an architectural shift by integrating audio and text inputs into a single, end-to-end Large Language Model (LLM). It is trained to handle the three primary tasks (VT detection, DDSD, and ASR) simultaneously within one unified model. The core novelty lies in the use of parameter-efficient adaptation techniques, specifically Low-Rank Adaptation (LoRA), applied to both the audio encoder and the LLM backbone. This lets SELMA leverage the contextual understanding of LLMs while adapting to multimodal inputs with a small number of trainable parameters.

Key Insight

SELMA replaces a fragmented, multi-model pipeline with a single unified LLM, achieving superior performance and architectural simplicity across the primary virtual assistant tasks.

2. Methods & Architecture

The SELMA architecture is built on a pre-trained LLM backbone. The system ingests both audio waveform features (processed by an audio encoder) and text tokens. The key to its efficiency and effectiveness is the careful fusion of these modalities and the training strategy.

2.1 Model Architecture

The model receives a combined sequence of audio features (from the encoder) and text tokens. A single transformer-based LLM processes this unified sequence. Task-specific output heads are attached to the LLM's final hidden states to produce VT, DDSD, and ASR predictions simultaneously. This contrasts sharply with the traditional pipeline shown in Figure 1(a), where separate models operate in sequence.
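As a rough illustration of this design, the sketch below concatenates audio features and text embeddings into one sequence, runs it through a shared backbone, and reads three task heads off the hidden states. This is a toy NumPy stand-in, not the paper's implementation; all names, dimensions, and the identity-map "backbone" are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16  # shared hidden size (illustrative)

def backbone(seq):
    """Stand-in for the shared transformer LLM: here just an identity
    map applied per position, purely for shape bookkeeping."""
    return seq @ np.eye(D)

def forward(audio_feats, text_embs, heads):
    # One unified sequence: audio features followed by text embeddings
    seq = np.concatenate([audio_feats, text_embs], axis=0)
    h = backbone(seq)
    # VT/DDSD heads read a pooled utterance summary; ASR is per position
    pooled = h.mean(axis=0)
    return {
        "vt": pooled @ heads["vt"],      # scalar trigger score
        "ddsd": pooled @ heads["ddsd"],  # scalar directedness score
        "asr": h @ heads["asr"],         # per-position vocabulary logits
    }

heads = {"vt": rng.normal(size=(D,)),
         "ddsd": rng.normal(size=(D,)),
         "asr": rng.normal(size=(D, 32))}  # 32 = toy vocab size

out = forward(rng.normal(size=(10, D)), rng.normal(size=(4, D)), heads)
print(out["asr"].shape)  # (14, 32): one logit row per input position
```

The point of the sketch is the single forward pass: one sequence in, three task outputs out, with no intermediate hand-off between separate models.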

2.2 Low-Rank Adaptation (LoRA)

To adapt the large LLM and the audio encoder efficiently, SELMA uses LoRA. Instead of updating all weights, LoRA injects trainable rank-decomposition matrices into the transformer layers. For a weight matrix $W \in \mathbb{R}^{d \times k}$, the update is represented as $W' = W + BA$, where $B \in \mathbb{R}^{d \times r}$, $A \in \mathbb{R}^{r \times k}$, and the rank $r \ll \min(d, k)$. This drastically reduces the number of trainable parameters, making it feasible to adapt large models to new multimodal tasks with limited data.
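A minimal sketch of the update above, with illustrative dimensions (not taken from the paper). Note the standard LoRA initialization with $B = 0$, so the adapted layer starts out identical to the frozen one:

```python
import numpy as np

rng = np.random.default_rng(1)
d, k, r = 64, 64, 4                  # rank r << min(d, k)

W = rng.normal(size=(d, k))          # frozen pre-trained weight
B = np.zeros((d, r))                 # LoRA init: B = 0, so W' = W at start
A = rng.normal(size=(r, k)) * 0.01   # small random init for A

def adapted_forward(x):
    """Apply the LoRA-adapted weight W' = W + B @ A to input x."""
    return x @ (W + B @ A).T

x = rng.normal(size=(1, k))
# With B = 0 the adapted layer matches the frozen layer exactly
assert np.allclose(adapted_forward(x), x @ W.T)

# Only A and B are trainable: d*r + r*k parameters vs d*k frozen
print(B.size + A.size, "trainable vs", W.size, "frozen")  # 512 vs 4096
```

Here the adapter adds 512 trainable parameters against 4096 frozen ones, an 8x reduction; at realistic LLM dimensions the ratio is far more dramatic.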

2.3 Feature Pooling Strategy

For tasks like VT and DDSD, which require utterance-level understanding rather than per-token detail, SELMA applies a pooling scheme (e.g., mean pooling) over the sequence of audio embeddings before feeding them into the LLM. This helps the model capture the high-level acoustic patterns relevant to the detection tasks.
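A minimal mean-pooling sketch, assuming a fixed window over consecutive audio frames; the paper's exact pooling configuration may differ, and the window size here is arbitrary:

```python
import numpy as np

def mean_pool(audio_embs, window):
    """Average consecutive groups of `window` audio embeddings,
    shortening the sequence before it enters the LLM."""
    T, D = audio_embs.shape
    T_out = T // window  # trailing frames that don't fill a window are dropped
    return audio_embs[: T_out * window].reshape(T_out, window, D).mean(axis=1)

embs = np.arange(12, dtype=float).reshape(6, 2)  # 6 frames, dim 2
pooled = mean_pool(embs, window=3)
print(pooled.shape)  # (2, 2): 6 frames reduced to 2 pooled vectors
print(pooled[0])     # [2. 3.]: per-dimension mean of frames 0-2
```

Besides emphasizing utterance-level structure, pooling shortens the audio portion of the sequence, which also reduces the LLM's compute per utterance.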

3. Experimental Results

The paper presents strong empirical evidence of SELMA's advantage over traditional, task-specific models.

3.1 Performance Metrics

Key results are summarized below:

  • Voice Trigger Detection (VT): 64% EER improvement, a substantial reduction in equal error rate (EER) compared to dedicated VT models.
  • Device-Directed Speech Detection (DDSD): 22% EER improvement, a significant gain in correctly identifying user intent without a trigger phrase.
  • Automatic Speech Recognition (ASR): WER on par with baselines, maintaining a competitive word error rate (WER) while performing the other tasks.
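For readers less familiar with the detection metric: EER is the operating point where the false-accept rate equals the false-reject rate. The sketch below estimates it with a simple threshold sweep over toy scores; this is illustrative, not the paper's evaluation code.

```python
import numpy as np

def equal_error_rate(scores, labels):
    """Estimate EER by sweeping thresholds and taking the point where
    the false-accept and false-reject rates are closest."""
    eer, gap = 1.0, np.inf
    for t in np.unique(scores):
        preds = scores >= t
        fa = np.mean(preds[labels == 0])   # negatives wrongly accepted
        fr = np.mean(~preds[labels == 1])  # positives wrongly rejected
        if abs(fa - fr) < gap:
            gap, eer = abs(fa - fr), (fa + fr) / 2
    return eer

# Toy detector scores: higher means "triggered"; labels 1 = true trigger
scores = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.1])
labels = np.array([1,   1,   1,   0,   1,   0  ])
print(equal_error_rate(scores, labels))  # 0.375
```

A relative EER reduction of 64% means this crossover error rate drops to roughly a third of the specialized baseline's value at its own operating point.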

3.2 Comparison with Baselines

SELMA was benchmarked against strong models dedicated to each individual task. The results show that the unified model not only matched but often exceeded the performance of these specialized pipelines. This challenges the long-standing assumption that task-specific models are inherently superior. Simplifying from the pipeline in Figure 1(a) to SELMA's unified approach in Figure 1(b) comes with a clear performance benefit, not a trade-off.

4. Technical Analysis & Key Insights

Key Insight: The SELMA paper is a forceful push against pipeline sprawl in edge AI. It demonstrates that a single, properly adapted LLM can outperform a Rube Goldberg machine of specialized models on interrelated tasks like VT, DDSD, and ASR. The industry has long clung to modular orthodoxy, and SELMA points toward unification.

Logical Structure: The argument is sound: 1) Traditional pipelines are complex and prone to error propagation. 2) LLMs are powerful sequence models that can, in principle, process multimodal sequences. 3) The obstacle is efficient adaptation. 4) The solution: LoRA for parameter-efficient adaptation and deliberate feature pooling to guide the model's attention. 5) The result: a simpler system with better performance. The path from problem to solution is coherent and well supported by the data.

Strengths & Weaknesses: The main strength is the large performance gain on the detection tasks (64% and 22% EER improvements are no small feat). The use of LoRA is a pragmatic, deployment-minded choice for on-device use, in line with trends seen in other efficiency-focused AI research from institutions such as Stanford's CRFM. The main weakness, which the authors acknowledge, is the black-box nature of LLM decisions for safety-critical tasks like VT. If the model fails, diagnosing *why* is harder than in a rule-based or simpler model pipeline. In addition, the training and data requirements for such a unified model are substantial, potentially raising the barrier to entry.

Actionable Insight: For product teams, the message is clear: start prototyping unified, LLM-based backbones for multimodal interaction tasks. The era of chaining five separate models for a single user utterance is ending. Research priorities should shift from building better isolated components to designing better training recipes and evaluation benchmarks for these unified models, ensuring they are robust, interpretable, and fair. As seen in the evolution of models like GPT and BERT, the trend points toward generalists, not specialists, for natural language (and now audio) understanding.

Research Framework Example: Evaluating Unified vs. Pipelined Architectures

Scenario: A team is deciding between a SELMA-like unified model and a traditional pipelined architecture for a new smart speaker.

Applying the framework:

  1. Performance: Compare EER for VT/DDSD and WER for ASR on in-domain data and noisy out-of-domain data. SELMA is likely to win on the combined tasks.
  2. Latency & Compute: Profile end-to-end latency and memory footprint. The unified model may have lower latency thanks to fewer sequential stages but can require more memory for the LLM.
  3. Development & Maintenance: Estimate the cost of training and maintaining one complex model versus 3-5 simpler models. Unified models simplify the codebase but demand LLM expertise.
  4. Safety & Debugging: Assess how easily guardrails can be added and failures diagnosed. A pipelined architecture offers more points of control.
The framework yields a trade-off: choose SELMA for maximum accuracy and simplicity under controlled conditions; consider the pipelined approach if interpretability and incremental updates matter more.
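The four criteria above can be sketched as a simple weighted decision matrix. All scores and weights below are illustrative placeholders, not numbers from the paper; a real evaluation would measure them on the team's own hardware and data.

```python
# Weighted scoring over the framework's four criteria (illustrative).
criteria = ["performance", "latency_compute", "maintenance", "safety_debug"]
weights  = {"performance": 0.4, "latency_compute": 0.2,
            "maintenance": 0.2, "safety_debug": 0.2}

# Hypothetical 1-5 scores per architecture (higher = better)
unified   = {"performance": 5, "latency_compute": 4,
             "maintenance": 4, "safety_debug": 2}
pipelined = {"performance": 3, "latency_compute": 3,
             "maintenance": 3, "safety_debug": 5}

def total(scores):
    """Weighted sum across the four criteria."""
    return sum(weights[c] * scores[c] for c in criteria)

print(total(unified), total(pipelined))  # with these toy scores: 4.0 3.4
```

Changing the weights (e.g., raising safety_debug for a regulated product) can flip the outcome, which is precisely the trade-off the framework is meant to surface.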

5. Future Applications & Directions

SELMA's approach has implications beyond virtual assistants. The core idea of a multimodal LLM as a single interface for sequential perception tasks is likely to generalize.

  • Expanding Modalities: Future iterations could incorporate visual input (e.g., from AR glasses) for context-aware interaction, determining whether the user is looking at the device while speaking.
  • Proactive Assistance: By continuously processing ambient audio and text (with appropriate privacy safeguards), such models could move from reactive command execution to proactive suggestions, akin to Google's Ambient Computing vision.
  • Domain Generalization: The architecture could be adapted to other domains requiring sequential multimodal understanding, such as video content moderation (audio + vision + text) or automotive voice interfaces integrated with driver-monitoring systems.
  • On-Device Learning: Future work must address personalization and continual learning on device, using techniques such as replay buffers or federated learning, adapting the unified model to an individual user's speech patterns and vocabulary without compromising privacy.
  • Efficiency Frontier: Research will push toward more efficient backbones (e.g., based on Mixture-of-Experts architectures) and adaptation techniques beyond LoRA to make these powerful unified models run on resource-constrained devices.

6. References

  1. Hu, E. J., et al. "LoRA: Low-Rank Adaptation of Large Language Models." arXiv preprint arXiv:2106.09685 (2021).
  2. Radford, A., et al. "Robust Speech Recognition via Large-Scale Weak Supervision." Proceedings of ICML (2023).
  3. Bommasani, R., et al. "On the Opportunities and Risks of Foundation Models." Stanford University Center for Research on Foundation Models (CRFM) (2021).
  4. Brown, T., et al. "Language Models are Few-Shot Learners." Advances in Neural Information Processing Systems 33 (2020).
  5. Vaswani, A., et al. "Attention is All You Need." Advances in Neural Information Processing Systems 30 (2017).
  6. Google AI Blog. "The Path to Ambient Computing." (2020). [Online]. Available: https://blog.google/products/assistant/path-ambient-computing/