How to extend llama2's context to 8k: RoPE, exllama

```python
import os
from typing import Optional, Union

from transformers import GenerationConfig, PretrainedConfig, PreTrainedModel
# ExLlama, ExLlamaCache, ExLlamaConfig come from the exllama package;
# the exact import path depends on how exllama is installed.

class ExllamaHF(PreTrainedModel):
    def __init__(self, config: ExLlamaConfig):
        super().__init__(PretrainedConfig())
        self.ex_config = config
        self.ex_model = ExLlama(self.ex_config)
        self.ex_cache = ExLlamaCache(self.ex_model)
        self.generation_config = GenerationConfig()
        self.lora = None

    # ...omitted...

    @classmethod
    def from_pretrained(cls, pretrained_model_name_or_path: Optional[Union[str, os.PathLike]], *m..
```

2023. 8. 24.
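The technique named in the title is RoPE position scaling: a llama2 model trained with a 4k context can be run at 8k by compressing the position indices fed into the rotary embedding (linear position interpolation), so the model never sees rotation angles outside its training range. The sketch below is a minimal, self-contained illustration of that idea; the function names `build_rope_cache` and `apply_rope`, and the scale factor of 2, are my own illustrative assumptions, not code from the post.

```python
import torch

def build_rope_cache(seq_len: int, head_dim: int, base: float = 10000.0, scale: float = 1.0):
    """Precompute RoPE cos/sin tables.

    scale > 1.0 applies linear position interpolation: positions are divided
    by `scale`, so a model trained on 4k positions can cover seq_len = 8k
    with scale = 2 while staying inside its trained angle range.
    """
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    positions = torch.arange(seq_len).float() / scale      # compressed positions
    angles = torch.outer(positions, inv_freq)               # (seq_len, head_dim // 2)
    return torch.cos(angles), torch.sin(angles)

def apply_rope(x: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor) -> torch.Tensor:
    """Rotate query/key tensor x of shape (..., seq_len, head_dim) pairwise."""
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = torch.stack((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)
    return out.flatten(-2)

# Usage: run a llama2-style head (head_dim=128, trained at 4k) over 8k positions.
cos, sin = build_rope_cache(seq_len=8192, head_dim=128, scale=2.0)
q = torch.randn(1, 8192, 128)
q_rot = apply_rope(q, cos, sin)
```

In exllama-based setups such as the `ExllamaHF` wrapper above, this scaling is usually set through configuration rather than hand-rolled code: `ExLlamaConfig` exposes fields along the lines of `max_seq_len` and `compress_pos_emb` (e.g. `compress_pos_emb = 2` with `max_seq_len = 8192` for a 4k-trained model), though the exact field names should be verified against the exllama version in use.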