
Simple question about LLaMA-Adapter v1 transformer forward function #145

@yestaehyung

Hello, first of all, thank you for the excellent work.

From my understanding of the paper, in LLaMA-Adapter v1 the adaption prompt is inserted into the topmost L layers of the transformer.
However, in the code below, if self.adapter_layer is 30, doesn't that insert the adapter into the 3rd through 32nd layers of the transformer?

Could you please explain why -1 * self.adapter_layer was used here?

https://github.com/OpenGVLab/LLaMA-Adapter/blob/8c50ee5d5d393c9bee5fcfda6aaea31d3ca3c40c/alpaca_finetuning_v1/llama/model.py

for layer in self.layers[: -1 * self.adapter_layer]:
    h = layer(h, start_pos, freqs_cis, mask)
for layer in self.layers[-1 * self.adapter_layer :]:
    h = layer(h, start_pos, freqs_cis, mask, adapter[adapter_index].half())
    adapter_index = adapter_index + 1
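
As a sanity check on my reading, here is a minimal standalone sketch of what I believe the slicing does, assuming a 32-layer model and self.adapter_layer = 30 (the list of integers below is just a stand-in for the actual layer modules, not code from the repo):

# Hypothetical stand-in for self.layers: 32 transformer blocks, 0-indexed.
layers = list(range(32))
adapter_layer = 30

plain = layers[: -1 * adapter_layer]    # first 2 layers, run without the adapter
adapted = layers[-1 * adapter_layer :]  # last 30 layers, run with the adapter

print(plain)         # [0, 1]
print(len(adapted))  # 30
print(adapted[0])    # 2 -> the 3rd layer when counting from 1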

I really appreciate any help you can provide.
