
Add Mistral support to Bedrock #703

Open · wants to merge 1 commit into dev
Conversation

collindutter
Member

Closes #664

@collindutter collindutter force-pushed the feature/bedrock-mistral branch 4 times, most recently from 51cf7d7 to 36f066f Compare March 22, 2024 19:56
@collindutter collindutter requested a review from a team March 25, 2024 17:34
See this thread for more information: https://github.com/griptape-ai/griptape/issues/244

Returns:
BedrockLlamaTokenizer: The tokenizer for this driver.
Contributor

BedrockLlamaTokenizer -> looks like a copy/paste error

Comment on lines +40 to +44
if self._tokenizer:
    return self._tokenizer
else:
    self._tokenizer = BedrockMistralTokenizer(model=self.prompt_driver.model)
    return self._tokenizer
Contributor

Super nit:

if self._tokenizer is None:
    self._tokenizer = BedrockMistralTokenizer(model=self.prompt_driver.model)
return self._tokenizer

Args:
    prompt_stack: The `PromptStack` to convert.
"""
system_input = next((i for i in prompt_stack.inputs if i.is_system()), None)
Contributor

Why does this assume a maximum of one system input?

Should you raise an exception if this assumption is violated, rather than silently ignoring the extras?
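One way to make the assumption explicit is to collect all system inputs and fail loudly when there is more than one. A minimal sketch, assuming a hypothetical `extract_system_input` helper and a stub `Input` class standing in for the `PromptStack` input type:

```python
class Input:
    """Minimal stand-in for a PromptStack input (hypothetical for this sketch)."""

    def __init__(self, role, content):
        self.role, self.content = role, content

    def is_system(self):
        return self.role == "system"


def extract_system_input(inputs):
    # Raise instead of silently dropping extra system inputs.
    system_inputs = [i for i in inputs if i.is_system()]
    if len(system_inputs) > 1:
        raise ValueError(
            f"Expected at most one system input, got {len(system_inputs)}"
        )
    return system_inputs[0] if system_inputs else None
```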

Comment on lines +60 to +65
prompt_lines = [self.BOS_TOKEN]
for prompt_input in non_system_inputs:
    if prompt_input.is_assistant():
        prompt_lines.append(f"{prompt_input.content}{self.EOS_TOKEN} ")
    else:
        prompt_lines.append(f"[INST] {prompt_input.content} [/INST]")
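Concretely, the loop above produces prompts shaped like the output below. A standalone sketch with literal token strings and a hypothetical `to_mistral_prompt` helper taking `(role, content)` tuples:

```python
BOS_TOKEN, EOS_TOKEN = "<s>", "</s>"


def to_mistral_prompt(turns):
    """Wrap user turns in [INST]...[/INST] and close assistant turns with EOS,
    mirroring the loop in the diff (hypothetical helper for illustration)."""
    parts = [BOS_TOKEN]
    for role, content in turns:
        if role == "assistant":
            parts.append(f"{content}{EOS_TOKEN} ")
        else:
            parts.append(f"[INST] {content} [/INST]")
    return "".join(parts)


print(to_mistral_prompt([("user", "Hi"), ("assistant", "Hello"), ("user", "Bye")]))
# <s>[INST] Hi [/INST]Hello</s> [INST] Bye [/INST]
```

Note there is a single BOS at the start but one EOS per assistant turn, which is what the comment below is asking about.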
Contributor

Won't this end up with unbalanced BOS/EOS tokens? I didn't quickly get a great understanding of the Mistral docs on Hugging Face. Are unbalanced BOS/EOS tokens OK?


Successfully merging this pull request may close these issues.

Add Support For Mistral on AWS Bedrock