ChatModel(model_id: str, endpoint_name: typing.Optional[str] = None)

ChatModel represents a language model that is capable of chat.
Examples::
chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat(
    context="My name is Ned. You are my personal assistant. My favorite movies are Lord of the Rings and Hobbit.",
    examples=[
        InputOutputTextPair(
            input_text="Who do you work for?",
            output_text="I work for Ned.",
        ),
        InputOutputTextPair(
            input_text="What do I like?",
            output_text="Ned likes watching movies.",
        ),
    ],
    temperature=0.3,
)
chat.send_message("Do you know any cool events this weekend?")
Methods
ChatModel
ChatModel(model_id: str, endpoint_name: typing.Optional[str] = None)

Creates a LanguageModel.
This constructor should not be called directly.
Use LanguageModel.from_pretrained(model_name=...) instead.
from_pretrained
from_pretrained(model_name: str) -> vertexai._model_garden._model_garden_models.T

Loads a _ModelGardenModel.
Exceptions

| Type | Description |
|---|---|
| ValueError | If model_name is unknown. |
| ValueError | If the model does not support this class. |
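For example, a minimal sketch of loading the model after initializing the SDK (the project and location values are placeholders, not requirements):

import vertexai
from vertexai.language_models import ChatModel

vertexai.init(project="my-project", location="us-central1")  # placeholder project/region
chat_model = ChatModel.from_pretrained("chat-bison@001")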
get_tuned_model
get_tuned_model(
    tuned_model_name: str,
) -> vertexai.language_models._language_models._LanguageModel

Loads the specified tuned language model.
list_tuned_model_names
list_tuned_model_names() -> typing.Sequence[str]

Lists the names of tuned models.
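A sketch of discovering and loading a tuned model; it assumes at least one tuned model already exists for this base model in your project:

chat_model = ChatModel.from_pretrained("chat-bison@001")
tuned_names = chat_model.list_tuned_model_names()  # names of previously tuned models
if tuned_names:
    tuned_model = chat_model.get_tuned_model(tuned_names[0])  # load the first tuned model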
start_chat
start_chat(
    *,
    context: typing.Optional[str] = None,
    examples: typing.Optional[
        typing.List[vertexai.language_models.InputOutputTextPair]
    ] = None,
    max_output_tokens: typing.Optional[int] = None,
    temperature: typing.Optional[float] = None,
    top_k: typing.Optional[int] = None,
    top_p: typing.Optional[float] = None,
    message_history: typing.Optional[
        typing.List[vertexai.language_models.ChatMessage]
    ] = None,
    stop_sequences: typing.Optional[typing.List[str]] = None
) -> vertexai.language_models.ChatSession

Starts a chat session with the model.
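As a sketch (the parameter values below are illustrative, not recommendations), a session that also caps output length and sets stop sequences:

chat = chat_model.start_chat(
    context="You are a concise assistant.",
    max_output_tokens=256,   # illustrative value
    temperature=0.2,         # illustrative value
    stop_sequences=["\n\n"],
)
response = chat.send_message("Suggest a weekend activity.")
print(response.text)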
tune_model
tune_model(
    training_data: typing.Union[str, pandas.core.frame.DataFrame],
    *,
    train_steps: typing.Optional[int] = None,
    learning_rate_multiplier: typing.Optional[float] = None,
    tuning_job_location: typing.Optional[str] = None,
    tuned_model_location: typing.Optional[str] = None,
    model_display_name: typing.Optional[str] = None,
    default_context: typing.Optional[str] = None
) -> _LanguageModelTuningJob

Tunes a model based on training data.
This method launches and returns an asynchronous model tuning job. Usage:
tuning_job = model.tune_model(...)
... do some other work
tuned_model = tuning_job.get_tuned_model()  # Blocks until tuning is complete
Exceptions

| Type | Description |
|---|---|
| ValueError | If the "tuning_job_location" value is not supported. |
| ValueError | If the "tuned_model_location" value is not supported. |
| RuntimeError | If the model does not support tuning. |
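For illustration, a hedged sketch of a tuning call; the Cloud Storage URI is a placeholder, and the region values are only examples that may or may not be supported for your model:

tuning_job = chat_model.tune_model(
    training_data="gs://my-bucket/chat-tuning-data.jsonl",  # placeholder URI to a JSONL dataset
    train_steps=100,
    learning_rate_multiplier=1.0,
    tuning_job_location="europe-west4",   # example region; check the supported values
    tuned_model_location="us-central1",   # example region; check the supported values
    model_display_name="my-tuned-chat-model",
)
tuned_model = tuning_job.get_tuned_model()  # blocks until the tuning job finishes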