Settings for experiments 0

Started: Fri Dec 5 19:44:36 2025 -- up 53 hr 32 min 35 sec
Built on Dec 5 2025 16:11:08 (1764979868)
Built at rbex-enqueue-targets@oqs20.prod.google.com:/google/src/cloud/buildrabbit-username/buildrabbit-client/google3
Built as //cloud/ai/platform/dataplane/cardolan:vertex-genai-dataplane
Build label: cloud-ml.vertex-genai-dataplane_20251205.04_p0
Built target blaze-out/k8-opt/bin/cloud/ai/platform/dataplane/cardolan/vertex-genai-dataplane
Build options: fdo=XFDO
Built for gcc-4.X.Y-crosstool-v18-llvm-grtev4-k8.k8
Built from changelist 840844128 with baseline 840844128 in a mint client based on //depot/google3
Task BNS: /bns/yudfwra/borg/yudfwra/bns/cloud-ml-vertex-genai-dataplane-staging-jobs/staging-qual-us.vertex-genai-dataplane/1
Ports:
esf.1.staging-qual-us.vertex-genai-dataplane.cloud-ml-vertex-genai-dataplane-staging-jobs.yudfwra.borg.google.com:8111
1.staging-qual-us.vertex-genai-dataplane.cloud-ml-vertex-genai-dataplane-staging-jobs.yudfwra.borg.google.com:25948
borgenvelope.1.staging-qual-us.vertex-genai-dataplane.cloud-ml-vertex-genai-dataplane-staging-jobs.yudfwra.borg.google.com:25949

Profiling Links:
 censusprofilez?seconds=30 (CPU profile with go/census tags, 30 seconds): esf:8111 25948 borgenvelope:25949
 censusheapz (heap usage with go/census tags): esf:8111 25948 borgenvelope:25949
 peakheapz (peak heap usage): esf:8111 25948 borgenvelope:25949
 deltacontentionz?seconds=10 (contention, 10 seconds): esf:8111 25948 borgenvelope:25949
 threadz (thread stacks): esf:8111 25948 borgenvelope:25949
 mmapz (mmap() usage): esf:8111 25948 borgenvelope:25949
 contentionz (legacy contention): esf:8111 25948 borgenvelope:25949
Active RPC Experiments:
Stubby: stubby_default_subsetting, go/inline-psp, 1rpc,
gRPC: channelz_use_v2_for_v1_api, channelz_use_v2_for_v1_service, channelz_zviz, chttp2_bound_write_size, deprecate_keep_grpc_initialized, error_flatten, event_engine_channelz_socket_info, event_engine_client, event_engine_dns, event_engine_dns_non_client_channel, event_engine_listener, event_engine_callback_cq, event_engine_secure_endpoint, google_no_envelope_resolver, graceful_external_connection_failure, lbns_support_in_address_resolver, loas2_protect_memory_optimization, max_inflight_pings_strict_limit, monitoring_experiment, namecheck_core_lib, privacy_context_single_encoding, prod2cloud_w3c_trace, rr_wrr_connect_from_random_index, tsi_frame_protector_without_locks,
Running on yudfwra-ca2.prod.google.com
Platform: arcadia milan
Process size: 14331MiB Memory usage: 1446MiB Load avg (1m): 53.32
View process information, endpoints
View variables, flags, streamz, request logs
Links: code, g3doc, continuous pprof, automon
Distributed traces: view, change parameters
Remote Logs: INFO WARNING ERROR STDOUT STDERR 
See all variables

Flag values for default experiment (0)

	EnableAsyncGenerateContent__enabled: 'name: "EnableAsyncGenerateContent__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 667
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1109
}
id: 0
'
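Each flag entry below shares one shape: a typed `base_value` plus zero or more ordered `OVERRIDE` modifiers gated by a `condition_index` into an external condition table. The actual evaluation semantics are internal; as a rough, assumed illustration (last applicable modifier wins), resolution might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Modifier:
    value: str            # the modifier's base_value, e.g. "TRUE"
    condition_index: int  # index into an external condition table (not shown here)

@dataclass
class Param:
    name: str
    base_value: str
    modifiers: list = field(default_factory=list)

def resolve(param: Param, active_conditions: set) -> str:
    """Assumed semantics: start from base_value, then apply each OVERRIDE
    modifier in declaration order whose condition is active; the last
    applicable modifier wins."""
    value = param.base_value
    for m in param.modifiers:
        if m.condition_index in active_conditions:
            value = m.value
    return value

# Mirrors the EnableAsyncGenerateContent__enabled entry above.
flag = Param(
    name="EnableAsyncGenerateContent__enabled",
    base_value="FALSE",
    modifiers=[Modifier("TRUE", 667), Modifier("TRUE", 1109)],
)
print(resolve(flag, active_conditions={667}))  # TRUE
print(resolve(flag, active_conditions=set()))  # FALSE
```

Under this assumption, the flag reads `TRUE` whenever condition 667 or 1109 holds and falls back to the `FALSE` base value otherwise.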
	EnhanceRagGenerationPrompt__enabled: 'name: "EnhanceRagGenerationPrompt__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableInferenceGatewayForDeepseekR31__enabled: 'name: "EnableInferenceGatewayForDeepseekR31__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 854
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 766
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1110
}
id: 0
'
	EnableOpenMaasUsageInsightsCategorization__enabled: 'name: "EnableOpenMaasUsageInsightsCategorization__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 692
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1111
}
id: 0
'
	OpenMaasRequestIdHeaderInjection__enabled: 'name: "OpenMaasRequestIdHeaderInjection__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 860
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 775
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1115
}
id: 0
'
	CustomUnaryChatCompletionServiceDirectory__enabled: 'name: "CustomUnaryChatCompletionServiceDirectory__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableInferenceGatewayForQwen3Coder__enabled: 'name: "EnableInferenceGatewayForQwen3Coder__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	MaasLoraMemcacheSignedUrl__enabled: 'name: "MaasLoraMemcacheSignedUrl__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenAiBatchExplicitCaching__enabled: 'name: "GenAiBatchExplicitCaching__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 859
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 773
}
id: 0
'
	BatchPredictionEnableFileDataSupport__enabled: 'name: "BatchPredictionEnableFileDataSupport__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	V1pTopLevelRetriesConfig__enabled: 'name: "V1pTopLevelRetriesConfig__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	V1pTopLevelRetriesConfig__retries_config: 'name: "V1pTopLevelRetriesConfig__retries_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.CardolanRetriesConfig"
base_value: ""
id: 0
'
	EnablePerBaseModelPredictLongrunningQuota__enabled: 'name: "EnablePerBaseModelPredictLongrunningQuota__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableLlmRerankerForRag__enabled: 'name: "EnableLlmRerankerForRag__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	VpcscSupportedDbCheck__enabled: 'name: "VpcscSupportedDbCheck__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	BatchTrafficModelServerPrefillQueueTimeout__queueing_seconds: 'name: "BatchTrafficModelServerPrefillQueueTimeout__queueing_seconds"
type: INT
base_value: "-1"
modifier {
  value_operator: OVERRIDE
  base_value: "-1"
  condition_group {
  }
  condition_index: 852
}
modifier {
  value_operator: OVERRIDE
  base_value: "300"
  condition_group {
  }
  condition_index: 763
}
id: 0
'
	FlexTrafficModelServerPrefillQueueTimeout__queueing_seconds: 'name: "FlexTrafficModelServerPrefillQueueTimeout__queueing_seconds"
type: INT
base_value: "-1"
modifier {
  value_operator: OVERRIDE
  base_value: "-1"
  condition_group {
  }
  condition_index: 857
}
modifier {
  value_operator: OVERRIDE
  base_value: "600"
  condition_group {
  }
  condition_index: 770
}
id: 0
'
	EnablePromptmineForStreamingGenerateContent__enabled: 'name: "EnablePromptmineForStreamingGenerateContent__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 404
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 19
}
id: 0
'
	FilterOutThoughtSignatureBasedOnHttpHeader__enabled: 'name: "FilterOutThoughtSignatureBasedOnHttpHeader__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 29
}
id: 0
'
	ReturnModelLifecycleStatus__enabled: 'name: "ReturnModelLifecycleStatus__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 80
}
id: 0
'
	TunedModelUseBasemodelInfoBlockModels__list: 'name: "TunedModelUseBasemodelInfoBlockModels__list"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.5-pro-002\"\nelement: \"gemini-1.5-flash-002\"\nelement: \"gemini-2.0-flash-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-1.0-pro-002\"\nelement: \"gemini-1.5-flash\"\nelement: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-pro-preview-0409\"\nelement: \"gemini-1.5-flash-002\"\n"
id: 0
'
	TunedModelUseBasemodelInfo__enabled: 'name: "TunedModelUseBasemodelInfo__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 869
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 792
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1125
}
id: 0
'
	V1pClearDynamicEndpointOverridesOnRetries__enabled: 'name: "V1pClearDynamicEndpointOverridesOnRetries__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 91
}
id: 0
'
	CheckGenaiCacheTenantProjectExists__enabled: 'name: "CheckGenaiCacheTenantProjectExists__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	DynamicSessionExperimentalFeature__allowed_projects: 'name: "DynamicSessionExperimentalFeature__allowed_projects"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: ""
id: 0
'
	DynamicSessionExperimentalFeature__enabled: 'name: "DynamicSessionExperimentalFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableDataplaneSpannerStaleRead__enabled: 'name: "EnableDataplaneSpannerStaleRead__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableGranularBillingOnGemini__enabled: 'name: "EnableGranularBillingOnGemini__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiBidiOverride__enabled: 'name: "GeminiBidiOverride__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiFlashBillingFeature__enabled: 'name: "GeminiFlashBillingFeature__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GeminiMmForwardEucToLlmServer__enabled: 'name: "GeminiMmForwardEucToLlmServer__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiPreviewOptinFeature__enabled: 'name: "GeminiPreviewOptinFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GroundingEndUserCreds__enabled: 'name: "GroundingEndUserCreds__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 147
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 40
}
id: 0
'
	GroundingLoraConfig__base_model_config_map: 'name: "GroundingLoraConfig__base_model_config_map"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.GroundingLoraConfigs"
base_value: ""
id: 0
'
	GroundingLoraConfig__enabled: 'name: "GroundingLoraConfig__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	MmMultithread__enabled: 'name: "MmMultithread__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PreciseStyleConfig__base_model_config_map: 'name: "PreciseStyleConfig__base_model_config_map"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.GroundingLoraConfigs"
base_value: ""
id: 0
'
	PtDynamicTokenCounts__enabled: 'name: "PtDynamicTokenCounts__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PtTrafficTypeField__enabled: 'name: "PtTrafficTypeField__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	SafetyConfigurabilityFeature__enabled: 'name: "SafetyConfigurabilityFeature__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	UseToolApiForGrounding__enabled_model_prefixes: 'name: "UseToolApiForGrounding__enabled_model_prefixes"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-2.0\"\nelement: \"gemini-2.5\"\nelement: \"gemini-3.0\"\n"
id: 0
'
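A `*_prefixes` string list like `UseToolApiForGrounding__enabled_model_prefixes` is presumably consulted by testing whether the request's model ID starts with any listed prefix. A minimal sketch under that assumption (the helper name is hypothetical):

```python
# Prefixes taken from the UseToolApiForGrounding entry above.
PREFIXES = ["gemini-2.0", "gemini-2.5", "gemini-3.0"]

def uses_tool_api(model_id: str, prefixes=PREFIXES) -> bool:
    # Assumed semantics: enabled when the model ID matches any prefix.
    return any(model_id.startswith(p) for p in prefixes)

print(uses_tool_api("gemini-2.5-pro-001"))  # True
print(uses_tool_api("gemini-1.5-pro-002"))  # False
```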
	V1pStreamGenerateContentTopLevelRetries__enabled: 'name: "V1pStreamGenerateContentTopLevelRetries__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	V1pStreamGenerateContentTopLevelRetries__retries_config: 'name: "V1pStreamGenerateContentTopLevelRetries__retries_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.CardolanRetriesConfig"
base_value: ""
id: 0
'
	OpenMaasStreamRawPredictLongRequest__enabled: 'name: "OpenMaasStreamRawPredictLongRequest__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 811
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 645
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 776
}
id: 0
'
	RagConfig__rag_config: 'name: "RagConfig__rag_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_nl_llm.config_proto.RagConfig"
base_value: "augment_prompt {\n  query_generation {\n    query_generation_model: \"/vertex/ulm-24b-vlc-rep-1\"\n    query_generation_model_signature: \"serving_default\"\n    prompt_instruction: \"Your task is to reformulate the prompt to a concise, fully specified, and context-independent query that can be effectively handled by Google Search. You should include time information to the query if the prompt is time sensitive. You should include location information to the query if the prompt is location sensitive, but you must not make up location information if it is not provided. For example:\\nPrompt: \\nwhat is the weather today?\\nOutput: what is the weather today?\\nPrompt: \\nI\\\'m living in London. Suggest me some weekend getaway ideas?\\nOutput: Weekend getaway ideas in London?\\nPrompt: \\nToday is 2023-10-28. Where is EMNLP this year?\\nOutput: Where is EMNLP in 2023?\\nNow reformulate this prompt today.\\nPrompt: \"\n    num_queries: 1\n    query_model_tokenizer {\n      type: ULM_V0_VOCAB_SPM\n    }\n    temperature: 0\n    top_k: 1\n    top_p: 1\n    max_decoding_steps: 256\n    max_num_input_tokens: 4000\n    strip_prefix: true\n  }\n  retrieval {\n    num_results_per_query: 5\n  }\n  final_prompt {\n    final_prompt_config_for_gemini_mm_001 {\n      gemini_multi_modal_config {\n        construct_option: USE_PROMPT_BLOCK_LIST\n        prompt_block_list {\n          preamble_prompt_block {\n            prompt_template: \"Today is ${today}.\\n\\nThe AI assistant now have the new capability to learn real-time information from internet by using the user\\\'s provided search results.\\n\\nGiven a user query and a list of results, write a response as brief as possible.\\n\\nThe response should always exclude contents which are not in the provided sources.\\n\\nAlways be a respectful AI assistant.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"Remember the sources of this query: ${search_query}\\n\\nSources\\n##facts-begin##\\n[${fact_index}] ${fact_title}: ${fact_snippet}\\n${fact_url}\\n##facts-end##\\n\\nDo not include inline citations like [1] [2] [3] in the response.\\n\\nIf any information could change rapidly, include a disclaimer to suggest users to check the internet for the most up-to-date information.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"${original_prompt}\"\n          }\n        }\n      }\n      gemini_multi_modal_config_for_rag {\n        construct_option: USE_PROMPT_BLOCK_LIST\n        prompt_block_list {\n          preamble_prompt_block {\n            prompt_template: \"Today is ${today}.\\n\\nThe AI assistant now have the new capability to learn real-time information from internet by using the user\\\'s provided search results.\\n\\nGiven a user query and a list of results, write a response as brief as possible.\\n\\nThe response should always exclude contents which are not in the provided sources.\\n\\nAlways be a respectful AI assistant.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"Remember the sources of this query: ${search_query}\\n\\nSources\\n##facts-begin##\\n[${fact_index}] ${fact_title}: ${fact_snippet}\\n${fact_url}\\n##facts-end##\\n\\nDo not include inline citations like [1] [2] [3] in the response.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"${original_prompt}\"\n          }\n        }\n      }\n    }\n    final_prompt_config_for_gemini_mm_002 {\n      gemini_multi_modal_config {\n        construct_option: USE_PROMPT_BLOCK_LIST\n        prompt_block_list {\n          preamble_prompt_block {\n            prompt_template: \"Today is ${today}.\\n\\nThe AI assistant now have the new capability to learn real-time information from internet by using the user\\\'s provided search results.\\n\\nGiven a user query and a list of results, write a response as brief as possible.\\n\\nThe response should always exclude contents which are not in the provided sources.\\n\\nAlways be a respectful AI assistant.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"Remember the sources of this query: ${search_query}\\n\\nSources\\n##facts-begin##\\n[${fact_index}] ${fact_title}: ${fact_snippet}\\n${fact_url}\\n##facts-end##\\n\\nDo not include inline citations like [1] [2] [3] in the response.\\n\\nIf any information could change rapidly, include a disclaimer to suggest users to check the internet for the most up-to-date information.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"Respond in the same language as the last message.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"${original_prompt}\"\n          }\n        }\n      }\n      gemini_multi_modal_config_for_rag {\n        construct_option: USE_PROMPT_BLOCK_LIST\n        prompt_block_list {\n          preamble_prompt_block {\n            prompt_template: \"Today is ${today}.\\n\\nThe AI assistant now have the new capability to learn real-time information from internet by using the user\\\'s provided search results.\\n\\nGiven a user query and a list of results, write a response as brief as possible.\\n\\nThe response should always exclude contents which are not in the provided sources.\\n\\nAlways be a respectful AI assistant.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"Remember the sources of this query: ${search_query}\\n\\nSources\\n##facts-begin##\\n[${fact_index}] ${fact_title}: ${fact_snippet}\\n${fact_url}\\n##facts-end##\\n\\nDo not include inline citations like [1] [2] [3] in the response.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"Respond in the same language as the last message.\"\n          }\n          prompt_blocks {\n            user_prompt_template: \"${original_prompt}\"\n          }\n        }\n      }\n    }\n    final_prompt_config_for_3rd_party_model_default {\n      gemini_multi_modal_config_for_rag {\n        prompt_template: \"Remember today is ${today}. Given a user query and a list of sources,\\nwrite a response supported by the given sources.\\nSources:\\n##facts-begin##\\n[${fact_index}] ${fact_title}: ${fact_snippet}\\nLink: ${fact_url}\\n##facts-end##\\nUser query:\\n${original_prompt}\"\n        construct_option: USE_PROMPT_TEMPLATE\n      }\n    }\n    enhanced_final_prompt_with_citations {\n      gemini_multi_modal_config_for_rag {\n        prompt_template: \"Remember today is ${today}. You are an expert at answering user queries. You are provided with a user query and a list of sources.\\nFirst, carefully evaluate the relevance of each provided source to the user query. If a source is irrelevant, disregard it.\\n\\n**Sources:**\\n##facts-begin##\\n[${fact_index}] ${fact_title}: ${fact_snippet}\\nLink: ${fact_url}\\n##facts-end##\\n\\n**User Query:**\\n${original_prompt}\\n\\n**Instructions:**\\n\\n1.  **Relevance Check:** Determine if the provided sources are directly relevant to answering the user query.\\n2.  **Source-Based Response & Citation:** If relevant sources are found, synthesize information from them to provide a comprehensive and accurate response. You must cite your sources using the following format: [${fact_index}]. Place the citation immediately after the relevant sentence, with no space between the last word and the brackets. Include the full fact index and fact url at the end of your response for each cited source, each starting on a new line.\\n3.  **System Instruction Fallback:** If NO relevant sources are found, state that \\\"The provided RAG sources are not relevant to the user query.\\\" Then, answer the query based on your own knowledge and system instructions. Do not mention that you used your own knowledge, just answer the query.\\n4.  **Conciseness and Accuracy:** Prioritize providing accurate and concise information. Do not hallucinate. Do not repeat the user query in your response.\\n\\n**Here is an example:**\\n##example-begin##\\n**Sources:**\\n##facts-begin##\\n[1] Intro to RAG: Retrieval augmented generation (RAG) is a technique for improving the quality of large language model (LLM) responses by incorporating relevant information from a knowledge base. \\nLink: gs://rag/file1.pdf\\n[2] Cooking 101: Cooking with a wok is recommended because it\\\'s high-heat capability allow for rapid, even cooking.\\nLink: gs://cooking_wok/file2.pdf\\n[3] RAG deep-dive: RAG is useful because it helps LLM ground its responses using external knowledge.\\nLink: gs://rag/file3.pdf\\n##facts-end##\\n\\n**User Query:**\\nWhat is RAG and why is it useful?\\n\\nOutput:\\nRAG is a technique for improving the quality of large language model (LLM) responses by incorporating relevant information from a knowledge base[1]. It is useful because it helps LLM ground its responses using external knowledge[3].\\n\\n[1] gs://rag/file1.pdf\\n[3] gs://rag/file3.pdf\\n##example-end##\\n\"\n        construct_option: USE_PROMPT_TEMPLATE\n      }\n    }\n  }\n}\ncorrobate_content {\n  corroboration {\n    nli {\n      stub_name: \"vertex-us-central1\"\n      nli_model: \"projects/265104255505/locations/us-central1/endpoints/924464978587549696\"\n      nli_batch_size: 16\n    }\n    corroboration_threshold: 0.6\n    max_citations_per_claim: 1\n    citation_threshold: 0.6\n  }\n}\n"
id: 0
'
	CheetahOnboardedModelGardenPlaygroundPublisherEndpoint__cheetah_sd_endpoint: 'name: "CheetahOnboardedModelGardenPlaygroundPublisherEndpoint__cheetah_sd_endpoint"
type: STRING
base_value: "projects/cheetah-autopush/locations/us-central1/namespaces/namespace-cheetah-cluster/services/service-cheetah-cluster/endpoints/endpoint-cheetah-cluster"
modifier {
  value_operator: OVERRIDE
  base_value: "projects/cheetah-prod/locations/us-central1/namespaces/namespace-cheetah-cluster/services/service-cheetah-cluster/endpoints/endpoint-cheetah-cluster"
  condition_group {
  }
  condition_index: 0
}
id: 0
'
	CheetahOnboardedModelGardenPlaygroundPublisherEndpoint__onboarded_publisher_endpoints: 'name: "CheetahOnboardedModelGardenPlaygroundPublisherEndpoint__onboarded_publisher_endpoints"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	CodeyText2sqlFeature__enabled: 'name: "CodeyText2sqlFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnablePredictApiAllowlistFor1p__enabled: 'name: "EnablePredictApiAllowlistFor1p__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	Musicgeneration__billing_enabled: 'name: "Musicgeneration__billing_enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	Musicgeneration__enabled: 'name: "Musicgeneration__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	RetryHarpoonNoResponseError__enabled: 'name: "RetryHarpoonNoResponseError__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	TestFeature__enabled: 'name: "TestFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	AnthropicLongRequest__allowlisted_endpoints: 'name: "AnthropicLongRequest__allowlisted_endpoints"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: "element: 1000000000000000000\nelement: 1000000000000000000\nelement: 1000000000000000000\nelement: 2056572728080596992\nelement: 2783849092319543296\nelement: 2094853324913246208\nelement: 8318773034357882880\nelement: 5084803672836145152\nelement: 3262791959332257792\nelement: 2956613155368730624\nelement: 2964551629321273344\nelement: 6093889265320591360\nelement: 5730894297503367168\nelement: 9082071598546026496\nelement: 8267030017154744320\nelement: 3070263075262169088\nelement: 426386211205021696\nelement: 4661465307863318528\nelement: 3367500650668621824\nelement: 7000902795352080384\nelement: 5831145568700727296\nelement: 3310708676070735872\nelement: 2124753444118986752\nelement: 6349059525848334336\nelement: 3866481017585926144\nelement: 5373590401871511552\nelement: 8704261811056148480\nelement: 7616642501046173696\nelement: 7616642501046173696\nelement: 1840516494199357440\nelement: 2868920505983827968\nelement: 4057358435191095296\nelement: 1818926483876347904\nelement: 7796843660745637888\nelement: 9123774975075942400\nelement: 4095025504535445504\nelement: 3733908502578462720\nelement: 8368426979468247040\nelement: 4994452404335280128\nelement: 5122655460133961728\nelement: 775986928372678656\nelement: 5130022188040060928\nelement: 7974181692166373376\nelement: 6775085097239445504\nelement: 2949677436020719616\nelement: 7331037758661591040\nelement: 1243851116223922176\nelement: 2435554594965684224\nelement: 6428620187234205696\nelement: 4061862034818465792\nelement: 5221769836308201472\nelement: 8527952922519011328\nelement: 2667727470286864384\nelement: 6238483840466157568\nelement: 1192064118555672576\nelement: 4206348857824509952\nelement: 216713741834649600\nelement: 115998476730368000\nelement: 1541519700188463104\nelement: 1539962791723532288\nelement: 1895188610378891264\nelement: 13497604742578176\nelement: 8232593312972800000\nelement: 7752608108936953856\nelement: 8131216141868597248\nelement: 4918792810204758016\nelement: 5070362687116935168\nelement: 1609210034040864768\nelement: 4625183623169966080\nelement: 5637006999607574528\nelement: 4533871381506424832\nelement: 6365083808311541760\nelement: 3632522535381237760\nelement: 8539418629773459456\nelement: 4691251077859770368\nelement: 2314546743259168768\nelement: 4667226748792864768\nelement: 1405492519646527488\nelement: 4735407464831254528\nelement: 723922853774229504\nelement: 2561409093927436288\nelement: 2121727588119347200\nelement: 9149291341421740032\nelement: 6072734661602181120\nelement: 1234567890\nelement: 1034290797039583232\nelement: 8223287046555303936\nelement: 3815094242150187008\nelement: 5464937827907141632\nelement: 8292536487895891968\nelement: 646171988527677440\nelement: 4854594525282172928\nelement: 5228701157609701376\nelement: 8159700090097762304\nelement: 6464930459229880320\nelement: 2843321676265947136\nelement: 2100911633982291968\nelement: 260606246015467520\nelement: 604047499044323328\nelement: 7759282144517554176\nelement: 450195035992883200\nelement: 2978308718808006656\nelement: 2680367455959777280\nelement: 4939617560434835456\nelement: 9170281018395983872\nelement: 5196177603660087296\nelement: 74445733293457408\nelement: 2187660902390562816\nelement: 4325243448182439936\nelement: 12345\nelement: 8296426560034963456\nelement: 2470457492077805568\nelement: 2855647201613316096\nelement: 2000000000000000000\nelement: 2000000000000000001\n"
id: 0
'
	DisaggregationLora__skip_loading_weights: 'name: "DisaggregationLora__skip_loading_weights"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableAutomlTextDeprecation__enabled: 'name: "EnableAutomlTextDeprecation__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenaiAllowlistBasedFeature__bypass_safetycat: 'name: "GenaiAllowlistBasedFeature__bypass_safetycat"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 373
}
id: 0
'
	GenaiAllowlistBasedFeature__disable_csam_filter: 'name: "GenaiAllowlistBasedFeature__disable_csam_filter"
type: BOOL
base_value: "FALSE"
id: 0
'
	ImplicitCacheFeature__blocked_model_ids: 'name: "ImplicitCacheFeature__blocked_model_ids"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.0-pro-001\"\nelement: \"gemini-1.0-pro-002\"\nelement: \"gemini-2.5-flash-preview-image-generation\"\nelement: \"gemini-2.5-flash-image-generation\"\nelement: \"gemini-3.0-pro-image-preview\"\n"
id: 0
'
	ImplicitCacheFeature__opt_out_folders: 'name: "ImplicitCacheFeature__opt_out_folders"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: ""
id: 0
'
	ImplicitCacheFeature__opt_out_organizations: 'name: "ImplicitCacheFeature__opt_out_organizations"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: "element: 433637338589\nelement: 654067872289\nelement: 219343716865\nelement: 700634052600\n"
id: 0
'
	ImplicitCacheFeature__projects_blocked_for_no_secondary_project: 'name: "ImplicitCacheFeature__projects_blocked_for_no_secondary_project"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: "element: 871119526263\nelement: 620953470579\nelement: 891916320499\nelement: 890378836652\nelement: 847059550923\nelement: 388607615003\nelement: 50385276824\nelement: 702376180097\nelement: 337989173682\nelement: 376707758841\nelement: 477064308833\nelement: 252242268409\nelement: 24929004092\nelement: 542083300573\nelement: 1009623253686\nelement: 714204944639\nelement: 820995109441\n"
id: 0
'
	PinnacleSpillover__is_spillover_request: 'name: "PinnacleSpillover__is_spillover_request"
type: BOOL
base_value: "FALSE"
id: 0
'
	PtCharacterQuota__enabled: 'name: "PtCharacterQuota__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	RhudaurUnaryGrpcProjectAllowlist__enabled: 'name: "RhudaurUnaryGrpcProjectAllowlist__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableDebugMetricSloExclude__enabled: 'name: "EnableDebugMetricSloExclude__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableDebugMetricSloExclude__slo_excluded_regex: 'name: "EnableDebugMetricSloExclude__slo_excluded_regex"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"(syntax error|Peer is not within the traffic residency constraints)\"\nelement: \"^gemini-2.5-pro.*::(945553600981|524636045653|743006342172|89950862312)::.*cancelled\"\nelement: \"(invalid keyword arguments|Model emitted tool code when neither function calling nor tool code decoding is enabled)\"\nelement: \"(612735420533|801452371447)\"\n"
id: 0
'
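The `slo_excluded_regex` list above presumably holds patterns matched against error text so that matching failures are excluded from SLO accounting. A quick sanity check with two of the listed patterns (the function name and matching semantics are assumptions):

```python
import re

# Subset of the EnableDebugMetricSloExclude__slo_excluded_regex patterns above.
SLO_EXCLUDED = [
    r"(syntax error|Peer is not within the traffic residency constraints)",
    r"(invalid keyword arguments|Model emitted tool code when neither "
    r"function calling nor tool code decoding is enabled)",
]

def is_slo_excluded(message: str) -> bool:
    # Assumed semantics: any pattern matching anywhere in the message
    # excludes that failure from SLO accounting.
    return any(re.search(p, message) for p in SLO_EXCLUDED)

print(is_slo_excluded("generation failed: syntax error near token"))  # True
print(is_slo_excluded("deadline exceeded"))                           # False
```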
	EnableGenerativeAiRequestThroughput__enabled: 'name: "EnableGenerativeAiRequestThroughput__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 14
}
id: 0
'
	EnableQuotaCheckInfoExtension__enabled: 'name: "EnableQuotaCheckInfoExtension__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 856
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 768
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1113
}
id: 0
'
	GrpcToStubby__bidi_generate_content_using_stubby: 'name: "GrpcToStubby__bidi_generate_content_using_stubby"
type: BOOL
base_value: "FALSE"
id: 0
'
	GrpcToStubby__stream_generate_multi_modal_using_stubby: 'name: "GrpcToStubby__stream_generate_multi_modal_using_stubby"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenAiBatchCachingModelBlocklist__explicit_cache_model_blocklist: 'name: "GenAiBatchCachingModelBlocklist__explicit_cache_model_blocklist"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-pro\"\nelement: \"gemini-ultra\"\nelement: \"gemini-1.0-pro\"\nelement: \"gemini-1.0-pro-001\"\nelement: \"gemini-1.0-pro-002\"\nelement: \"gemini-1.0-pro-vision\"\nelement: \"gemini-1.5-flash-001\"\nelement: \"gemini-1.5-flash-001-batchx\"\nelement: \"gemini-1.5-flash-002\"\nelement: \"gemini-1.5-flash-8b-002\"\nelement: \"gemini-1.5-flash-preview-0514\"\nelement: \"gemini-1.5-pro-001\"\nelement: \"gemini-1.5-pro-002\"\nelement: \"gemini-1.5-pro-preview-0215\"\nelement: \"gemini-1.5-pro-preview-0409\"\nelement: \"gemini-1.5-pro-preview-0514\"\nelement: \"gemini-2.0-flash-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-2.0-flash-lite-preview-02-05\"\nelement: \"gemini-2.5-flash-001\"\nelement: \"gemini-2.5-flash-preview-04-17\"\nelement: \"gemini-2.5-pro-001\"\nelement: \"gemini-2.5-pro-preview-03-25\"\nelement: \"gemini-2.5-pro-preview-05-06\"\nelement: \"gemini-2.5-pro-preview-06-05\"\n"
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"gemini-pro\"\nelement: \"gemini-ultra\"\nelement: \"gemini-1.0-pro\"\nelement: \"gemini-1.0-pro-001\"\nelement: \"gemini-1.0-pro-002\"\nelement: \"gemini-1.0-pro-vision\"\nelement: \"gemini-1.5-flash-001\"\nelement: \"gemini-1.5-flash-001-batchx\"\nelement: \"gemini-1.5-flash-002\"\nelement: \"gemini-1.5-flash-8b-002\"\nelement: \"gemini-1.5-flash-preview-0514\"\nelement: \"gemini-1.5-pro-001\"\nelement: \"gemini-1.5-pro-002\"\nelement: \"gemini-1.5-pro-preview-0215\"\nelement: \"gemini-1.5-pro-preview-0409\"\nelement: \"gemini-1.5-pro-preview-0514\"\nelement: \"gemini-2.0-flash-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-2.0-flash-lite-preview-02-05\"\nelement: \"gemini-2.5-flash-001\"\nelement: \"gemini-2.5-flash-preview-04-17\"\nelement: \"gemini-2.5-pro-001\"\nelement: \"gemini-2.5-pro-preview-03-25\"\nelement: \"gemini-2.5-pro-preview-05-06\"\nelement: \"gemini-2.5-pro-preview-06-05\"\n"
  condition_group {
  }
  condition_index: 381
}
id: 0
'
	GenAiBatchCachingModelBlocklist__model_blocklist: 'name: "GenAiBatchCachingModelBlocklist__model_blocklist"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-pro\"\nelement: \"gemini-ultra\"\nelement: \"gemini-1.0-pro\"\nelement: \"gemini-1.0-pro-001\"\nelement: \"gemini-1.0-pro-002\"\nelement: \"gemini-1.0-pro-vision\"\nelement: \"gemini-1.5-flash-001\"\nelement: \"gemini-1.5-flash-001-batchx\"\nelement: \"gemini-1.5-flash-002\"\nelement: \"gemini-1.5-flash-8b-002\"\nelement: \"gemini-1.5-flash-preview-0514\"\nelement: \"gemini-1.5-pro-001\"\nelement: \"gemini-1.5-pro-002\"\nelement: \"gemini-1.5-pro-preview-0215\"\nelement: \"gemini-1.5-pro-preview-0409\"\nelement: \"gemini-1.5-pro-preview-0514\"\nelement: \"gemini-2.0-flash-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-2.0-flash-lite-preview-02-05\"\nelement: \"gemini-2.5-flash-preview-04-17\"\nelement: \"gemini-2.5-pro-preview-03-25\"\nelement: \"gemini-2.5-pro-preview-05-06\"\nelement: \"gemini-2.5-pro-preview-06-05\"\n"
modifier {
  value_operator: OVERRIDE
  base_value: ""
  condition_group {
  }
  condition_index: 381
}
id: 0
'
	PinnacleAllowedPartnerModelFeaturesOrgPolicy__enabled: 'name: "PinnacleAllowedPartnerModelFeaturesOrgPolicy__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 863
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 779
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1118
}
id: 0
'
	PinnacleConfigurableWebSearchPrecharge__enabled: 'name: "PinnacleConfigurableWebSearchPrecharge__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 864
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 780
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1119
}
id: 0
'
	PinnacleDisableCsamFiltering__enabled: 'name: "PinnacleDisableCsamFiltering__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 865
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 781
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1120
}
id: 0
'
	PinnacleMCPTool__enabled: 'name: "PinnacleMCPTool__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleWebFetchTool__enabled: 'name: "PinnacleWebFetchTool__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleWebSearchTool__enabled: 'name: "PinnacleWebSearchTool__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 868
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 787
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1124
}
id: 0
'
	PinnacleTrustQosOverride__trust_qos_override: 'name: "PinnacleTrustQosOverride__trust_qos_override"
type: INT
base_value: "0"
id: 0
'
	FlexApiConfig__enabled: 'name: "FlexApiConfig__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	FlexApiConfig__flex_api_config: 'name: "FlexApiConfig__flex_api_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.FlexApiConfig"
base_value: "max_payload_size_bytes: 1073741824\n"
id: 0
'
	StreamAbortErrorRewrite__enabled: 'name: "StreamAbortErrorRewrite__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 84
}
id: 0
'
	CaptureIntegrationTestInfo__enabled: 'name: "CaptureIntegrationTestInfo__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 4
}
id: 0
'
	EnableCacheAsideInEndpointsModule__enabled: 'name: "EnableCacheAsideInEndpointsModule__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 13
}
id: 0
'
	FineTunedEndpointsProjectAllowlisting__enabled: 'name: "FineTunedEndpointsProjectAllowlisting__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 30
}
id: 0
'
	FinetunePredictionAccessCheck__enabled: 'name: "FinetunePredictionAccessCheck__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 31
}
id: 0
'
	TunedModelUseBaseModelToQCDMap__textproto: 'name: "TunedModelUseBaseModelToQCDMap__textproto"
type: STRING
base_value: "\n        quota_custom_dimension_map {\n          key: \"gemini-2.5-flash\"\n          value {\n            base_model_id_and_version: {\n              base_model_id: \"gemini-2.5-flash-ga\"\n              base_model_version: \"default[-env]\"\n            }\n          }\n        }\n        quota_custom_dimension_map {\n          key: \"gemini-2.5-pro\"\n          value {\n            base_model_id_and_version: {\n              base_model_id: \"gemini-2.5-pro-ga\"\n              base_model_version: \"default[-env]\"\n            }\n          }\n        }\n        quota_custom_dimension_map {\n          key: \"gemini-2.5-flash-lite\"\n          value {\n            base_model_id_and_version: {\n              base_model_id: \"gemini-2.5-flash-lite\"\n              base_model_version: \"default[-env]\"\n            }\n          }\n        }\n        "
id: 0
'
	TunedModelUseBaseModelToQCD__enabled: 'name: "TunedModelUseBaseModelToQCD__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 850
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 652
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 791
}
id: 0
'
	CheckOrgPolicyAllPublisherModels__enabled: 'name: "CheckOrgPolicyAllPublisherModels__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	CheckOrgPolicy__enabled: 'name: "CheckOrgPolicy__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	CheckOrgPolicy__enabled_for_allowed_projects: 'name: "CheckOrgPolicy__enabled_for_allowed_projects"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 682
}
id: 0
'
	EnableModelRegistryDependencyRemoval__enabled: 'name: "EnableModelRegistryDependencyRemoval__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	EnablePredictionAccessRuleGwsLog__enabled: 'name: "EnablePredictionAccessRuleGwsLog__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableServingSpecCache__enabled: 'name: "EnableServingSpecCache__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableServingSpecsSpannerStaleRead__enabled: 'name: "EnableServingSpecsSpannerStaleRead__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GoogleSearchGroundingOptOut__folders: 'name: "GoogleSearchGroundingOptOut__folders"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: ""
id: 0
'
	GoogleSearchGroundingOptOut__orgs: 'name: "GoogleSearchGroundingOptOut__orgs"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: "element: 669358817895\nelement: 595994669073\nelement: 261070191423\nelement: 472823025646\n"
id: 0
'
	GoogleSearchGroundingOptOut__projects: 'name: "GoogleSearchGroundingOptOut__projects"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: ""
id: 0
'
	GroundingCrmSetting__enabled: 'name: "GroundingCrmSetting__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	OrgLevelAllowlist__enabled: 'name: "OrgLevelAllowlist__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	OrgPolicyForGroundingSources__enabled: 'name: "OrgPolicyForGroundingSources__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 861
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 777
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1116
}
id: 0
'
	EnableLiveApiAllowedModels__allowed_models: 'name: "EnableLiveApiAllowedModels__allowed_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-2.0-flash-live-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-2.0-flash-live-preview-04-09\"\nelement: \"chirp3-v1\"\nelement: \"gemini-2.5-flash-live-001\"\nelement: \"gemini-2.5-flash-live-preview-05-20\"\nelement: \"gemini-2.5-flash-preview-dialog-0520\"\nelement: \"gemini-2.5-flash-preview-dialog\"\nelement: \"gemini-live-2.5-flash\"\nelement: \"gemini-2.5-flash-auto-test\"\nelement: \"gemini-2.5-flash-manual-test\"\nelement: \"gemini-2.5-flash-preview-native-audio-dialog\"\nelement: \"gemini-live-2.5-flash-preview-native-audio\"\nelement: \"gemini-live-2.5-flash-native-audio-preview-07-29\"\nelement: \"gemini-live-2.5-flash-preview-native-audio-09-09\"\nelement: \"gemini-live-2.5-flash-preview-native-audio-09-17\"\nelement: \"gemini-live-2.5-flash-preview-native-audio-09-2025\"\nelement: \"gemini-live-2.5-flash-native-audio\"\nelement: \"gemini-live-2.5-flash-native-audio-exp\"\nelement: \"gemini-2.5-flash-s2st-exp-11-2025\"\n"
id: 0
'
	EnableLiveApiAllowedModels__blocked_models_for_generate_content: 'name: "EnableLiveApiAllowedModels__blocked_models_for_generate_content"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-2.0-flash-live-001\"\nelement: \"gemini-2.0-flash-live-preview-04-09\"\nelement: \"gemini-2.5-flash-live-001\"\nelement: \"gemini-2.5-flash-live-preview-05-20\"\nelement: \"gemini-2.5-flash-preview-dialog-0520\"\nelement: \"gemini-2.5-flash-preview-dialog\"\nelement: \"gemini-live-2.5-flash\"\nelement: \"gemini-2.5-flash-preview-native-audio-dialog\"\nelement: \"gemini-live-2.5-flash-preview-native-audio\"\nelement: \"gemini-live-2.5-flash-native-audio-preview-07-29\"\nelement: \"gemini-live-2.5-flash-preview-native-audio-09-09\"\nelement: \"gemini-live-2.5-flash-preview-native-audio-09-17\"\nelement: \"gemini-live-2.5-flash-preview-native-audio-09-2025\"\nelement: \"gemini-live-2.5-flash-native-audio\"\nelement: \"gemini-live-2.5-flash-native-audio-exp\"\nelement: \"gemini-2.5-flash-s2st-exp-11-2025\"\n"
id: 0
'
	EnableLiveApiAllowedModels__enabled: 'name: "EnableLiveApiAllowedModels__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenaiBillingV2EnabledBaseModelToBillingProfileIdMap__billing_v2_enabled_base_model_to_billing_profile_id_map: 'name: "GenaiBillingV2EnabledBaseModelToBillingProfileIdMap__billing_v2_enabled_base_model_to_billing_profile_id_map"
type: STRING
base_value: ""
id: 0
'
	GenaiLongContextBillingFeature__long_context_billing_enabled_models_window_size_128k: 'name: "GenaiLongContextBillingFeature__long_context_billing_enabled_models_window_size_128k"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.5-flash\"\nelement: \"gemini-1.5-pro\"\nelement: \"gemini-2.0-flash\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\n"
id: 0
'
	GenaiLongContextBillingFeature__long_context_billing_enabled_models_window_size_200k: 'name: "GenaiLongContextBillingFeature__long_context_billing_enabled_models_window_size_200k"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-2.5-flash-preview\"\nelement: \"gemini-2.5-pro-preview\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash-ga\"\nelement: \"gemini-2.5-pro-ga\"\nelement: \"gemini-2.5-flash-image\"\nelement: \"gemini-2.5-flash-image-ga\"\nelement: \"computer-use-preview\"\nelement: \"gemini-2.5-flash-manual-test\"\nelement: \"gemini-3-flash\"\nelement: \"gemini-3.0-pro\"\n"
id: 0
'
	DsqOrgLimitDenylist__denylist: 'name: "DsqOrgLimitDenylist__denylist"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"organizations/1009801580386\"\nelement: \"organizations/1009911481441\"\nelement: \"organizations/1010302833757\"\nelement: \"organizations/1015867442176\"\nelement: \"organizations/1020741216091\"\nelement: \"organizations/1022647817921\"\nelement: \"organizations/1023843506312\"\nelement: \"organizations/1032833794561\"\nelement: \"organizations/1035901152068\"\nelement: \"organizations/1037670760632\"\nelement: \"organizations/1038829057055\"\nelement: \"organizations/1047376210326\"\nelement: \"organizations/1051569445158\"\nelement: \"organizations/1053470820561\"\nelement: \"organizations/1057308932815\"\nelement: \"organizations/1064420216623\"\nelement: \"organizations/1066591588807\"\nelement: \"organizations/1082365578311\"\nelement: \"organizations/1082881800234\"\nelement: \"organizations/1092843478\"\nelement: \"organizations/1097930952118\"\nelement: \"organizations/117883603492\"\nelement: \"organizations/127988581020\"\nelement: \"organizations/131124864614\"\nelement: \"organizations/132648631348\"\nelement: \"organizations/132681594866\"\nelement: \"organizations/134289361963\"\nelement: \"organizations/138507926542\"\nelement: \"organizations/138921233419\"\nelement: \"organizations/139404252524\"\nelement: \"organizations/139548389842\"\nelement: \"organizations/139686217949\"\nelement: \"organizations/143569286330\"\nelement: \"organizations/147531538846\"\nelement: \"organizations/152247601662\"\nelement: \"organizations/158267417662\"\nelement: \"organizations/162685228972\"\nelement: \"organizations/168607774456\"\nelement: \"organizations/168750395057\"\nelement: \"organizations/172926890473\"\nelement: \"organizations/183649412752\"\nelement: \"organizations/191087036201\"\nelement: \"organizations/191223837169\"\nelement: \"organizations/196835236469\"\nelement: \"organizations/197534888130\"\nelement: \"organizations/199295775674\"\nelement: \"organizations/199490750663\"\nelement: \"organizations/203358828897\"\nelement: \"organizations/203670184675\"\nelement: \"organizations/2048386265\"\nelement: \"organizations/220912960087\"\nelement: \"organizations/221360738765\"\nelement: \"organizations/225313812765\"\nelement: \"organizations/226103390169\"\nelement: \"organizations/227765910365\"\nelement: \"organizations/231382700574\"\nelement: \"organizations/233505007937\"\nelement: \"organizations/233526418605\"\nelement: \"organizations/242575240052\"\nelement: \"organizations/250166811713\"\nelement: \"organizations/250443461763\"\nelement: \"organizations/250916129461\"\nelement: \"organizations/255758916152\"\nelement: \"organizations/255908317825\"\nelement: \"organizations/259300546097\"\nelement: \"organizations/260901610912\"\nelement: \"organizations/263399396574\"\nelement: \"organizations/269666726474\"\nelement: \"organizations/280647637787\"\nelement: \"organizations/281140095463\"\nelement: \"organizations/283138423919\"\nelement: \"organizations/292095750786\"\nelement: \"organizations/302681460499\"\nelement: \"organizations/304136206388\"\nelement: \"organizations/307305912835\"\nelement: \"organizations/308097475064\"\nelement: \"organizations/311517893934\"\nelement: \"organizations/311978066853\"\nelement: \"organizations/318973726235\"\nelement: \"organizations/319215453412\"\nelement: \"organizations/320851063511\"\nelement: \"organizations/321254609680\"\nelement: \"organizations/32136287506\"\nelement: \"organizations/321997540064\"\nelement: \"organizations/324784508963\"\nelement: \"organizations/326595275970\"\nelement: \"organizations/340238846692\"\nelement: \"organizations/34424656084\"\nelement: \"organizations/348237442628\"\nelement: \"organizations/350605338743\"\nelement: \"organizations/350979609089\"\nelement: \"organizations/352790554748\"\nelement: \"organizations/363385237196\"\nelement: \"organizations/366839382793\"\nelement: \"organizations/373557989320\"\nelement: \"organizations/378823928906\"\nelement: \"organizations/379193874099\"\nelement: \"organizations/379860269779\"\nelement: \"organizations/380471816720\"\nelement: \"organizations/381377178038\"\nelement: \"organizations/382735677332\"\nelement: \"organizations/384350719952\"\nelement: \"organizations/390865792109\"\nelement: \"organizations/392169565828\"\nelement: \"organizations/396249640222\"\nelement: \"organizations/403286592892\"\nelement: \"organizations/408066853528\"\nelement: \"organizations/408710614801\"\nelement: \"organizations/413422510926\"\nelement: \"organizations/413804088605\"\nelement: \"organizations/42278113357\"\nelement: \"organizations/42768659627\"\nelement: \"organizations/430199081660\"\nelement: \"organizations/430586623214\"\nelement: \"organizations/431560455620\"\nelement: \"organizations/43405512128\"\nelement: \"organizations/436824295055\"\nelement: \"organizations/439979396701\"\nelement: \"organizations/441187628316\"\nelement: \"organizations/445873321181\"\nelement: \"organizations/470916834186\"\nelement: \"organizations/473182925470\"\nelement: \"organizations/474444709631\"\nelement: \"organizations/485752873670\"\nelement: \"organizations/487663624827\"\nelement: \"organizations/493595111382\"\nelement: \"organizations/494667196037\"\nelement: \"organizations/496896363320\"\nelement: \"organizations/503062733663\"\nelement: \"organizations/5033357130\"\nelement: \"organizations/503829759623\"\nelement: \"organizations/505723047412\"\nelement: \"organizations/514751170291\"\nelement: \"organizations/521631759585\"\nelement: \"organizations/525307871097\"\nelement: \"organizations/529974653550\"\nelement: \"organizations/531195336225\"\nelement: \"organizations/531591688136\"\nelement: \"organizations/532616220254\"\nelement: \"organizations/53989931248\"\nelement: \"organizations/540592142017\"\nelement: \"organizations/541175756436\"\nelement: \"organizations/543675126372\"\nelement: \"organizations/545631219932\"\nelement: \"organizations/54643501348\"\nelement: \"organizations/549103496072\"\nelement: \"organizations/551534052094\"\nelement: \"organizations/552306434765\"\nelement: \"organizations/555217898458\"\nelement: \"organizations/556010801533\"\nelement: \"organizations/55940165682\"\nelement: \"organizations/565738043073\"\nelement: \"organizations/577142479039\"\nelement: \"organizations/584176229162\"\nelement: \"organizations/585994414067\"\nelement: \"organizations/592408874637\"\nelement: \"organizations/595994669073\"\nelement: \"organizations/59644976444\"\nelement: \"organizations/599510563883\"\nelement: \"organizations/600466516300\"\nelement: \"organizations/607598962355\"\nelement: \"organizations/611307578288\"\nelement: \"organizations/613962546144\"\nelement: \"organizations/614831433169\"\nelement: \"organizations/621002717677\"\nelement: \"organizations/621388623072\"\nelement: \"organizations/622925888631\"\nelement: \"organizations/628553795380\"\nelement: \"organizations/637987714668\"\nelement: \"organizations/642708779950\"\nelement: \"organizations/642823832414\"\nelement: \"organizations/649389749562\"\nelement: \"organizations/650727870094\"\nelement: \"organizations/651190608786\"\nelement: \"organizations/658938597457\"\nelement: \"organizations/661595339816\"\nelement: \"organizations/66278518872\"\nelement: \"organizations/664039355083\"\nelement: \"organizations/674375320999\"\nelement: \"organizations/67654422471\"\nelement: \"organizations/676993294933\"\nelement: \"organizations/678556080379\"\nelement: \"organizations/679844191709\"\nelement: \"organizations/683655960516\"\nelement: \"organizations/701374442558\"\nelement: \"organizations/703839445546\"\nelement: \"organizations/704070403148\"\nelement: \"organizations/705241220036\"\nelement: \"organizations/707541429585\"\nelement: \"organizations/721679228784\"\nelement: \"organizations/722026089310\"\nelement: \"organizations/722478225383\"\nelement: \"organizations/723901964575\"\nelement: \"organizations/727165143864\"\nelement: \"organizations/745849308593\"\nelement: \"organizations/753868874903\"\nelement: \"organizations/754862027369\"\nelement: \"organizations/757044605357\"\nelement: \"organizations/766273397686\"\nelement: \"organizations/766905427223\"\nelement: \"organizations/767040907622\"\nelement: \"organizations/767403461635\"\nelement: \"organizations/76740709494\"\nelement: \"organizations/771197454704\"\nelement: \"organizations/773567514706\"\nelement: \"organizations/773863481222\"\nelement: \"organizations/778115052724\"\nelement: \"organizations/778357758552\"\nelement: \"organizations/78200021690\"\nelement: \"organizations/784919459617\"\nelement: \"organizations/796508071153\"\nelement: \"organizations/799325545314\"\nelement: \"organizations/800913124495\"\nelement: \"organizations/806711562222\"\nelement: \"organizations/807596908187\"\nelement: \"organizations/808840416738\"\nelement: \"organizations/809067657803\"\nelement: \"organizations/818562151839\"\nelement: \"organizations/825417849120\"\nelement: \"organizations/826065770220\"\nelement: \"organizations/833841093242\"\nelement: \"organizations/835771383018\"\nelement: \"organizations/84585637340\"\nelement: \"organizations/851492098837\"\nelement: \"organizations/863762365098\"\nelement: \"organizations/864731287123\"\nelement: \"organizations/871437962476\"\nelement: \"organizations/884436213798\"\nelement: \"organizations/888193160914\"\nelement: \"organizations/890266518737\"\nelement: \"organizations/89033304070\"\nelement: \"organizations/892857337257\"\nelement: \"organizations/896426431482\"\nelement: \"organizations/900134926431\"\nelement: \"organizations/901235868036\"\nelement: \"organizations/902703590188\"\nelement: \"organizations/903552208814\"\nelement: \"organizations/905031005731\"\nelement: \"organizations/908505481705\"\nelement: \"organizations/914605254163\"\nelement: \"organizations/916152866672\"\nelement: \"organizations/938542059030\"\nelement: \"organizations/952898125935\"\nelement: \"organizations/953419769142\"\nelement: \"organizations/956776603191\"\nelement: \"organizations/959645619243\"\nelement: \"organizations/959935394837\"\nelement: \"organizations/965783314391\"\nelement: \"organizations/973762422266\"\nelement: \"organizations/975681337979\"\nelement: \"organizations/975730288261\"\nelement: \"organizations/977445592689\"\nelement: \"organizations/992524860932\"\nelement: \"organizations/992617106989\"\nelement: \"organizations/994621639283\"\nelement: \"organizations/99808423033\"\n"
id: 0
'
	EnablePtDemandServiceFetchGlobalLimits__enabled: 'name: "EnablePtDemandServiceFetchGlobalLimits__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 20
}
id: 0
'
	GeCacheRoutingAlg1ForPT__enabled: 'name: "GeCacheRoutingAlg1ForPT__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 858
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 772
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1114
}
id: 0
'
	GeDsqRetryDiffLocation__enabled: 'name: "GeDsqRetryDiffLocation__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 38
}
id: 0
'
	PinnacleQsRegionalizationPaygoTiers__enabled: 'name: "PinnacleQsRegionalizationPaygoTiers__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 802
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 646
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 784
}
id: 0
'
	PinnacleQsRegionalizationPt__enabled: 'name: "PinnacleQsRegionalizationPt__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 803
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 647
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 785
}
id: 0
'
	ReadPTMetadata__enabled: 'name: "ReadPTMetadata__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	UseUtaAdmitRecommendation__enabled: 'name: "UseUtaAdmitRecommendation__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 90
}
id: 0
'
	EnableExplicitCacheProvisionedThroughput__enabled: 'name: "EnableExplicitCacheProvisionedThroughput__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableExplicitCacheProvisionedThroughput__model_allowlist: 'name: "EnableExplicitCacheProvisionedThroughput__model_allowlist"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-2.5-flash-preview-09-2025\"\nelement: \"gemini-2.5-flash-lite-preview-09-2025\"\nelement: \"gemini-2.5-flash-lite\"\nelement: \"gemini-2.5-pro-preview\"\nelement: \"gemini-3-pro-preview\"\n"
id: 0
'
	EnableMigrateLlamaToOpenMaas2__enabled: 'name: "EnableMigrateLlamaToOpenMaas2__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableOrgPtQuotaServer__enabled: 'name: "EnableOrgPtQuotaServer__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 397
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 18
}
id: 0
'
	EnablePtByFlag__enabled: 'name: "EnablePtByFlag__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableSidechannelQuotaCheckMetadata__enabled: 'name: "EnableSidechannelQuotaCheckMetadata__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiFlashLiveApiPt__enabled: 'name: "GeminiFlashLiveApiPt__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenaiDsqEnterpriseTier__enabled: 'name: "GenaiDsqEnterpriseTier__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenaiDsqResourceBasedEnterpriseTier__resource_allowlist: 'name: "GenaiDsqResourceBasedEnterpriseTier__resource_allowlist"
type: PROTO_BINARY_BASE64
sub_type: "cloud_aiplatform.dataplane.prediction.DsqEnterpriseTierAncestorAllowlist"
base_value: ""
id: 0
'
	GlobalEndpointDsq__enable_global_routing_based_dsq: 'name: "GlobalEndpointDsq__enable_global_routing_based_dsq"
type: BOOL
base_value: "FALSE"
id: 0
'
	GlobalEndpointDsq__retries_config: 'name: "GlobalEndpointDsq__retries_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.CardolanRetriesConfig"
base_value: "config_id: \"ge_retries_config\"\nmodel_ids: \"gemini-[2-5].*\"\nrequest_types: \"dedicated-critical_plus\"\nrequest_types: \"shared-critical\"\nrequest_types: \"shared-sheddable_plus\"\nretry_strategy {\n  min_delay {\n    seconds: 5\n  }\n  max_delay {\n    seconds: 15\n  }\n  max_retries: 2\n  request_deadline_fraction: 1\n}\nretry_thresholds {\n  threshold_type: PER_MODEL_RETRY_RATE\n  threshold: 1\n  threshold_duration {\n    seconds: 10\n  }\n}\nretry_thresholds {\n  threshold_type: PER_MODEL_RETRY_RATE_LONG_CONTEXT\n  threshold: 1\n  threshold_duration {\n    seconds: 10\n  }\n}\nretry_threshold_fallback_behavior: RETRY_THRESHOLD_FALLBACK_BEHAVIOR_OPEN\n"
id: 0
'
	PinnacleTrustQosOverrideProjects__enabled: 'name: "PinnacleTrustQosOverrideProjects__enabled"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.TrustQosProjectsAllowlist"
base_value: ""
id: 0
'
	PtGeminiFineTunedQuota__enabled: 'name: "PtGeminiFineTunedQuota__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	QsInvalidRequestFailClose__enabled: 'name: "QsInvalidRequestFailClose__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	ReportOnlineUsageToBatch__enabled: 'name: "ReportOnlineUsageToBatch__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	ScheduledProvisionedThroughput__enabled: 'name: "ScheduledProvisionedThroughput__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 666
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 81
}
id: 0
'
	V1pDsqRetries__enabled: 'name: "V1pDsqRetries__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	V1pDsqRetries__retries_config: 'name: "V1pDsqRetries__retries_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.CardolanRetriesConfig"
base_value: ""
id: 0
'
	EmbeddingPayGoCriticalityOverrides__criticality_override: 'name: "EmbeddingPayGoCriticalityOverrides__criticality_override"
type: STRING
base_value: ""
modifier {
  value_operator: OVERRIDE
  base_value: ""
  condition_group {
  }
  condition_index: 840
}
modifier {
  value_operator: OVERRIDE
  base_value: "CRITICAL_PLUS"
  condition_group {
  }
  condition_index: 640
}
modifier {
  value_operator: OVERRIDE
  base_value: ""
  condition_group {
  }
  condition_index: 853
}
modifier {
  value_operator: OVERRIDE
  base_value: "SHEDDABLE"
  condition_group {
  }
  condition_index: 764
}
id: 0
'
	EnablePtSessionWithoutThrottle__enabled: 'name: "EnablePtSessionWithoutThrottle__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 855
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 767
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1112
}
id: 0
'
	GroundingQuotaEnforcement__enabled: 'name: "GroundingQuotaEnforcement__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 41
}
id: 0
'
	LiveApiNewSessionPrechargeToken__token_count: 'name: "LiveApiNewSessionPrechargeToken__token_count"
type: INT
base_value: "2999"
id: 0
'
	MapsGroundingQuotaEnforcement__enabled: 'name: "MapsGroundingQuotaEnforcement__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 807
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 644
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 774
}
id: 0
'
	ParallelAiSearchQuotaEnforcement__enabled: 'name: "ParallelAiSearchQuotaEnforcement__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 862
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 778
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1117
}
id: 0
'
	PinnacleDisablePtOverageServiceId__enabled: 'name: "PinnacleDisablePtOverageServiceId__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 384
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 782
}
id: 0
'
	PinnacleRejectWebSearchToolRequestsRestrictedByVpcSc__enabled: 'name: "PinnacleRejectWebSearchToolRequestsRestrictedByVpcSc__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 1126
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1122
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 139
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1188
}
id: 0
'
	PinnacleWebSearchDenylist__enabled: 'name: "PinnacleWebSearchDenylist__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 838
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 648
}
id: 0
'
	PriorityPaygoAllowlist__enabled: 'name: "PriorityPaygoAllowlist__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 827
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 650
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 789
}
id: 0
'
	PriorityPaygoAllowlist__model_allowlist: 'name: "PriorityPaygoAllowlist__model_allowlist"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  value_operator: OVERRIDE
  base_value: ""
  condition_group {
  }
  condition_index: 827
}
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"gemini-2.5-flash-manual-test\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash-lite\"\nelement: \"gemini-3-pro-preview\"\n"
  condition_group {
  }
  condition_index: 650
}
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"gemini-2.5-flash-manual-test\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash-lite\"\nelement: \"gemini-3-pro-preview\"\n"
  condition_group {
  }
  condition_index: 789
}
id: 0
'
	RegionalHarpoonMigration__enabled: 'name: "RegionalHarpoonMigration__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 798
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 651
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 790
}
id: 0
'
	AddDisplayNamesInMetricLogging__enabled: 'name: "AddDisplayNamesInMetricLogging__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	BaseModelConfigFeature__base_model_config_map: 'name: "BaseModelConfigFeature__base_model_config_map"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.BaseModelConfigMap"
base_value: "base_model_config_map {\n  key: \"MedLM-Large-1.5\"\n  value {\n    output_chars_to_input_chars_ratio: 3\n  }\n}\nbase_model_config_map {\n  key: \"MedLM-large\"\n  value {\n    output_chars_to_input_chars_ratio: 3\n  }\n}\nbase_model_config_map {\n  key: \"MedLM-medium\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"ai21-jamba-1.5-large\"\n  value {\n    output_chars_to_input_chars_ratio: 4\n  }\n}\nbase_model_config_map {\n  key: \"ai21-jamba-1.5-mini\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-anthropic-claude2\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-2p0\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-5-haiku\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1115\n    input_image_to_input_tokens_ratio: 1600\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-5-haiku-staging\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-5-sonnet\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    
max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-5-sonnet-v2\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-5-sonnet-v2-staging\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-7-sonnet\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-haiku\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1115\n    input_image_to_input_tokens_ratio: 1600\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  
key: \"anthropic-claude-3-opus\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-sonnet\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 1\n    input_cache_write_chars_to_input_chars_ratio: 1\n    input_token_to_precharge_tokens_ratio: 1\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-3-sonnet-staging\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n    paygo_output_chars_to_input_chars_ratio: 1\n    paygo_input_cache_write_chars_to_input_chars_ratio: 1\n    context_window_size_based_burndown_configs {\n      key: 200000\n      value {\n        chars_to_input_chars_ratio: 1\n        output_chars_to_input_chars_ratio: 5\n        input_cache_read_chars_to_input_chars_ratio: 0.1\n        input_cache_write_chars_to_input_chars_ratio: 1.25\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        chars_to_input_chars_ratio: 2\n        output_chars_to_input_chars_ratio: 7.5\n        input_cache_read_chars_to_input_chars_ratio: 0.2\n        input_cache_write_chars_to_input_chars_ratio: 2.5\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 4\n      
}\n    }\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n    paygo_input_cache_write_one_hour_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-haiku-4-5\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n    paygo_output_chars_to_input_chars_ratio: 1\n    paygo_input_cache_write_chars_to_input_chars_ratio: 1\n    context_window_size_based_burndown_configs {\n      key: 200000\n      value {\n        chars_to_input_chars_ratio: 1\n        output_chars_to_input_chars_ratio: 5\n        input_cache_read_chars_to_input_chars_ratio: 0.1\n        input_cache_write_chars_to_input_chars_ratio: 1.25\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        chars_to_input_chars_ratio: 2\n        output_chars_to_input_chars_ratio: 7.5\n        input_cache_read_chars_to_input_chars_ratio: 0.2\n        input_cache_write_chars_to_input_chars_ratio: 2.5\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 4\n      }\n    }\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n    paygo_input_cache_write_one_hour_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-instant-1p2\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-opus-4\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    
cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n    paygo_output_chars_to_input_chars_ratio: 1\n    paygo_input_cache_write_chars_to_input_chars_ratio: 1\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n    paygo_input_cache_write_one_hour_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-opus-4-1\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n    paygo_output_chars_to_input_chars_ratio: 1\n    paygo_input_cache_write_chars_to_input_chars_ratio: 1\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n    paygo_input_cache_write_one_hour_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-opus-4-5\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n    paygo_output_chars_to_input_chars_ratio: 1\n    paygo_input_cache_write_chars_to_input_chars_ratio: 1\n    context_window_size_based_burndown_configs {\n      key: 200000\n      value {\n        chars_to_input_chars_ratio: 1\n        output_chars_to_input_chars_ratio: 5\n        input_cache_read_chars_to_input_chars_ratio: 0.1\n        input_cache_write_chars_to_input_chars_ratio: 1.25\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      
value {\n        chars_to_input_chars_ratio: 2\n        output_chars_to_input_chars_ratio: 7.5\n        input_cache_read_chars_to_input_chars_ratio: 0.2\n        input_cache_write_chars_to_input_chars_ratio: 2.5\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 4\n      }\n    }\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n    paygo_input_cache_write_one_hour_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-sonnet-4\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n    paygo_output_chars_to_input_chars_ratio: 1\n    paygo_input_cache_write_chars_to_input_chars_ratio: 1\n    context_window_size_based_burndown_configs {\n      key: 200000\n      value {\n        chars_to_input_chars_ratio: 1\n        output_chars_to_input_chars_ratio: 5\n        input_cache_read_chars_to_input_chars_ratio: 0.1\n        input_cache_write_chars_to_input_chars_ratio: 1.25\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        chars_to_input_chars_ratio: 2\n        output_chars_to_input_chars_ratio: 7.5\n        input_cache_read_chars_to_input_chars_ratio: 0.2\n        input_cache_write_chars_to_input_chars_ratio: 2.5\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 4\n      }\n    }\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n    paygo_input_cache_write_one_hour_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-claude-sonnet-4-5\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    
input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n    paygo_output_chars_to_input_chars_ratio: 1\n    paygo_input_cache_write_chars_to_input_chars_ratio: 1\n    context_window_size_based_burndown_configs {\n      key: 200000\n      value {\n        chars_to_input_chars_ratio: 1\n        output_chars_to_input_chars_ratio: 5\n        input_cache_read_chars_to_input_chars_ratio: 0.1\n        input_cache_write_chars_to_input_chars_ratio: 1.25\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        chars_to_input_chars_ratio: 2\n        output_chars_to_input_chars_ratio: 7.5\n        input_cache_read_chars_to_input_chars_ratio: 0.2\n        input_cache_write_chars_to_input_chars_ratio: 2.5\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 4\n      }\n    }\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 2\n    paygo_input_cache_write_one_hour_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-count-tokens\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-maas-infra\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 1\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-marketplace-publisher-model-138\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-spillover-api\"\n  value {\n  }\n}\nbase_model_config_map 
{\n  key: \"anthropic-spillover-api-tpu-001\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-spillover-api-tpu-002\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-spillover-api-tpu-003\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-spillover-api-tpu-004\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-spillover-api-tpu-005\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"anthropic-spillover-api-tpu-006\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"chat-bison\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"chat-bison-32k\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"code-bison\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"code-bison-32k\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"code-gecko\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"codechat-bison\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"codechat-bison-32k\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"deepseek-ocr-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"deepseek-r1-0528-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"deepseek-v3.1-maas\"\n  value {\n    default_burndown_config {\n      
output_tokens_to_input_tokens_ratio: 3\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"default\"\n  value {\n    output_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"dummy-anthropic-test-model-2\"\n  value {\n    input_image_to_input_chars_ratio: 100\n    input_videosec_to_input_chars_ratio: 200\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"gemini-1.5-flash\"\n  value {\n    input_image_to_input_chars_ratio: 1067\n    input_videosec_to_input_chars_ratio: 1067\n    output_chars_to_input_chars_ratio: 4\n    input_audiosec_to_input_chars_ratio: 107\n    tier_burndown_configs {\n      key: \"128k\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1067\n        input_videosec_to_input_chars_ratio: 1067\n        input_audiosec_to_input_chars_ratio: 107\n        output_chars_to_input_chars_ratio: 4\n      }\n    }\n    tier_burndown_configs {\n      key: \"1m\"\n      value {\n        chars_to_input_chars_ratio: 2\n        input_image_to_input_chars_ratio: 2134\n        input_videosec_to_input_chars_ratio: 2134\n        input_audiosec_to_input_chars_ratio: 214\n        output_chars_to_input_chars_ratio: 4\n      }\n    }\n  }\n}\nbase_model_config_map {\n  key: \"gemini-1.5-pro\"\n  value {\n    input_image_to_input_chars_ratio: 1052\n    input_videosec_to_input_chars_ratio: 1052\n    output_chars_to_input_chars_ratio: 3\n    input_audiosec_to_input_chars_ratio: 100\n    tier_burndown_configs {\n      key: \"128k\"\n      
value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1052\n        input_videosec_to_input_chars_ratio: 1052\n        input_audiosec_to_input_chars_ratio: 100\n        output_chars_to_input_chars_ratio: 3\n      }\n    }\n    tier_burndown_configs {\n      key: \"1m\"\n      value {\n        chars_to_input_chars_ratio: 2\n        input_image_to_input_chars_ratio: 2104\n        input_videosec_to_input_chars_ratio: 2104\n        input_audiosec_to_input_chars_ratio: 200\n        output_chars_to_input_chars_ratio: 3\n      }\n    }\n  }\n}\nbase_model_config_map {\n  key: \"gemini-2.0-flash\"\n  value {\n    input_image_to_input_chars_ratio: 1\n    input_videosec_to_input_chars_ratio: 1\n    output_chars_to_input_chars_ratio: 4\n    input_audiosec_to_input_chars_ratio: 7\n    tier_burndown_configs {\n      key: \"128k\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 7\n        output_chars_to_input_chars_ratio: 4\n      }\n    }\n    tier_burndown_configs {\n      key: \"1m\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 7\n        output_chars_to_input_chars_ratio: 4\n      }\n    }\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_image_to_input_tokens_ratio: 1\n      input_video_to_input_tokens_ratio: 1\n      input_audio_to_input_tokens_ratio: 7\n      input_token_to_input_tokens_ratio: 1\n    }\n    context_window_size_based_burndown_configs {\n      key: 128000\n      value {\n        output_tokens_to_input_tokens_ratio: 4\n        input_image_to_input_tokens_ratio: 1\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 7\n        
input_token_to_input_tokens_ratio: 1\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        output_tokens_to_input_tokens_ratio: 4\n        input_image_to_input_tokens_ratio: 1\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 7\n        input_token_to_input_tokens_ratio: 1\n      }\n    }\n    use_usage_metadata_for_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"gemini-2.0-flash-lite\"\n  value {\n    input_image_to_input_chars_ratio: 1\n    input_videosec_to_input_chars_ratio: 1\n    output_chars_to_input_chars_ratio: 4\n    input_audiosec_to_input_chars_ratio: 1\n    tier_burndown_configs {\n      key: \"128k\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 1\n        output_chars_to_input_chars_ratio: 4\n      }\n    }\n    tier_burndown_configs {\n      key: \"1m\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 1\n        output_chars_to_input_chars_ratio: 4\n      }\n    }\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_image_to_input_tokens_ratio: 1\n      input_video_to_input_tokens_ratio: 1\n      input_audio_to_input_tokens_ratio: 1\n      input_token_to_input_tokens_ratio: 1\n    }\n    context_window_size_based_burndown_configs {\n      key: 128000\n      value {\n        output_tokens_to_input_tokens_ratio: 4\n        input_image_to_input_tokens_ratio: 1\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 1\n        input_token_to_input_tokens_ratio: 1\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        
output_tokens_to_input_tokens_ratio: 4\n        input_image_to_input_tokens_ratio: 1\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 1\n        input_token_to_input_tokens_ratio: 1\n      }\n    }\n    use_usage_metadata_for_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"gemini-2.0-flash-live\"\n  value {\n    input_image_to_input_chars_ratio: 1\n    input_videosec_to_input_chars_ratio: 1\n    output_chars_to_input_chars_ratio: 4\n    input_audiosec_to_input_chars_ratio: 7\n    tier_burndown_configs {\n      key: \"128k\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 7\n        output_chars_to_input_chars_ratio: 4\n        output_tokens_to_input_tokens_ratio: 4\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 7\n        output_audio_to_input_tokens_ratio: 80\n      }\n    }\n    tier_burndown_configs {\n      key: \"1m\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 7\n        output_chars_to_input_chars_ratio: 4\n        output_tokens_to_input_tokens_ratio: 4\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 7\n        output_audio_to_input_tokens_ratio: 80\n      }\n    }\n    output_tokens_to_input_tokens_ratio: 4\n    input_video_to_input_tokens_ratio: 1\n    input_audio_to_input_tokens_ratio: 7\n    output_audio_to_input_tokens_ratio: 80\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_image_to_input_tokens_ratio: 1\n      input_video_to_input_tokens_ratio: 1\n      input_audio_to_input_tokens_ratio: 7\n      output_audio_to_input_tokens_ratio: 80\n      input_token_to_input_tokens_ratio: 1\n    
}\n    context_window_size_based_burndown_configs {\n      key: 128000\n      value {\n        output_tokens_to_input_tokens_ratio: 4\n        input_image_to_input_tokens_ratio: 1\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 7\n        output_audio_to_input_tokens_ratio: 80\n        input_token_to_input_tokens_ratio: 1\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        output_tokens_to_input_tokens_ratio: 4\n        input_image_to_input_tokens_ratio: 1\n        input_video_to_input_tokens_ratio: 1\n        input_audio_to_input_tokens_ratio: 7\n        output_audio_to_input_tokens_ratio: 80\n        input_token_to_input_tokens_ratio: 1\n      }\n    }\n    use_usage_metadata_for_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"gemini-2.0-flash-preview-image-generation\"\n  value {\n    input_image_to_input_chars_ratio: 1\n    input_videosec_to_input_chars_ratio: 1\n    output_chars_to_input_chars_ratio: 4\n    input_audiosec_to_input_chars_ratio: 7\n    tier_burndown_configs {\n      key: \"32k\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 7\n        output_chars_to_input_chars_ratio: 4\n        output_image_to_input_tokens_ratio: 200\n      }\n    }\n    output_image_to_input_tokens_ratio: 200\n  }\n}\nbase_model_config_map {\n  key: \"gemini-2.0-thinking\"\n  value {\n    input_image_to_input_chars_ratio: 1\n    input_videosec_to_input_chars_ratio: 1\n    output_chars_to_input_chars_ratio: 4\n    input_audiosec_to_input_chars_ratio: 7\n    tier_burndown_configs {\n      key: \"128k\"\n      value {\n        chars_to_input_chars_ratio: 1\n        input_image_to_input_chars_ratio: 1\n        input_videosec_to_input_chars_ratio: 1\n        input_audiosec_to_input_chars_ratio: 7\n        
output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 0.25
      }
    }
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 7
        output_chars_to_input_chars_ratio: 4
      }
    }
    output_thinking_tokens_to_input_tokens_ratio: 0.25
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 0.25
      output_tokens_to_input_tokens_ratio: 4
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 7
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 128000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 0.25
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 7
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 7
        input_token_to_input_tokens_ratio: 1
      }
    }
    use_usage_metadata_for_token_count: true
  }
}
base_model_config_map {
  key: "gemini-2.5-flash"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 4
    input_audiosec_to_input_chars_ratio: 7
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 7
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 24
        output_tokens_to_input_tokens_ratio_thinking_on: 24
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 7
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 24
        output_tokens_to_input_tokens_ratio_thinking_on: 24
      }
    }
    output_thinking_tokens_to_input_tokens_ratio: 24
    output_tokens_to_input_tokens_ratio_thinking_on: 24
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 24
      output_tokens_to_input_tokens_ratio: 4
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 7
      output_tokens_to_input_tokens_ratio_thinking_on: 24
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 24
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 7
        output_tokens_to_input_tokens_ratio_thinking_on: 24
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 24
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 7
        output_tokens_to_input_tokens_ratio_thinking_on: 24
        input_token_to_input_tokens_ratio: 1
      }
    }
    use_usage_metadata_for_token_count: true
    pt_explicit_cached_input_token_discount_ratio: 0.9
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-ga"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 9
    input_audiosec_to_input_chars_ratio: 4
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 4
        output_chars_to_input_chars_ratio: 9
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 4
        output_chars_to_input_chars_ratio: 9
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
      }
    }
    input_image_to_input_tokens_ratio: 1
    output_thinking_tokens_to_input_tokens_ratio: 9
    output_tokens_to_input_tokens_ratio: 9
    input_video_to_input_tokens_ratio: 1
    input_audio_to_input_tokens_ratio: 4
    output_tokens_to_input_tokens_ratio_thinking_on: 9
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 9
      output_tokens_to_input_tokens_ratio: 9
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 4
      output_tokens_to_input_tokens_ratio_thinking_on: 9
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
        input_token_to_input_tokens_ratio: 1
      }
    }
    use_usage_metadata_for_token_count: true
    pt_explicit_cached_input_token_discount_ratio: 0.9
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-image-ga"
  value {
    input_image_to_input_chars_ratio: 1
    input_image_to_input_tokens_ratio: 1
    output_tokens_to_input_tokens_ratio: 9
    output_image_to_input_tokens_ratio: 100
    default_burndown_config {
      output_tokens_to_input_tokens_ratio: 9
      input_image_to_input_tokens_ratio: 1
      output_image_to_input_tokens_ratio: 100
      input_token_to_input_tokens_ratio: 1
    }
    use_usage_metadata_for_token_count: true
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-lite"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 4
    input_audiosec_to_input_chars_ratio: 3
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 3
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 3
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
      }
    }
    input_image_to_input_tokens_ratio: 1
    output_thinking_tokens_to_input_tokens_ratio: 4
    output_tokens_to_input_tokens_ratio: 4
    input_video_to_input_tokens_ratio: 1
    input_audio_to_input_tokens_ratio: 3
    output_tokens_to_input_tokens_ratio_thinking_on: 4
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 4
      output_tokens_to_input_tokens_ratio: 4
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 3
      output_tokens_to_input_tokens_ratio_thinking_on: 4
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
        input_token_to_input_tokens_ratio: 1
      }
    }
    use_usage_metadata_for_token_count: true
    pt_explicit_cached_input_token_discount_ratio: 0.9
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-lite-preview"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 4
    input_audiosec_to_input_chars_ratio: 5
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 5
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 5
        output_tokens_to_input_tokens_ratio_thinking_on: 4
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 5
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 5
        output_tokens_to_input_tokens_ratio_thinking_on: 4
      }
    }
    input_image_to_input_tokens_ratio: 1
    output_thinking_tokens_to_input_tokens_ratio: 4
    output_tokens_to_input_tokens_ratio: 4
    input_video_to_input_tokens_ratio: 1
    input_audio_to_input_tokens_ratio: 5
    output_tokens_to_input_tokens_ratio_thinking_on: 4
    cached_input_token_discount_ratio: 0.9
    pt_explicit_cached_input_token_discount_ratio: 0.9
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-lite-preview-09-2025"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 4
    input_audiosec_to_input_chars_ratio: 3
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 3
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 3
        output_chars_to_input_chars_ratio: 4
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
      }
    }
    input_image_to_input_tokens_ratio: 1
    output_thinking_tokens_to_input_tokens_ratio: 4
    output_tokens_to_input_tokens_ratio: 4
    input_video_to_input_tokens_ratio: 1
    input_audio_to_input_tokens_ratio: 3
    output_tokens_to_input_tokens_ratio_thinking_on: 4
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 4
      output_tokens_to_input_tokens_ratio: 4
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 3
      output_tokens_to_input_tokens_ratio_thinking_on: 4
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 3
        output_tokens_to_input_tokens_ratio_thinking_on: 4
        input_token_to_input_tokens_ratio: 1
      }
    }
    use_usage_metadata_for_token_count: true
    pt_explicit_cached_input_token_discount_ratio: 0.9
    use_normalized_pt_token_count: true
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-manual-test"
  value {
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 8
      output_tokens_to_input_tokens_ratio: 8
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 1
      output_tokens_to_input_tokens_ratio_thinking_on: 8
      cached_input_token_discount_ratio: 0.7
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 8
        output_tokens_to_input_tokens_ratio: 8
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 1
        output_tokens_to_input_tokens_ratio_thinking_on: 8
        cached_input_token_discount_ratio: 0.7
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 6
        output_tokens_to_input_tokens_ratio: 6
        input_image_to_input_tokens_ratio: 2
        input_video_to_input_tokens_ratio: 2
        input_audio_to_input_tokens_ratio: 2
        output_tokens_to_input_tokens_ratio_thinking_on: 6
        cached_input_token_discount_ratio: 0.7
        input_token_to_input_tokens_ratio: 2
      }
    }
    use_usage_metadata_for_token_count: true
    pt_explicit_cached_input_token_discount_ratio: 0.6
    use_normalized_pt_token_count: true
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-preview-09-2025"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 9
    input_audiosec_to_input_chars_ratio: 4
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 4
        output_chars_to_input_chars_ratio: 9
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 4
        output_chars_to_input_chars_ratio: 9
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
      }
    }
    input_image_to_input_tokens_ratio: 1
    output_thinking_tokens_to_input_tokens_ratio: 9
    output_tokens_to_input_tokens_ratio: 9
    input_video_to_input_tokens_ratio: 1
    input_audio_to_input_tokens_ratio: 4
    output_tokens_to_input_tokens_ratio_thinking_on: 9
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 9
      output_tokens_to_input_tokens_ratio: 9
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 4
      output_tokens_to_input_tokens_ratio_thinking_on: 9
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 9
        output_tokens_to_input_tokens_ratio: 9
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 4
        output_tokens_to_input_tokens_ratio_thinking_on: 9
        input_token_to_input_tokens_ratio: 1
      }
    }
    use_usage_metadata_for_token_count: true
    pt_explicit_cached_input_token_discount_ratio: 0.9
    use_normalized_pt_token_count: true
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-preview-image"
  value {
    input_image_to_input_chars_ratio: 1
    input_image_to_input_tokens_ratio: 1
    output_tokens_to_input_tokens_ratio: 9
    output_image_to_input_tokens_ratio: 100
    default_burndown_config {
      output_tokens_to_input_tokens_ratio: 9
      input_image_to_input_tokens_ratio: 1
      output_image_to_input_tokens_ratio: 100
      input_token_to_input_tokens_ratio: 1
    }
    use_usage_metadata_for_token_count: true
    use_normalized_pt_token_count: true
  }
}
base_model_config_map {
  key: "gemini-2.5-flash-preview-native-audio-dialog"
  value {
    input_videosec_to_input_chars_ratio: 6
    output_chars_to_input_chars_ratio: 4
    input_audiosec_to_input_chars_ratio: 6
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 6
        input_audiosec_to_input_chars_ratio: 6
        output_chars_to_input_chars_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_video_to_input_tokens_ratio: 6
        input_audio_to_input_tokens_ratio: 6
        output_audio_to_input_tokens_ratio: 24
      }
    }
    output_audio_to_input_tokens_ratio: 24
  }
}
base_model_config_map {
  key: "gemini-2.5-pro-ga"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 8
    input_audiosec_to_input_chars_ratio: 1
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 2
        input_image_to_input_chars_ratio: 2
        input_videosec_to_input_chars_ratio: 2
        input_audiosec_to_input_chars_ratio: 2
        output_chars_to_input_chars_ratio: 6
        output_thinking_tokens_to_input_tokens_ratio: 6
        output_tokens_to_input_tokens_ratio: 6
        input_image_to_input_tokens_ratio: 2
        input_video_to_input_tokens_ratio: 2
        input_audio_to_input_tokens_ratio: 2
        output_tokens_to_input_tokens_ratio_thinking_on: 6
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 1
        output_chars_to_input_chars_ratio: 8
        output_thinking_tokens_to_input_tokens_ratio: 8
        output_tokens_to_input_tokens_ratio: 8
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 1
        output_tokens_to_input_tokens_ratio_thinking_on: 8
      }
    }
    input_image_to_input_tokens_ratio: 1
    output_thinking_tokens_to_input_tokens_ratio: 8
    output_tokens_to_input_tokens_ratio: 8
    input_video_to_input_tokens_ratio: 1
    input_audio_to_input_tokens_ratio: 1
    output_tokens_to_input_tokens_ratio_thinking_on: 8
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 8
      output_tokens_to_input_tokens_ratio: 8
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 1
      output_tokens_to_input_tokens_ratio_thinking_on: 8
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 8
        output_tokens_to_input_tokens_ratio: 8
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 1
        output_tokens_to_input_tokens_ratio_thinking_on: 8
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 6
        output_tokens_to_input_tokens_ratio: 6
        input_image_to_input_tokens_ratio: 2
        input_video_to_input_tokens_ratio: 2
        input_audio_to_input_tokens_ratio: 2
        output_tokens_to_input_tokens_ratio_thinking_on: 6
        input_token_to_input_tokens_ratio: 2
      }
    }
    use_usage_metadata_for_token_count: true
    pt_explicit_cached_input_token_discount_ratio: 0.9
  }
}
base_model_config_map {
  key: "gemini-2.5-pro-preview"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 8
    input_audiosec_to_input_chars_ratio: 1
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 2
        input_image_to_input_chars_ratio: 2
        input_videosec_to_input_chars_ratio: 2
        input_audiosec_to_input_chars_ratio: 2
        output_chars_to_input_chars_ratio: 6
        output_thinking_tokens_to_input_tokens_ratio: 6
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 1
        output_chars_to_input_chars_ratio: 8
        output_thinking_tokens_to_input_tokens_ratio: 8
      }
    }
    output_thinking_tokens_to_input_tokens_ratio: 8
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 8
      output_tokens_to_input_tokens_ratio: 8
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 1
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 8
        output_tokens_to_input_tokens_ratio: 8
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 1
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 6
        output_tokens_to_input_tokens_ratio: 6
        input_image_to_input_tokens_ratio: 2
        input_video_to_input_tokens_ratio: 2
        input_audio_to_input_tokens_ratio: 2
        input_token_to_input_tokens_ratio: 2
      }
    }
    pt_explicit_cached_input_token_discount_ratio: 0.9
  }
}
base_model_config_map {
  key: "gemini-2.5-pro-preview-06-05"
  value {
    input_image_to_input_chars_ratio: 1
    input_videosec_to_input_chars_ratio: 1
    output_chars_to_input_chars_ratio: 8
    input_audiosec_to_input_chars_ratio: 1
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 2
        input_image_to_input_chars_ratio: 2
        input_videosec_to_input_chars_ratio: 2
        input_audiosec_to_input_chars_ratio: 2
        output_chars_to_input_chars_ratio: 6
        output_thinking_tokens_to_input_tokens_ratio: 6
        output_tokens_to_input_tokens_ratio: 6
        input_image_to_input_tokens_ratio: 2
        input_video_to_input_tokens_ratio: 2
        input_audio_to_input_tokens_ratio: 2
        output_tokens_to_input_tokens_ratio_thinking_on: 6
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_image_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 1
        input_audiosec_to_input_chars_ratio: 1
        output_chars_to_input_chars_ratio: 8
        output_thinking_tokens_to_input_tokens_ratio: 8
        output_tokens_to_input_tokens_ratio: 8
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 1
        output_tokens_to_input_tokens_ratio_thinking_on: 8
      }
    }
    input_image_to_input_tokens_ratio: 1
    output_thinking_tokens_to_input_tokens_ratio: 8
    output_tokens_to_input_tokens_ratio: 8
    input_video_to_input_tokens_ratio: 1
    input_audio_to_input_tokens_ratio: 1
    output_tokens_to_input_tokens_ratio_thinking_on: 8
    cached_input_token_discount_ratio: 0.9
    default_burndown_config {
      output_thinking_tokens_to_input_tokens_ratio: 8
      output_tokens_to_input_tokens_ratio: 8
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 1
      input_audio_to_input_tokens_ratio: 1
      output_tokens_to_input_tokens_ratio_thinking_on: 8
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 8
        output_tokens_to_input_tokens_ratio: 8
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 1
        input_audio_to_input_tokens_ratio: 1
        output_tokens_to_input_tokens_ratio_thinking_on: 8
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_thinking_tokens_to_input_tokens_ratio: 6
        output_tokens_to_input_tokens_ratio: 6
        input_image_to_input_tokens_ratio: 2
        input_video_to_input_tokens_ratio: 2
        input_audio_to_input_tokens_ratio: 2
        output_tokens_to_input_tokens_ratio_thinking_on: 6
        input_token_to_input_tokens_ratio: 2
      }
    }
    pt_explicit_cached_input_token_discount_ratio: 0.9
  }
}
base_model_config_map {
  key: "gemini-live-2.5-flash"
  value {
    input_videosec_to_input_chars_ratio: 6
    output_chars_to_input_chars_ratio: 4
    input_audiosec_to_input_chars_ratio: 6
    tier_burndown_configs {
      key: "1m"
      value {
        chars_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 6
        input_audiosec_to_input_chars_ratio: 6
        output_chars_to_input_chars_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_video_to_input_tokens_ratio: 6
        input_audio_to_input_tokens_ratio: 6
        output_audio_to_input_tokens_ratio: 24
      }
    }
    tier_burndown_configs {
      key: "200k"
      value {
        chars_to_input_chars_ratio: 1
        input_videosec_to_input_chars_ratio: 6
        input_audiosec_to_input_chars_ratio: 6
        output_chars_to_input_chars_ratio: 4
        output_tokens_to_input_tokens_ratio: 4
        input_video_to_input_tokens_ratio: 6
        input_audio_to_input_tokens_ratio: 6
        output_audio_to_input_tokens_ratio: 24
      }
    }
    output_tokens_to_input_tokens_ratio: 4
    input_video_to_input_tokens_ratio: 6
    input_audio_to_input_tokens_ratio: 6
    output_audio_to_input_tokens_ratio: 24
    default_burndown_config {
      output_tokens_to_input_tokens_ratio: 4
      input_image_to_input_tokens_ratio: 1
      input_video_to_input_tokens_ratio: 6
      input_audio_to_input_tokens_ratio: 6
      output_audio_to_input_tokens_ratio: 24
      input_token_to_input_tokens_ratio: 1
    }
    context_window_size_based_burndown_configs {
      key: 200000
      value {
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 6
        input_audio_to_input_tokens_ratio: 6
        output_audio_to_input_tokens_ratio: 24
        input_token_to_input_tokens_ratio: 1
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        output_tokens_to_input_tokens_ratio: 4
        input_image_to_input_tokens_ratio: 1
        input_video_to_input_tokens_ratio: 6
        input_audio_to_input_tokens_ratio: 6
        output_audio_to_input_tokens_ratio: 24
        input_token_to_input_tokens_ratio: 1
      }
    }
    use_usage_metadata_for_token_count: true
  }
}
base_model_config_map {
  key: "gemini-live-2.5-flash-preview-native-audio-09-2025"
  value {
    input_image_to_input_chars_ratio: 6
    input_image_to_input_tokens_ratio: 6
    output_tokens_to_input_tokens_ratio: 4
    input_video_to_input_tokens_ratio: 6
    input_audio_to_input_tokens_ratio: 6
    output_audio_to_input_tokens_ratio: 24
    default_burndown_config {
      input_image_to_input_chars_ratio: 6
      output_tokens_to_input_tokens_ratio: 4
      input_image_to_input_tokens_ratio: 6
      input_video_to_input_tokens_ratio: 6
      input_audio_to_input_tokens_ratio: 6
      output_audio_to_input_tokens_ratio: 24
      input_token_to_input_tokens_ratio: 1
    }
    use_usage_metadata_for_token_count: true
  }
}
base_model_config_map {
  key: "gemini-pro"
  value {
    input_image_to_input_chars_ratio: 20000
    input_videosec_to_input_chars_ratio: 16000
    output_chars_to_input_chars_ratio: 3
  }
}
base_model_config_map {
  key: "gemini-pro-vision"
  value {
    input_image_to_input_chars_ratio: 20000
    input_videosec_to_input_chars_ratio: 16000
    output_chars_to_input_chars_ratio: 3
  }
}
base_model_config_map {
  key: "gemini-ultra"
  value {
    input_image_to_input_chars_ratio: 1000
    input_videosec_to_input_chars_ratio: 800
    output_chars_to_input_chars_ratio: 3
  }
}
base_model_config_map {
  key: "gemini-ultra-vision"
  value {
    input_image_to_input_chars_ratio: 1000
    input_videosec_to_input_chars_ratio: 800
    output_chars_to_input_chars_ratio: 3
  }
}
base_model_config_map {
  key: "gpt-oss-120b-maas"
  value {
    default_burndown_config {
      output_tokens_to_input_tokens_ratio: 4
      input_token_to_input_tokens_ratio: 1
    }
    use_usage_metadata_for_token_count: true
    use_normalized_pt_token_count: true
  }
}
base_model_config_map {
  key: "gpt-oss-20b-maas"
  value {
    default_burndown_config {
      output_tokens_to_input_tokens_ratio: 4
      input_token_to_input_tokens_ratio: 1
    }
    use_usage_metadata_for_token_count: true
    use_normalized_pt_token_count: true
  }
}
base_model_config_map {
  key: "imagegeneration"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "imagen-2.0-edit"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "imagen-3.0-capability"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "imagen-3.0-fast-generate"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "imagen-3.0-generate"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "imagen-4.0-fast-generate"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "imagen-4.0-generate"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "imagen-4.0-ultra-generate"
  value {
    output_chars_to_input_chars_ratio: 1
    output_image_to_input_chars_ratio: 4
  }
}
base_model_config_map {
  key: "internal-test-google-ai-mp-private-offer-1.0"
  value {
    output_chars_to_input_chars_ratio: 5
  }
}
base_model_config_map {
  key: "internal-test-google-ai-mp-private-offer-2.0"
  value {
    output_chars_to_input_chars_ratio: 5
  }
}
base_model_config_map {
  key: "internal-test-google-ai-mp-usage-demo-3.0"
  value {
    output_chars_to_input_chars_ratio: 5
  }
}
base_model_config_map {
  key: "internal-test-google-ai-mp-usage-demo-4.0"
  value {
    output_chars_to_input_chars_ratio: 5
    input_cache_read_chars_to_input_chars_ratio: 0.2
    input_cache_write_chars_to_input_chars_ratio: 6
    input_token_to_precharge_tokens_ratio: 1
    max_output_token_to_precharge_tokens_ratio: 2
    cache_input_token_to_precharge_tokens_ratio: 0.5
    input_image_to_input_tokens_ratio: 1600
    context_window_size_based_burndown_configs {
      key: 10000
      value {
        chars_to_input_chars_ratio: 3
        output_chars_to_input_chars_ratio: 6
        input_cache_read_chars_to_input_chars_ratio: 4
        input_cache_write_chars_to_input_chars_ratio: 5
        input_cache_write_one_hour_chars_to_input_chars_ratio: 10
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        chars_to_input_chars_ratio: 30
        output_chars_to_input_chars_ratio: 60
        input_cache_read_chars_to_input_chars_ratio: 40
        input_cache_write_chars_to_input_chars_ratio: 50
        input_cache_write_one_hour_chars_to_input_chars_ratio: 100
      }
    }
    should_parse_requested_max_output_tokens: true
    should_parse_request_for_precharging: true
    input_cache_write_one_hour_chars_to_input_chars_ratio: 12
  }
}
base_model_config_map {
  key: "internal-test-google-pinnacle-e2e-test"
  value {
    output_chars_to_input_chars_ratio: 5
    input_token_to_precharge_tokens_ratio: 1
    max_output_token_to_precharge_tokens_ratio: 2
    cache_input_token_to_precharge_tokens_ratio: 0.5
    input_image_to_input_tokens_ratio: 1600
    paygo_output_chars_to_input_chars_ratio: 1
    paygo_input_cache_write_chars_to_input_chars_ratio: 1
  }
}
base_model_config_map {
  key: "internal-test-google-pinnacle-load-test-model-1"
  value {
    output_chars_to_input_chars_ratio: 5
    input_cache_read_chars_to_input_chars_ratio: 0.2
    input_cache_write_chars_to_input_chars_ratio: 6
    input_token_to_precharge_tokens_ratio: 1
    max_output_token_to_precharge_tokens_ratio: 2
    cache_input_token_to_precharge_tokens_ratio: 0.5
    input_image_to_input_tokens_ratio: 1600
    context_window_size_based_burndown_configs {
      key: 10000
      value {
        chars_to_input_chars_ratio: 3
        output_chars_to_input_chars_ratio: 6
        input_cache_read_chars_to_input_chars_ratio: 4
        input_cache_write_chars_to_input_chars_ratio: 5
      }
    }
    context_window_size_based_burndown_configs {
      key: 1000000
      value {
        chars_to_input_chars_ratio: 30
        output_chars_to_input_chars_ratio: 60
input_cache_read_chars_to_input_chars_ratio: 40\n        input_cache_write_chars_to_input_chars_ratio: 50\n      }\n    }\n  }\n}\nbase_model_config_map {\n  key: \"internal-test-google-pinnacle-load-test-model-2\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.2\n    input_cache_write_chars_to_input_chars_ratio: 6\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 2\n    cache_input_token_to_precharge_tokens_ratio: 0.5\n    input_image_to_input_tokens_ratio: 1600\n    context_window_size_based_burndown_configs {\n      key: 10000\n      value {\n        chars_to_input_chars_ratio: 3\n        output_chars_to_input_chars_ratio: 6\n        input_cache_read_chars_to_input_chars_ratio: 4\n        input_cache_write_chars_to_input_chars_ratio: 5\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        chars_to_input_chars_ratio: 30\n        output_chars_to_input_chars_ratio: 60\n        input_cache_read_chars_to_input_chars_ratio: 40\n        input_cache_write_chars_to_input_chars_ratio: 50\n      }\n    }\n  }\n}\nbase_model_config_map {\n  key: \"internal-test-google-pinnacle-load-test-model-3\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.2\n    input_cache_write_chars_to_input_chars_ratio: 6\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 2\n    cache_input_token_to_precharge_tokens_ratio: 0.5\n    input_image_to_input_tokens_ratio: 1600\n    context_window_size_based_burndown_configs {\n      key: 10000\n      value {\n        chars_to_input_chars_ratio: 3\n        output_chars_to_input_chars_ratio: 6\n        input_cache_read_chars_to_input_chars_ratio: 4\n        input_cache_write_chars_to_input_chars_ratio: 5\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value 
{\n        chars_to_input_chars_ratio: 30\n        output_chars_to_input_chars_ratio: 60\n        input_cache_read_chars_to_input_chars_ratio: 40\n        input_cache_write_chars_to_input_chars_ratio: 50\n      }\n    }\n    should_parse_requested_max_output_tokens: true\n  }\n}\nbase_model_config_map {\n  key: \"internal-test-google-pinnacle-load-test-model-4\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.2\n    input_cache_write_chars_to_input_chars_ratio: 6\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 2\n    cache_input_token_to_precharge_tokens_ratio: 0.5\n    input_image_to_input_tokens_ratio: 1600\n    context_window_size_based_burndown_configs {\n      key: 10000\n      value {\n        chars_to_input_chars_ratio: 3\n        output_chars_to_input_chars_ratio: 6\n        input_cache_read_chars_to_input_chars_ratio: 4\n        input_cache_write_chars_to_input_chars_ratio: 5\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 10\n      }\n    }\n    context_window_size_based_burndown_configs {\n      key: 1000000\n      value {\n        chars_to_input_chars_ratio: 30\n        output_chars_to_input_chars_ratio: 60\n        input_cache_read_chars_to_input_chars_ratio: 40\n        input_cache_write_chars_to_input_chars_ratio: 50\n        input_cache_write_one_hour_chars_to_input_chars_ratio: 100\n      }\n    }\n    input_cache_write_one_hour_chars_to_input_chars_ratio: 12\n  }\n}\nbase_model_config_map {\n  key: \"internal-test-google-pinnacle-playground-1\"\n  value {\n    input_image_to_input_chars_ratio: 1600\n    output_chars_to_input_chars_ratio: 5\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.5\n    cache_input_token_to_precharge_tokens_ratio: 0.5\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"internal-test-google-test-model\"\n  value {\n    
input_image_to_input_chars_ratio: 100\n    input_videosec_to_input_chars_ratio: 200\n    output_chars_to_input_chars_ratio: 5\n    input_cache_read_chars_to_input_chars_ratio: 0.1\n    input_cache_write_chars_to_input_chars_ratio: 1.25\n    input_token_to_precharge_tokens_ratio: 1\n    max_output_token_to_precharge_tokens_ratio: 0.25\n    cache_input_token_to_precharge_tokens_ratio: 0.1125\n    input_image_to_input_tokens_ratio: 1600\n  }\n}\nbase_model_config_map {\n  key: \"kimi-k2-thinking-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"llama-3.3-70b-instruct-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 1\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"llama-4-maverick-17b-128e-instruct-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_image_to_input_tokens_ratio: 1\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"llama-4-scout-17b-16e-instruct-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 3\n      input_image_to_input_tokens_ratio: 1\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"minimax-m2-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    
use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-codestral-2\"\n  value {\n    output_chars_to_input_chars_ratio: 3\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-codestral-2501\"\n  value {\n    output_chars_to_input_chars_ratio: 3\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-ministral-3b-2410\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-mistral-large-2411\"\n  value {\n    output_chars_to_input_chars_ratio: 3\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-mistral-medium-3\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-mistral-nemo\"\n  value {\n    output_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-mistral-ocr-2505\"\n  value {\n    output_chars_to_input_chars_ratio: 1\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-mistral-small-2503\"\n  value {\n    output_chars_to_input_chars_ratio: 3\n  }\n}\nbase_model_config_map {\n  key: \"mistralai-mistral-staging\"\n  value {\n    output_chars_to_input_chars_ratio: 5\n  }\n}\nbase_model_config_map {\n  key: \"openmaas-2.0-dsq-test\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 8\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"qwen3-235b-a22b-instruct-2507-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"qwen3-coder-480b-a35b-instruct-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 4\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    
use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"qwen3-next-80b-a3b-instruct-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 8\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"qwen3-next-80b-a3b-thinking-maas\"\n  value {\n    default_burndown_config {\n      output_tokens_to_input_tokens_ratio: 8\n      input_token_to_input_tokens_ratio: 1\n    }\n    use_usage_metadata_for_token_count: true\n    use_normalized_pt_token_count: true\n  }\n}\nbase_model_config_map {\n  key: \"text-bison\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"text-bison-32k\"\n  value {\n    output_chars_to_input_chars_ratio: 2\n  }\n}\nbase_model_config_map {\n  key: \"text-embedding\"\n  value {\n  }\n}\nbase_model_config_map {\n  key: \"text-unicorn\"\n  value {\n    output_chars_to_input_chars_ratio: 3\n  }\n}\nbase_model_config_map {\n  key: \"veo-3.0-fast-generate-001\"\n  value {\n    output_video_to_input_tokens_ratio: 100\n    output_audio_to_input_tokens_ratio: 45\n  }\n}\nbase_model_config_map {\n  key: \"veo-3.0-generate-001\"\n  value {\n    output_video_to_input_tokens_ratio: 100\n    output_audio_to_input_tokens_ratio: 100\n  }\n}\nbase_model_config_map {\n  key: \"veo-3.1-fast-generate-001\"\n  value {\n    output_video_to_input_tokens_ratio: 100\n    output_audio_to_input_tokens_ratio: 45\n  }\n}\nbase_model_config_map {\n  key: \"veo-3.1-fast-generate-preview\"\n  value {\n    output_video_to_input_tokens_ratio: 100\n    output_audio_to_input_tokens_ratio: 45\n  }\n}\nbase_model_config_map {\n  key: \"veo-3.1-generate-001\"\n  value {\n    output_video_to_input_tokens_ratio: 100\n    output_audio_to_input_tokens_ratio: 100\n  }\n}\nbase_model_config_map {\n  key: \"veo-3.1-generate-preview\"\n  value {\n    
output_video_to_input_tokens_ratio: 100\n    output_audio_to_input_tokens_ratio: 100\n  }\n}\nmodel_configs {\n  key: \"MedLM-Large-1.5\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"MedLM-large\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"MedLM-medium\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"chat-bison\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"chat-bison-32k\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"code-bison\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"code-bison-32k\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"code-gecko\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"codechat-bison\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"codechat-bison-32k\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash\"\n  value {\n    billing_type: CHARACTER\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-flash-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro\"\n  
value {\n    billing_type: CHARACTER\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-1.5-pro-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash\"\n  value {\n    billing_type: CHARACTER\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-lite\"\n  value {\n    billing_type: CHARACTER\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-lite-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-lite-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-lite-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: 
\"gemini-2.0-flash-lite-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-lite-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-lite-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-lite-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-preview-image-generation\"\n  value {\n    billing_type: CHARACTER\n    tier: \"32k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-preview-image-generation-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"32k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-preview-image-generation-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"32k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-preview-image-generation-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"32k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"128k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.0-flash-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  
}\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-thinking-on\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-thinking-on-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-ga-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-image-ga\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-image-ga-image\"\n  value {\n    billing_type: IMAGE\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: 
\"gemini-2.5-flash-lite-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-thinking-on\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-thinking-on-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-thought-long\"\n  value {\n    billing_type: 
THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-09-2025-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-thinking-on\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-thinking-on-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-preview-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-thinking-on\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: 
\"gemini-2.5-flash-lite-thinking-on-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-lite-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-thinking-on\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-thinking-on-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs 
{\n  key: \"gemini-2.5-flash-preview-09-2025-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-09-2025-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-image\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-image-image\"\n  value {\n    billing_type: IMAGE\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-native-audio-dialog\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-native-audio-dialog-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-preview-native-audio-dialog-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-flash-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-image\"\n  value {\n    
billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-ga-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: 
\"gemini-2.5-pro-preview-06-05-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-06-05-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-image\"\n  value {\n    billing_type: IMAGE\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-image-long\"\n  value {\n    billing_type: IMAGE\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-thought\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-thought-long\"\n  value {\n    billing_type: THINKING_TOKEN\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-2.5-pro-preview-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash\"\n  value {\n    billing_type: CHARACTER\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-audio\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-audio-long\"\n  value {\n    billing_type: AUDIO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-long\"\n  value {\n    billing_type: CHARACTER\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-preview-native-audio-09-2025\"\n  value {\n    billing_type: 
CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-preview-native-audio-09-2025-audio\"\n  value {\n    billing_type: AUDIO_SEC\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-preview-native-audio-09-2025-image\"\n  value {\n    billing_type: IMAGE\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-preview-native-audio-09-2025-video\"\n  value {\n    billing_type: VIDEO_SEC\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-video\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"200k\"\n  }\n}\nmodel_configs {\n  key: \"gemini-live-2.5-flash-video-long\"\n  value {\n    billing_type: VIDEO_SEC\n    tier: \"1m\"\n  }\n}\nmodel_configs {\n  key: \"gemini-pro\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-pro-vision\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-pro-vision-image\"\n  value {\n    billing_type: IMAGE\n  }\n}\nmodel_configs {\n  key: \"gemini-pro-vision-video\"\n  value {\n    billing_type: VIDEO_SEC\n  }\n}\nmodel_configs {\n  key: \"gemini-ultra\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-ultra-vision\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"gemini-ultra-vision-image\"\n  value {\n    billing_type: IMAGE\n  }\n}\nmodel_configs {\n  key: \"gemini-ultra-vision-video\"\n  value {\n    billing_type: VIDEO_SEC\n  }\n}\nmodel_configs {\n  key: \"imagegeneration\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"imagen-2.0-edit\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"imagen-3.0-capability\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"imagen-3.0-fast-generate\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: 
REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"imagen-3.0-generate\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"imagen-4.0-fast-generate\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"imagen-4.0-generate\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"imagen-4.0-ultra-generate\"\n  value {\n    billing_type: CHARACTER\n    quota_report_type: REPORT_OUTPUT_IN_ADVANCE\n  }\n}\nmodel_configs {\n  key: \"text-bison\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"text-bison-32k\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"text-bison-batch\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"text-embedding-005\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"text-unicorn\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"textembedding-gecko\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"veo-3.0-fast-generate-001\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_configs {\n  key: \"veo-3.0-generate-001\"\n  value {\n    billing_type: CHARACTER\n  }\n}\nmodel_version_quota_config_map {\n  key: \"ai21-jamba-1.5-large\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"ai21-jamba-1.5-large@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"ai21-jamba-1.5-mini\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n     
   bouncer_group_id_prefix: \"ai21-jamba-1.5-mini@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-anthropic-claude2\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-2p0\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-5-haiku\"\n  value {\n    quota_configs {\n      key: \"20241022\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-5-haiku@20241022\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1265\n        avg_input_tokens_to_request_size: 0.184\n        default_max_input_tokens: 6245\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 6245\n        caching_avg_input_tokens_to_request_size: 0.0843\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-5-haiku-staging\"\n  value {\n    quota_configs {\n      key: \"20241022\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-5-haiku-staging@20241022\"\n        
capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1265\n        avg_input_tokens_to_request_size: 0.184\n        default_max_input_tokens: 6245\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 6245\n        caching_avg_input_tokens_to_request_size: 0.0184\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-5-sonnet\"\n  value {\n    quota_configs {\n      key: \"20240620\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-5-sonnet@20240620\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1267\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-5-sonnet-v2\"\n  value {\n    quota_configs {\n      key: \"20241022\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-5-sonnet-v2@20241022\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.12\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    
quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-5-sonnet-v2-staging\"\n  value {\n    quota_configs {\n      key: \"20241113\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-5-sonnet-v2-staging@20241022\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.01656\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-7-sonnet\"\n  value {\n    quota_configs {\n      key: \"20250219\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-haiku\"\n  value {\n    quota_configs {\n      key: \"20240307\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-haiku@20240307\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1265\n        
avg_input_tokens_to_request_size: 0.184\n        default_max_input_tokens: 6245\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 6245\n        caching_avg_input_tokens_to_request_size: 0.0958\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-opus\"\n  value {\n    quota_configs {\n      key: \"20240229\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-opus@20240229\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 2705\n        bouncer_project_id: \"vertex-prediction-llm-pinnacle-30s\"\n        avg_input_tokens_to_request_size: 0.1582\n        default_max_input_tokens: 14461\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 14461\n        caching_avg_input_tokens_to_request_size: 0.1266\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-sonnet\"\n  value {\n    quota_configs {\n      key: \"20240229\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-claude-3-sonnet@20240229\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1300\n        avg_input_tokens_to_request_size: 0.158\n        default_max_input_tokens: 10438\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 10438\n        caching_avg_input_tokens_to_request_size: 0.158\n        enable_pinnacle_qs_regionalization: true\n        
enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-3-sonnet-staging\"\n  value {\n    quota_configs {\n      key: \"20240229\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_weighted_tokens_for_paygo_tpm_quota: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n        context_window_size_based_precharge_tier_configs {\n          key: 200000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        context_window_size_based_precharge_tier_configs {\n          key: 1000000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            shared_token_quota_qs_user_prefix: \"1000000-global-tokens-\"\n            batch_shared_token_quota_qs_user_prefix: \"1000000-batch-prediction-shared-tokens-\"\n            input_token_to_precharge_tokens_ratio: 1\n            
cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        tier_routing_config {\n          short_context_byte_limit: 2097152\n          request_size_to_context_window_size_tiers {\n            key: 2097152\n            value: 1\n          }\n          request_size_to_context_window_size_tiers {\n            key: 100000000000\n            value: 200001\n          }\n        }\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-haiku-4-5\"\n  value {\n    quota_configs {\n      key: \"20251001\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_weighted_tokens_for_paygo_tpm_quota: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n        context_window_size_based_precharge_tier_configs {\n          key: 200000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        context_window_size_based_precharge_tier_configs {\n          key: 1000000\n          value {\n            default_max_output_tokens: 1640\n            
default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            shared_token_quota_qs_user_prefix: \"1000000-global-tokens-\"\n            batch_shared_token_quota_qs_user_prefix: \"1000000-batch-prediction-shared-tokens-\"\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        tier_routing_config {\n          short_context_byte_limit: 2097152\n          request_size_to_context_window_size_tiers {\n            key: 2097152\n            value: 1\n          }\n          request_size_to_context_window_size_tiers {\n            key: 100000000000\n            value: 200001\n          }\n        }\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n        enable_tpm_quota_split: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-instant-1p2\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-opus-4\"\n  value {\n    quota_configs {\n      key: \"20250514\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_weighted_tokens_for_paygo_tpm_quota: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n 
       enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n        enable_tpm_quota_split: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-opus-4-1\"\n  value {\n    quota_configs {\n      key: \"20250805\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_weighted_tokens_for_paygo_tpm_quota: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n        enable_tpm_quota_split: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-opus-4-5\"\n  value {\n    quota_configs {\n      key: \"20251101\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_weighted_tokens_for_paygo_tpm_quota: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n        context_window_size_based_precharge_tier_configs {\n          key: 200000\n          value {\n            default_max_output_tokens: 1640\n          
  default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        context_window_size_based_precharge_tier_configs {\n          key: 1000000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            shared_token_quota_qs_user_prefix: \"1000000-global-tokens-\"\n            batch_shared_token_quota_qs_user_prefix: \"1000000-batch-prediction-shared-tokens-\"\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        tier_routing_config {\n          short_context_byte_limit: 2097152\n          request_size_to_context_window_size_tiers {\n            key: 2097152\n            value: 1\n          }\n          request_size_to_context_window_size_tiers {\n            key: 100000000000\n            value: 200001\n          }\n        }\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n        enable_tpm_quota_split: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-sonnet-4\"\n  value {\n    quota_configs {\n      key: \"20250514\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 
0.1363\n        precharging_cap_for_pt: 16987\n        enable_weighted_tokens_for_paygo_tpm_quota: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n        context_window_size_based_precharge_tier_configs {\n          key: 200000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        context_window_size_based_precharge_tier_configs {\n          key: 1000000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            shared_token_quota_qs_user_prefix: \"1000000-global-tokens-\"\n            batch_shared_token_quota_qs_user_prefix: \"1000000-batch-prediction-shared-tokens-\"\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        tier_routing_config {\n          short_context_byte_limit: 2097152\n          request_size_to_context_window_size_tiers {\n            key: 2097152\n            value: 1\n          }\n          request_size_to_context_window_size_tiers {\n            key: 100000000000\n            value: 200001\n          }\n        }\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n        enable_tpm_quota_split: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-claude-sonnet-4-5\"\n  value {\n    
quota_configs {\n      key: \"20250929\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_weighted_tokens_for_paygo_tpm_quota: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n        context_window_size_based_precharge_tier_configs {\n          key: 200000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        context_window_size_based_precharge_tier_configs {\n          key: 1000000\n          value {\n            default_max_output_tokens: 1640\n            default_max_input_tokens: 15347\n            avg_input_tokens_to_request_size: 0.1656\n            caching_avg_input_tokens_to_request_size: 0.1363\n            shared_token_quota_qs_user_prefix: \"1000000-global-tokens-\"\n            batch_shared_token_quota_qs_user_prefix: \"1000000-batch-prediction-shared-tokens-\"\n            input_token_to_precharge_tokens_ratio: 1\n            cache_input_token_to_precharge_tokens_ratio: 0.1125\n            input_image_to_input_token_ratio: 1500\n          }\n        }\n        tier_routing_config {\n          short_context_byte_limit: 2097152\n          request_size_to_context_window_size_tiers {\n            key: 2097152\n  
          value: 1\n          }\n          request_size_to_context_window_size_tiers {\n            key: 100000000000\n            value: 200001\n          }\n        }\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n        enable_tpm_quota_split: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-count-tokens\"\n  value {\n    quota_configs {\n      key: \"2023-06-01\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-maas-infra\"\n  value {\n    quota_configs {\n      key: \"20250721\"\n      value {\n        bouncer_group_id_prefix: \"anthropic-maas-infra@20250721\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.12\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-marketplace-publisher-model-138\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-spillover-api\"\n  value {\n    quota_configs {\n      key: \"20250314\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      
key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-spillover-api-tpu-001\"\n  value {\n    quota_configs {\n      key: \"20250314\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-spillover-api-tpu-002\"\n  value {\n    quota_configs {\n      key: \"20250314\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-spillover-api-tpu-003\"\n  value {\n    quota_configs {\n      key: \"20250314\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-spillover-api-tpu-004\"\n  value {\n    quota_configs {\n      key: \"20250314\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-spillover-api-tpu-005\"\n  value {\n    quota_configs {\n      key: \"20250314\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"anthropic-spillover-api-tpu-006\"\n  value {\n    quota_configs {\n      key: \"20250314\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      
}\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"computer-use-preview\"\n  value {\n    quota_configs {\n      key: \"09-15\"\n      value {\n        bouncer_group_id_prefix: \"computer-use-preview@0915\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"computer-use-preview@0915\"\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"09-15-autopush\"\n      value {\n        bouncer_group_id_prefix: \"computer-use-preview@0915-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "computer-use-preview@0915-autopush"
        enable_org_pt_quota_server: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "09-15-preprod"
      value {
        bouncer_group_id_prefix: "computer-use-preview@0915-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "computer-use-preview@0915-preprod"
        enable_org_pt_quota_server: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "09-15-staging"
      value {
        bouncer_group_id_prefix: "computer-use-preview@0915-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "global"
          value: "nondrz"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "computer-use-preview@0915-staging"
        enable_org_pt_quota_server: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "deepseek-ocr-maas"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "deepseek-ocr-maas@001"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 163840
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "deepseek-ocr-maas@001"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "deepseek-ocr-maas@001-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 163840
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "deepseek-ocr-maas@001-autopush"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "deepseek-r1-0528-maas"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "deepseek-r1-0528-maas@001"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 163840
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "deepseek-r1-0528-maas@001-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 163840
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "deepseek-v3.1-maas"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "deepseek-v3.1-maas@001"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 163840
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-west2"
          value: "us-west2"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "deepseek-v3.1-maas@001"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "deepseek-v3.1-maas@001-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 163840
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-west2"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "deepseek-v3.1-maas@001-autopush"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "dummy-anthropic-test-model-2"
  value {
    quota_configs {
      key: "GLOBAL_ENDPOINT"
      value {
      }
    }
    quota_configs {
      key: "test-version-2"
      value {
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 1234
        avg_input_tokens_to_request_size: 0.1656
        default_max_input_tokens: 14000
        enable_qs_overall_capacity_for_tokens: true
        disable_bouncer: true
        caching_max_input_tokens: 15347
        caching_avg_input_tokens_to_request_size: 0.1363
        precharging_cap_for_pt: 16987
        enable_pinnacle_qs_regionalization: true
        enable_pinnacle_qs_regionalization_paygo: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-1.5-flash"
  value {
    quota_configs {
      key: "002"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash@002"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "asia-northeast1"
          value: "japan"
        }
        capacity_quota_override {
          key: "asia-south1"
          value: "india"
        }
        capacity_quota_override {
          key: "australia-southeast1"
          value: "australia"
        }
        capacity_quota_override {
          key: "europe-west2"
          value: "uk"
        }
        capacity_quota_override {
          key: "europe-west3"
          value: "germany"
        }
        capacity_quota_override {
          key: "northamerica-northeast1"
          value: "canada"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-eu-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "canada"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-ca-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "germany"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-de-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "japan"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-jp-prod"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "eu"
        org_based_limit_supported_regions: "asia"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "002-autopush-0917"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash@002-autopush-0917"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 200
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_quota_server_first_overflow: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        quota_overflow_multiplier_in_seconds: 120
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-dev"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-eu-dev"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "002-batchx-autopush"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash-002-batchx-nonprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-batchx-preprod"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash-002-batchx-prod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-batchx-prod"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash-002-batchx-prod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-batchx-staging"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash-002-batchx-nonprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-dynamic-paygo-test-autopush"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash@002-dynamic-paygo-test-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-eu-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "canada"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-ca-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "germany"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-de-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "japan"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-jp-prod"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-dynamic-paygo-test-preprod"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash@002-dynamic-paygo-test-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-eu-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "canada"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-ca-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "germany"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-de-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "japan"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-jp-prod"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-dynamic-paygo-test-prod"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash@002-dynamic-paygo-test-prod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-eu-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "canada"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-ca-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "germany"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-de-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "japan"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-jp-prod"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-dynamic-paygo-test-staging"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash@002-dynamic-paygo-test-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-batchx-llm-10s"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-eu-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "canada"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-ca-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "germany"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-de-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "japan"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-jp-prod"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002-staging-0917"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash@002-staging-0917"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_quota_server_first_overflow: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        quota_overflow_multiplier_in_seconds: 240
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-dev"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-flash-002-eu-dev"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-1.5-flash-8b"
  value {
    quota_configs {
      key: "002"
      value {
        bouncer_group_id_prefix: "gemini-1.5-flash-8b@002"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        enable_bouncer_global_quota: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-1.5-pro"
  value {
    quota_configs {
      key: "002"
      value {
        bouncer_group_id_prefix: "gemini-1.5-pro@002"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 2000000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "northamerica-northeast1"
          value: "canada"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/batchx/endpoint-gemini-1-5-pro-002-prod"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/batchx/endpoint-gemini-1-5-pro-002-eu-prod"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/vqcoca-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "eu"
        org_based_limit_supported_regions: "asia"
        org_based_limit_supported_regions: "global"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "002-autopush-0911"
      value {
        bouncer_group_id_prefix: "gemini-1.5-pro@002-autopush-0911"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-1\"\n        count_input_tokens_with_rpc: true\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"us\"\n          model_name: \"/batchx/endpoint-gemini-1-5-pro-002-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"eu\"\n          model_name: \"/batchx/endpoint-gemini-1-5-pro-002-eu-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/vqcoca-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/vqcoca-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-usm-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"eu\"\n          model_name: 
\"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"002-batchx-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-1.5-pro-002-batchx-nonprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n          bouncer_project_id: \"vertex-batchx-llm-10s\"\n        }\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 2048\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"002-batchx-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-1.5-pro-002-batchx-prod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n          bouncer_project_id: \"vertex-batchx-llm-10s\"\n        }\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 2048\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"002-batchx-prod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-1.5-pro-002-batchx-prod\"\n        
capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n          bouncer_project_id: \"vertex-batchx-llm-10s\"\n        }\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 2048\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"002-batchx-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-1.5-pro-002-batchx-nonprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n          bouncer_project_id: \"vertex-batchx-llm-10s\"\n        }\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 2048\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"002-staging-0917\"\n      value {\n        bouncer_group_id_prefix: \"gemini-1.5-pro@002-staging-0917\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n     
   capacity_quota_override {\n          key: \"eu\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-1\"\n        count_input_tokens_with_rpc: true\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"us\"\n          model_name: \"/batchx/endpoint-gemini-1-5-pro-002-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"eu\"\n          model_name: \"/batchx/endpoint-gemini-1-5-pro-002-eu-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/vqcoca-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/vqcoca-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-usm-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        org_based_limit_supported_regions: \"us\"\n        
org_based_limit_supported_regions: \"global\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.0-flash\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"us\"\n          model_name: \"/vertex-spot/gemini-v3-s-rev16p5-sc-text-us-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"eu\"\n          model_name: \"/batchx/endpoint-gemini-2-0-flash-001-eu-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: 
\"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"eu\"\n        org_based_limit_supported_regions: \"asia\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        capacity_quota_override {\n          key: \"eu\"\n          value: \"us\"\n        }\n        bouncer_project_id: 
\"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 200\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"us\"\n          model_name: \"/vertex-spot/gemini-v3-s-rev16p5-sc-text-us-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"eu\"\n          model_name: \"/batchx/endpoint-gemini-2-0-flash-001-eu-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        
enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"001-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash@001-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        capacity_quota_override {\n          key: \"eu\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 200\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"us\"\n          model_name: \"/vertex-spot/gemini-v3-s-rev16p5-sc-text-us-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"eu\"\n          model_name: \"/batchx/endpoint-gemini-2-0-flash-001-eu-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"eu\"\n          
model_name: \"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"001-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash@001-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        capacity_quota_override {\n          key: \"eu\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 200\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        
enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"us\"\n          model_name: \"/vertex-spot/gemini-v3-s-rev16p5-sc-text-us-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: MAIN_MODEL32K\n          bouncer_region: \"eu\"\n          model_name: \"/batchx/endpoint-gemini-2-0-flash-001-eu-prod\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: IMAGE_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/coca-plus-mmft-2x2-soft\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TOKENIZER\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-v7p3-hard-tokenizer\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"us\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        dynamic_endpoint_configs {\n          resource_type: AUDIO_TRANSCRIPTION\n          bouncer_region: \"eu\"\n          model_name: \"/vertex-dynamic/audio-usm-draftlm-transcription\"\n        }\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: 
true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.0-flash-lite\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash-lite@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 320\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        capacity_quota_override {\n          key: \"eu\"\n          value: \"eu\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 512\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"eu\"\n        org_based_limit_supported_regions: \"asia\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash-lite@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          
station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        capacity_quota_override {\n          key: \"eu\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 512\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"001-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash-lite@001-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        capacity_quota_override {\n          key: \"eu\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 512\n        tuned_model_share_base_model_quota: true\n        
enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"001-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash-lite@001-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 32767\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        capacity_quota_override {\n          key: \"eu\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 512\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.0-flash-live\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.0-flash@001\"\n       
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        min_precharged_input_tokens: 1024
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-2.0-flash@001"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash@001-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        min_precharged_input_tokens: 200
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        force_throttle_ratio: 0.5
        force_throttle_environments: "autopush"
        vta_station_id_prefix_override: "gemini-2.0-flash@001-autopush"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash@001-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        min_precharged_input_tokens: 200
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-2.0-flash@001-preprod"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "001-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash@001-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        min_precharged_input_tokens: 200
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "gemini-2.0-flash@001-staging"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-2.0-flash-preview-image-generation"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash-preview-image-generation@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash-preview-image-generation@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash-preview-image-generation@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash-preview-image-generation@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash-preview-image-generation@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash-preview-image-generation@default-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.0-flash-preview-image-generation@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 32767
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-2.5-flash"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        vta_station_id_prefix_override: "gemini-2.5-flash@default"
        enable_org_pt_quota_server: true
        force_check_input_tokens_super_quota: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        vta_station_id_prefix_override: "gemini-2.5-flash@default-autopush"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        enable_org_pt_quota_server: true
        force_check_input_tokens_super_quota: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-2.5-flash-ga"
  value {
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia-northeast1"
          value: "jpn"
        }
        capacity_quota_override {
          key: "asia-northeast3"
          value: "kor"
        }
        capacity_quota_override {
          key: "asia-south1"
          value: "ind"
        }
        capacity_quota_override {
          key: "asia-southeast1"
          value: "sin"
        }
        capacity_quota_override {
          key: "australia-southeast1"
          value: "aus"
        }
        capacity_quota_override {
          key: "europe-west2"
          value: "uk"
        }
        capacity_quota_override {
          key: "europe-west3"
          value: "deu"
        }
        capacity_quota_override {
          key: "europe-west9"
          value: "fra"
        }
        capacity_quota_override {
          key: "northamerica-northeast1"
          value: "can"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        vta_station_id_prefix_override: "gemini-2.5-flash@default"
        enable_org_pt_quota_server: true
        force_check_input_tokens_super_quota: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        vta_station_id_prefix_override: "gemini-2.5-flash@default-autopush"
        enable_org_pt_quota_server: true
        enable_org_limit_dry_run: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia-northeast1"
          value: "jpn"
        }
        capacity_quota_override {
          key: "asia-northeast3"
          value: "kor"
        }
        capacity_quota_override {
          key: "asia-south1"
          value: "ind"
        }
        capacity_quota_override {
          key: "asia-southeast1"
          value: "sin"
        }
        capacity_quota_override {
          key: "australia-southeast1"
          value: "aus"
        }
        capacity_quota_override {
          key: "europe-west2"
          value: "uk"
        }
        capacity_quota_override {
          key: "europe-west3"
          value: "deu"
        }
        capacity_quota_override {
          key: "europe-west9"
          value: "fra"
        }
        capacity_quota_override {
          key: "northamerica-northeast1"
          value: "can"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        vta_station_id_prefix_override: "gemini-2.5-flash@default"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.5-flash@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-belowspot/gemini-v3p1-s-rev19-sc-text-us"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-spot/gemini-v3p1-s-rev19-sc-text-eu"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: IMAGE_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/coca-plus-mmft-2x2-soft"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TOKENIZER
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-v7p3-hard-tokenizer"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "us"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        dynamic_endpoint_configs {
          resource_type: AUDIO_TRANSCRIPTION
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/audio-usm-draftlm-transcription"
        }
        vta_station_id_prefix_override: "gemini-2.5-flash@default-staging"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-image-ga\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-image-ga@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-image-ga@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        
enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-image-ga@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-image-ga@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        
enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-latest\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-latest@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-latest@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n 
       }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-latest@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        enable_uta_with_fail_open: true\n      
}\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-latest@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-lite\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        
bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"eu\"\n        org_based_limit_supported_regions: \"asia\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        force_check_input_tokens_super_quota: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        
enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"eu\"\n        org_based_limit_supported_regions: \"asia\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: 
\"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-lite-latest\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-latest@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: 
true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-latest@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-latest@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        
routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-latest@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        
enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-lite-preview\"\n  value {\n    quota_configs {\n      key: \"06-17\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-2.5-flash-lite@default\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"06-17-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n  
        key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-2.5-flash-lite@default-autopush\"\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"06-17-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-2.5-flash-lite@default-preprod\"\n        enable_org_pt_quota_server: true\n        
enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"06-17-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-2.5-flash-lite@default-staging\"\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-lite-preview-09-2025\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-preview-09-2025@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        
routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-preview-09-2025@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        
enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-preview-09-2025@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-lite-preview-09-2025@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override 
{\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-manual-test\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-manual-test@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        
default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-2.5-flash@default-autopush\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-manual-test@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        
enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-2.5-flash-manual-test@default\"\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-2.5-flash@default-staging\"\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-optimized\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-optimized@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        
bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-optimized@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-optimized@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        
}\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-optimized@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-preview-09-2025\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-09-2025@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          
station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-09-2025@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n  
      enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-09-2025@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-09-2025@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        
capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-preview-image\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-image@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-image@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n  
        station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-image@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-image@default-staging\"\n      
  capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-flash-preview-native-audio-dialog\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-native-audio-dialog@default\"\n        default_max_output_tokens: 1\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-native-audio-dialog@default-autopush\"\n        default_max_output_tokens: 1\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-native-audio-dialog@default-preprod\"\n        default_max_output_tokens: 1\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: 
true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-flash-preview-native-audio-dialog@default-staging\"\n        default_max_output_tokens: 1\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-pro-ga\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-ga@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-2.5-pro-ga@default\"\n        enable_org_pt_quota_server: true\n        force_check_input_tokens_super_quota: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-ga@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        
routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-2.5-pro-ga@default-autopush\"\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-ga@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia-northeast1\"\n          value: \"jpn\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        
disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        force_check_input_tokens_super_quota: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-ga@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-ga@default-preprod\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          
station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-ga@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: 
true\n        org_based_limit_supported_regions: \"us\"\n        org_based_limit_supported_regions: \"global\"\n        enable_org_pt_quota_server: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-2.5-pro-latest\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-latest@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_pt_quota_server: true\n        force_check_input_tokens_super_quota: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-2.5-pro-latest@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 65536\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          
max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-latest@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-2.5-pro-latest@default"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-latest@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-2.5-pro-optimized"
  value {
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-optimized@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_pt_quota_server: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-optimized@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        enable_org_pt_quota_server: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-optimized@default-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_pt_quota_server: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-optimized@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        enable_org_pt_quota_server: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-2.5-pro-preview"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-autopush"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-staging"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "05-06"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "05-06-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-autopush"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-autopush"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-preprod"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-1"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 2048
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL32K
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-sc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "us"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        dynamic_endpoint_configs {
          resource_type: MAIN_MODEL1M
          bouncer_region: "eu"
          model_name: "/vertex-dynamic/gemini-v3p1-m-rev18-lc-text"
        }
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-staging"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-2.5-pro-preview-06-05"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-autopush"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-autopush"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-preprod"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-2.5-pro-ga@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 65536
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "gemini-2.5-pro-ga@default-staging"
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-3-flash-preview"
  value {
    quota_configs {
      key: "12-2025"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-preview@12-2025"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "global"
          value: "nondrz"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "12-2025-autopush"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-preview@12-2025-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "global"
          value: "nondrz"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "12-2025-preprod"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-preview@12-2025"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "global"
          value: "nondrz"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-3-flash-preview@12-2025"
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
    quota_configs {
      key: "12-2025-staging"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-preview@12-2025-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "global"
          value: "nondrz"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        uta_populate_metrics_from_resource_usage: true
        enable_uta_with_fail_open: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-3-flash-proudbear-testing-only"
  value {
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-proudbear-testing-only@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-proudbear-testing-only@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-proudbear-testing-only@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-3-flash-proudbear-testing-only@default"
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-3-flash-proudbear-testing-only@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-3-pro-preview\"\n  value {\n    quota_configs {\n      key: \"11-2025\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: 
\"gemini-3.0-pro-eval@default\"\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"11-2025-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-3.0-pro-eval@default-autopush\"\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"11-2025-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        
count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-3.0-pro-eval@default\"\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"11-2025-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        vta_station_id_prefix_override: \"gemini-3.0-pro-eval@default-staging\"\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-3-pro-preview-latest\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3-pro-preview-latest@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n     
     max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3-pro-preview-latest@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3-pro-preview-latest@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        
routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-3-pro-preview-latest@default\"\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3-pro-preview-latest@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia\"\n          value: \"us\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  
}\n}\nmodel_version_quota_config_map {\n  key: \"gemini-3.0-flash-testing-only\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-flash-testing-only@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-flash-testing-only@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        
tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-flash-testing-only@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-3.0-flash-testing-only@default\"\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-flash-testing-only@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 
1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-3.0-pro-eval\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default-autopush\"\n        capacity_quota_type: 
CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-3.0-pro-eval@default\"\n        
uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-eval@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 131072\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 4096\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-3.0-pro-image-preview\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-image-preview@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 8192\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 131072\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        
min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-image-preview@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 8192\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 131072\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-preprod\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-image-preview@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 8192\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 131072\n          bouncer_project_id: 
\"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-3.0-pro-image-preview@default\"\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-image-preview@default-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 8192\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 131072\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_org_based_limit: true\n        uta_populate_metrics_from_resource_usage: true\n        enable_uta_with_fail_open: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"gemini-3.0-pro-image-preview-testing-only\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value 
{\n        bouncer_group_id_prefix: \"gemini-3.0-pro-image-preview@default\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 8192\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 131072\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"gemini-3.0-pro-image-preview@default\"\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        bouncer_group_id_prefix: \"gemini-3.0-pro-image-preview@default-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"32k\"\n          max_input_token_count: 8192\n        }\n        routing_configs {\n          station_identifier: \"1m\"\n          max_input_token_count: 131072\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"global\"\n          value: \"nondrz\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        tuned_model_share_base_model_quota: true\n        enable_bouncer_global_quota: true\n        enable_enterprise_tier: true\n        
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "gemini-3.0-pro-image-preview@default-autopush"
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-3.0-pro-image-preview@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 8192
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 131072
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "global"
          value: "nondrz"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-3.0-pro-image-preview@default"
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-3.0-pro-image-preview@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 8192
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 131072
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "global"
          value: "nondrz"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        vta_station_id_prefix_override: "gemini-3.0-pro-image-preview@default-staging"
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-experimental"
  value {
    quota_configs {
      key: "experimental-0418-autopush"
      value {
        bouncer_group_id_prefix: "gemini-pro@experimental-0418-autopush"
        default_max_output_tokens: 1
        dry_run: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-live-2.5-flash"
  value {
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash@default-preprod"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        org_based_limit_supported_regions: "us"
        org_based_limit_supported_regions: "global"
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-live-2.5-flash-native-audio"
  value {
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash-native-audio@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash-native-audio@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
      }
    }
    quota_configs {
      key: "default-preprod"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash-native-audio@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "gemini-live-2.5-flash-native-audio@default"
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash-native-audio@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        routing_configs {
          station_identifier: "1m"
          max_input_token_count: 1100000
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-live-2.5-flash-preview-native-audio-09-2025"
  value {
    quota_configs {
      key: "default"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash-preview-native-audio-09-2025@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash-preview-native-audio-09-2025@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "gemini-live-2.5-flash-preview-native-audio-09-2025@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "32k"
          max_input_token_count: 131072
        }
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1024
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        enable_enterprise_tier: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_org_based_limit: true
        enable_org_pt_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gemini-pro"
  value {
    quota_configs {
      key: "001-fa-e2e-high"
      value {
        bouncer_group_id_prefix: "gemini-pro@001-fa-e2e-high"
        default_max_output_tokens: 1
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "001-fa-e2e-low"
      value {
        bouncer_group_id_prefix: "gemini-pro@001-fa-e2e-low"
        default_max_output_tokens: 1024
      }
    }
  }
}
model_version_quota_config_map {
  key: "gpt-oss-120b-maas"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "gpt-oss-120b-maas@001"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 131072
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        capacity_quota_override {
          key: "us-west2"
          value: "us-west2"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "gpt-oss-120b-maas@001-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 131072
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "gpt-oss-20b-maas"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "gpt-oss-20b-maas@001"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 131072
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "001-autopush"
      value {
        bouncer_group_id_prefix: "gpt-oss-20b-maas@001-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 1
        routing_configs {
          station_identifier: "lc"
          max_input_token_count: 131072
          bouncer_project_id: "vertex-prediction-llm-30s"
        }
        capacity_quota_override {
          key: "us-central1"
          value: "us-central1"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        count_input_tokens_with_rpc: true
        min_precharged_input_tokens: 1
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagegeneration"
  value {
    quota_configs {
      key: "006"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagen-2.0-edit"
  value {
    quota_configs {
      key: "preview-0627"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagen-3.0-capability"
  value {
    quota_configs {
      key: "001"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagen-3.0-fast-generate"
  value {
    quota_configs {
      key: "001"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagen-3.0-generate"
  value {
    quota_configs {
      key: "001"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
    quota_configs {
      key: "002"
      value {
        disable_quota_server: true
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagen-4.0-fast-generate"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "imagen-4.0-fast-generate@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 400
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-fast-generate@default"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "imagen-4.0-fast-generate@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 400
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-fast-generate@default-autopush"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "imagen-4.0-fast-generate@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 400
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-fast-generate@default-staging"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagen-4.0-generate"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "imagen-4.0-generate@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 400
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-generate@default"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "imagen-4.0-generate@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 400
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-generate@default-autopush"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "imagen-4.0-generate@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 400
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-generate@default-staging"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "imagen-4.0-ultra-generate"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "imagen-4.0-generate@default"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 600
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-generate@default"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-autopush"
      value {
        bouncer_group_id_prefix: "imagen-4.0-generate@default-autopush"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 600
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-generate@default-autopush"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
    quota_configs {
      key: "default-staging"
      value {
        bouncer_group_id_prefix: "imagen-4.0-generate@default-staging"
        capacity_quota_type: CAPACITY_MULTIREGIONAL
        default_max_output_tokens: 600
        capacity_quota_override {
          key: "asia"
          value: "us"
        }
        capacity_quota_override {
          key: "eu"
          value: "us"
        }
        capacity_quota_override {
          key: "global"
          value: "us"
        }
        bouncer_project_id: "vertex-prediction-llm-2"
        disable_quota_server: true
        tuned_model_share_base_model_quota: true
        enable_bouncer_global_quota: true
        disable_region_affix: true
        enable_vta_admission_control: true
        vta_station_id_prefix_override: "imagen-4.0-generate@default-staging"
        org_based_limit_supported_regions: "us"
        enable_pt_quota_check: true
        enable_base_model_config_pt_config: true
        uta_populate_metrics_from_resource_usage: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-ai-mp-private-offer-1.0"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "internal-test-google-ai-mp-private-offer-1.0@001"
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 1
        bouncer_project_id: "vertex-prediction-llm-pinnacle-30s"
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-ai-mp-private-offer-2.0"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "internal-test-google-ai-mp-private-offer-2.0@001"
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 1
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-ai-mp-usage-demo-3.0"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "internal-test-google-ai-mp-usage-demo-3.0@001"
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 1
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-ai-mp-usage-demo-4.0"
  value {
    quota_configs {
      key: "001"
      value {
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 2000
        avg_input_tokens_to_request_size: 5
        default_max_input_tokens: 18000
        enable_qs_overall_capacity_for_tokens: true
        disable_bouncer: true
        caching_max_input_tokens: 16000
        caching_avg_input_tokens_to_request_size: 3
        precharging_cap_for_pt: 20000
        enable_pinnacle_qs_regionalization: true
        enable_pinnacle_qs_regionalization_paygo: true
        context_window_size_based_precharge_tier_configs {
          key: 10000
          value {
            default_max_output_tokens: 100
            default_max_input_tokens: 15000
            avg_input_tokens_to_request_size: 5
            caching_avg_input_tokens_to_request_size: 4
            avg_cache_token_percentage: 0.5
            input_token_to_precharge_tokens_ratio: 7
            cache_input_token_to_precharge_tokens_ratio: 6
          }
        }
        context_window_size_based_precharge_tier_configs {
          key: 1000000
          value {
            default_max_output_tokens: 1000
            default_max_input_tokens: 20000
            avg_input_tokens_to_request_size: 10
            caching_avg_input_tokens_to_request_size: 8
            shared_token_quota_qs_user_prefix: "op-long-global-tokens-"
            batch_shared_token_quota_qs_user_prefix: "bp-long-global-tokens-"
            avg_cache_token_percentage: 0.1
            input_token_to_precharge_tokens_ratio: 14
            cache_input_token_to_precharge_tokens_ratio: 12
            should_use_requested_max_output_tokens: true
          }
        }
        tier_routing_config {
          short_context_byte_limit: 500
          request_size_to_context_window_size_tiers {
            key: 500
            value: 10000
          }
          request_size_to_context_window_size_tiers {
            key: 1000000
            value: 1000000
          }
        }
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-multi-region-test-model"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "multi-region-test-model@001"
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 60
        bouncer_project_id: "vertex-prediction-llm-pinnacle-testing"
        avg_input_tokens_to_request_size: 0.4
        default_max_input_tokens: 100
        enable_qs_overall_capacity_for_tokens: true
        enable_pinnacle_qs_regionalization: true
        enable_pinnacle_qs_regionalization_paygo: true
      }
    }
    quota_configs {
      key: "GLOBAL_ENDPOINT"
      value {
        enable_tpm_quota_split: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-pinnacle-e2e-test"
  value {
    quota_configs {
      key: "001"
      value {
        bouncer_group_id_prefix: "internal-test-google-pinnacle-e2e-test@001"
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 60
        bouncer_project_id: "vertex-prediction-llm-pinnacle-testing"
        avg_input_tokens_to_request_size: 0.4
        default_max_input_tokens: 100
        enable_qs_overall_capacity_for_tokens: true
        caching_max_input_tokens: 100
        caching_avg_input_tokens_to_request_size: 0.04
        enable_weighted_tokens_for_paygo_tpm_quota: true
        enable_tpm_quota_split: true
        enable_pinnacle_qs_regionalization: true
        enable_pinnacle_qs_regionalization_paygo: true
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-pinnacle-load-test-model-1"
  value {
    quota_configs {
      key: "001"
      value {
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 2000
        avg_input_tokens_to_request_size: 5
        default_max_input_tokens: 18000
        enable_qs_overall_capacity_for_tokens: true
        disable_bouncer: true
        caching_max_input_tokens: 16000
        caching_avg_input_tokens_to_request_size: 3
        precharging_cap_for_pt: 20000
        enable_pinnacle_qs_regionalization: true
        enable_pinnacle_qs_regionalization_paygo: true
        context_window_size_based_precharge_tier_configs {
          key: 10000
          value {
            default_max_output_tokens: 100
            default_max_input_tokens: 15000
            avg_input_tokens_to_request_size: 5
            caching_avg_input_tokens_to_request_size: 4
            avg_cache_token_percentage: 0.5
            input_token_to_precharge_tokens_ratio: 7
            cache_input_token_to_precharge_tokens_ratio: 6
          }
        }
        context_window_size_based_precharge_tier_configs {
          key: 1000000
          value {
            default_max_output_tokens: 1000
            default_max_input_tokens: 20000
            avg_input_tokens_to_request_size: 10
            caching_avg_input_tokens_to_request_size: 8
            shared_token_quota_qs_user_prefix: "op-long-global-tokens-"
            batch_shared_token_quota_qs_user_prefix: "bp-long-global-tokens-"
            avg_cache_token_percentage: 0.1
            input_token_to_precharge_tokens_ratio: 14
            cache_input_token_to_precharge_tokens_ratio: 12
          }
        }
        tier_routing_config {
          short_context_byte_limit: 1000
          request_size_to_context_window_size_tiers {
            key: 1000
            value: 10000
          }
          request_size_to_context_window_size_tiers {
            key: 1000000
            value: 1000000
          }
        }
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-pinnacle-load-test-model-2"
  value {
    quota_configs {
      key: "001"
      value {
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 2000
        avg_input_tokens_to_request_size: 5
        default_max_input_tokens: 18000
        enable_qs_overall_capacity_for_tokens: true
        disable_bouncer: true
        caching_max_input_tokens: 16000
        caching_avg_input_tokens_to_request_size: 3
        precharging_cap_for_pt: 20000
        enable_pinnacle_qs_regionalization: true
        enable_pinnacle_qs_regionalization_paygo: true
        context_window_size_based_precharge_tier_configs {
          key: 10000
          value {
            default_max_output_tokens: 100
            default_max_input_tokens: 15000
            avg_input_tokens_to_request_size: 5
            caching_avg_input_tokens_to_request_size: 4
            avg_cache_token_percentage: 0.5
            input_token_to_precharge_tokens_ratio: 7
            cache_input_token_to_precharge_tokens_ratio: 6
          }
        }
        context_window_size_based_precharge_tier_configs {
          key: 1000000
          value {
            default_max_output_tokens: 1000
            default_max_input_tokens: 20000
            avg_input_tokens_to_request_size: 10
            caching_avg_input_tokens_to_request_size: 8
            avg_cache_token_percentage: 0.1
            input_token_to_precharge_tokens_ratio: 14
            cache_input_token_to_precharge_tokens_ratio: 12
          }
        }
        tier_routing_config {
          short_context_byte_limit: 5000
          request_size_to_context_window_size_tiers {
            key: 1000
            value: 10000
          }
          request_size_to_context_window_size_tiers {
            key: 1000000
            value: 1000000
          }
        }
      }
    }
  }
}
model_version_quota_config_map {
  key: "internal-test-google-pinnacle-load-test-model-3"
  value {
    quota_configs {
      key: "001"
      value {
        capacity_quota_type: CAPACITY_REGIONAL
        default_max_output_tokens: 2000
        avg_input_tokens_to_request_size: 5
        default_max_input_tokens: 18000
        enable_qs_overall_capacity_for_tokens: true
        disable_bouncer: true
        caching_max_input_tokens: 16000
        caching_avg_input_tokens_to_request_size: 3
        precharging_cap_for_pt: 20000
        enable_pinnacle_qs_regionalization: true
        enable_pinnacle_qs_regionalization_paygo: true
        context_window_size_based_precharge_tier_configs {
          key: 10000
          value {
            default_max_output_tokens: 100
            default_max_input_tokens: 15000
            avg_input_tokens_to_request_size: 5
            caching_avg_input_tokens_to_request_size: 4
            avg_cache_token_percentage: 0.5
input_token_to_precharge_tokens_ratio: 7\n            cache_input_token_to_precharge_tokens_ratio: 6\n          }\n        }\n        context_window_size_based_precharge_tier_configs {\n          key: 1000000\n          value {\n            default_max_output_tokens: 1000\n            default_max_input_tokens: 20000\n            avg_input_tokens_to_request_size: 10\n            caching_avg_input_tokens_to_request_size: 8\n            disable_pt_check: true\n            shared_token_quota_qs_user_prefix: \"op-long-global-tokens-\"\n            batch_shared_token_quota_qs_user_prefix: \"bp-long-global-tokens-\"\n            avg_cache_token_percentage: 0.1\n            input_token_to_precharge_tokens_ratio: 14\n            cache_input_token_to_precharge_tokens_ratio: 12\n          }\n        }\n        tier_routing_config {\n          short_context_byte_limit: 1000\n          request_size_to_context_window_size_tiers {\n            key: 1000\n            value: 10000\n          }\n          request_size_to_context_window_size_tiers {\n            key: 1000000\n            value: 1000000\n          }\n        }\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-load-test-model-4\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 2000\n        avg_input_tokens_to_request_size: 5\n        default_max_input_tokens: 18000\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 16000\n        caching_avg_input_tokens_to_request_size: 3\n        precharging_cap_for_pt: 20000\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n        context_window_size_based_precharge_tier_configs {\n          key: 10000\n          value {\n            default_max_output_tokens: 100\n            default_max_input_tokens: 15000\n 
           avg_input_tokens_to_request_size: 5\n            caching_avg_input_tokens_to_request_size: 4\n            avg_cache_token_percentage: 0.5\n            input_token_to_precharge_tokens_ratio: 7\n            cache_input_token_to_precharge_tokens_ratio: 6\n          }\n        }\n        context_window_size_based_precharge_tier_configs {\n          key: 1000000\n          value {\n            default_max_output_tokens: 1000\n            default_max_input_tokens: 20000\n            avg_input_tokens_to_request_size: 10\n            caching_avg_input_tokens_to_request_size: 8\n            shared_token_quota_qs_user_prefix: \"op-long-global-tokens-\"\n            batch_shared_token_quota_qs_user_prefix: \"bp-long-global-tokens-\"\n            avg_cache_token_percentage: 0.1\n            input_token_to_precharge_tokens_ratio: 14\n            cache_input_token_to_precharge_tokens_ratio: 12\n            should_use_requested_max_output_tokens: true\n          }\n        }\n        tier_routing_config {\n          short_context_byte_limit: 500\n          request_size_to_context_window_size_tiers {\n            key: 500\n            value: 1\n          }\n          request_size_to_context_window_size_tiers {\n            key: 1000000\n            value: 10001\n          }\n        }\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-load-test-model-5\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-load-test-model-5@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1640\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 15347\n        enable_qs_overall_capacity_for_tokens: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: 
\"internal-test-google-pinnacle-load-test-model-6\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-load-test-model-6@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1265\n        avg_input_tokens_to_request_size: 0.184\n        default_max_input_tokens: 6245\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-load-test-model-7\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-load-test-model-7@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 2705\n        avg_input_tokens_to_request_size: 0.1582\n        default_max_input_tokens: 14461\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-load-test-model-8\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-load-test-model-8@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 2705\n        avg_input_tokens_to_request_size: 0.1582\n        default_max_input_tokens: 14461\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-playground-1\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-playground-1@001\"\n        capacity_quota_type: 
CAPACITY_REGIONAL\n        default_max_output_tokens: 10\n        avg_input_tokens_to_request_size: 1\n        default_max_input_tokens: 1000\n        disable_bouncer: true\n        caching_max_input_tokens: 1000\n        caching_avg_input_tokens_to_request_size: 1\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-quota-test-model-1\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-quota-test-model-1@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 100\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 1000\n        enable_qs_overall_capacity_for_requests: true\n        disable_bouncer: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-quota-test-model-2\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-quota-test-model-2@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 100\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 1000\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-web-search-model\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-web-search-model@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 100\n        avg_input_tokens_to_request_size: 0.1656\n        
default_max_input_tokens: 1000\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-pinnacle-websearch-test\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"internal-test-google-pinnacle-websearch-test@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 100\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 1000\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n        enable_tpm_quota_split: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"internal-test-google-test-model\"\n  value {\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n    quota_configs {\n      key: \"test-version\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1234\n        avg_input_tokens_to_request_size: 0.1656\n        default_max_input_tokens: 14000\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        caching_max_input_tokens: 15347\n        caching_avg_input_tokens_to_request_size: 0.1363\n        precharging_cap_for_pt: 16987\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"kimi-k2-thinking-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: 
\"kimi-k2-thinking-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 163840\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-east1\"\n          value: \"us-east1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"kimi-k2-thinking-maas@001\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"kimi-k2-thinking-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 163840\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-east1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"kimi-k2-thinking-maas@001-autopush\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"llama-3.3-70b-instruct-maas\"\n 
 value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"llama-3.3-70b-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 131072\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-central1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"llama-3.3-70b-instruct-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 131072\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-central1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"llama-4-maverick-17b-128e-instruct-maas\"\n  value {\n    
quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"llama-4-maverick-17b-128e-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-east5\"\n          value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"llama-4-maverick-17b-128e-instruct-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 1100000\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"llama-4-scout-17b-16e-instruct-maas\"\n  value {\n    
quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"llama-4-scout-17b-16e-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 10485760\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-east5\"\n          value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"llama-4-scout-17b-16e-instruct-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 10485760\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"llama3-405b-instruct-maas\"\n  value {\n    quota_configs {\n     
 key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"llama3-405b-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 8192\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"llama3-70b-instruct-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"llama3-70b-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 8192\n        dry_run: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"llama3-8b-instruct-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"llama3-8b-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 8192\n        dry_run: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"minimax-m2-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"minimax-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 163840\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-east1\"\n          value: \"us-east1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"minimax-maas@001\"\n        enable_pt_quota_check: true\n        
enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"minimax-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 163840\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-east1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"minimax-maas@001-autopush\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-codestral-2\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_tpm_quota_split: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-codestral-2501\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"mistralai-codestral-2501@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        
enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-ministral-3b-2410\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        disable_bouncer: true\n        disable_quota_server: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-mistral-large-2411\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"mistralai-mistral-large-2411@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-mistral-medium-3\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-mistral-nemo\"\n  value {\n    quota_configs {\n      key: \"2407\"\n      value {\n        bouncer_group_id_prefix: \"mistralai-mistral-nemo@2407\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        
enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-mistral-ocr-2505\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"mistralai-mistral-ocr-2505@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-mistral-small-2503\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"mistralai-mistral-small-2503@001\"\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"mistralai-mistral-staging\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        capacity_quota_type: CAPACITY_REGIONAL\n        default_max_output_tokens: 1\n        enable_qs_overall_capacity_for_tokens: true\n        disable_bouncer: true\n        enable_pinnacle_qs_regionalization: true\n        enable_pinnacle_qs_regionalization_paygo: true\n      }\n    }\n    quota_configs {\n      key: \"GLOBAL_ENDPOINT\"\n      value {\n      }\n    }\n  
}\n}\nmodel_version_quota_config_map {\n  key: \"openmaas-2.0-dsq-test\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"qwen3-coder-480b-a35b-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 163840\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia-south2\"\n          value: \"asia-south2\"\n        }\n        capacity_quota_override {\n          key: \"us-south1\"\n          value: \"us-south1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"qwen3-coder-480b-a35b-instruct-maas@001\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"deepseek-r1-0528-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 163840\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-central1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: 
true\n        vta_station_id_prefix_override: \"deepseek-r1-0528-maas@001-autopush\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-staging\"\n      value {\n        bouncer_group_id_prefix: \"deepseek-r1-0528-maas@001-staging\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 163840\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-central1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1024\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        vta_station_id_prefix_override: \"deepseek-r1-0528-maas@001-staging\"\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"qwen3-235b-a22b-instruct-2507-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"qwen3-235b-a22b-instruct-2507-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-east5\"\n          value: \"us-east5\"\n        }\n        capacity_quota_override {\n          key: \"us-south1\"\n          value: \"us-south1\"\n        }\n        
bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"qwen3-235b-a22b-instruct-2507-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-south1\"\n          value: \"us-south1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"qwen3-coder-480b-a35b-instruct-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"qwen3-coder-480b-a35b-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"asia-south2\"\n          value: \"asia-south2\"\n        }\n        
capacity_quota_override {\n          key: \"us-south1\"\n          value: \"us-south1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"qwen3-coder-480b-a35b-instruct-maas@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-south1\"\n          value: \"us-south1\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"qwen3-next-80b-a3b-instruct-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"qwen3-next-80b-a3b-instruct-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        
capacity_quota_override {\n          key: \"us-east5\"\n          value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"qwen3-next-80b-a3b-thinking-maas\"\n  value {\n    quota_configs {\n      key: \"001\"\n      value {\n        bouncer_group_id_prefix: \"qwen3-next-80b-a3b-thinking-maas@001\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-east5\"\n 
         value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n    quota_configs {\n      key: \"001-autopush\"\n      value {\n        bouncer_group_id_prefix: \"@001-autopush\"\n        capacity_quota_type: CAPACITY_MULTIREGIONAL\n        default_max_output_tokens: 1\n        routing_configs {\n          station_identifier: \"lc\"\n          max_input_token_count: 262144\n          bouncer_project_id: \"vertex-prediction-llm-30s\"\n        }\n        capacity_quota_override {\n          key: \"us-central1\"\n          value: \"us-east5\"\n        }\n        bouncer_project_id: \"vertex-prediction-llm-2\"\n        count_input_tokens_with_rpc: true\n        min_precharged_input_tokens: 1\n        enable_bouncer_global_quota: true\n        disable_region_affix: true\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n        uta_populate_metrics_from_resource_usage: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"text-embedding\"\n  value {\n    quota_configs {\n      key: \"005\"\n      value {\n        default_max_output_tokens: 1\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"veo-3.0-fast-generate-001\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: 
\"default-autopush\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"veo-3.0-generate-001\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"veo-3.1-fast-generate-001\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"veo-3.1-fast-generate-preview\"\n  value {\n    quota_configs {\n      key: \"default\"\n      
value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"veo-3.1-generate-001\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nmodel_version_quota_config_map {\n  key: \"veo-3.1-generate-preview\"\n  value {\n    quota_configs {\n      key: \"default\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-autopush\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        enable_base_model_config_pt_config: true\n      }\n    }\n    quota_configs {\n      key: \"default-staging\"\n      value {\n        enable_vta_admission_control: true\n        enable_pt_quota_check: true\n        
enable_base_model_config_pt_config: true\n      }\n    }\n  }\n}\nuta_base_model_config {\n  enable_all_models_uta_reporting: true\n  use_user_project_number_for_batch: true\n}\n"
id: 0
'
	BaseModelConfigFeature__enabled: 'name: "BaseModelConfigFeature__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	BatchSharedQpmQuota__enabled: 'name: "BatchSharedQpmQuota__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	DisablePinnacleBilling__enabled: 'name: "DisablePinnacleBilling__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableCachingMonitoringConfig__enabled: 'name: "EnableCachingMonitoringConfig__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableExplicitCache__blocked_model_ids: 'name: "EnableExplicitCache__blocked_model_ids"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.0-pro-001\"\nelement: \"gemini-1.0-pro-002\"\nelement: \"gemini-2.5-flash-preview-image-generation\"\nelement: \"gemini-2.5-flash-image-generation\"\nelement: \"gemini-2.5-pro-computer-use-preview\"\nelement: \"gemini-3.0-pro-image-preview\"\n"
id: 0
'
	EnableExportTokensMonarch__enabled: 'name: "EnableExportTokensMonarch__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableMigrateDeepseekToOpenMaas2__enabled: 'name: "EnableMigrateDeepseekToOpenMaas2__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableServingSpecPtConfig__enabled: 'name: "EnableServingSpecPtConfig__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 702
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 25
}
id: 0
'
	EnableVertexAiPredictionLlmRequestsModalityTokenCountPerMinutePerModelQuotaCheckFeature__enabled: 'name: "EnableVertexAiPredictionLlmRequestsModalityTokenCountPerMinutePerModelQuotaCheckFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableVertexAiPredictionLlmRequestsWithLogprobsPerDayPerModelQuotaCheckFeature__enabled: 'name: "EnableVertexAiPredictionLlmRequestsWithLogprobsPerDayPerModelQuotaCheckFeature__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	EnableVertexAiPredictionLlmRequestsWithLogprobsPerDayPerModelQuotaCheckFeature__enabled_models: 'name: "EnableVertexAiPredictionLlmRequestsWithLogprobsPerDayPerModelQuotaCheckFeature__enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.5-flash-001\"\nelement: \"gemini-1.5-flash-002\"\nelement: \"gemini-1.5-flash\"\nelement: \"gemini-experimental\"\nelement: \"gemini-2.0-flash\"\nelement: \"gemini-2.0-flash-lite\"\nelement: \"gemini-2.5-pro-preview\"\nelement: \"gemini-2.5-flash-preview\"\n"
id: 0
'
	EnableVertexAiPredictionLlmRequestsWithMultimodalityOutPerMinutePerModelQuotaCheckFeature__enabled: 'name: "EnableVertexAiPredictionLlmRequestsWithMultimodalityOutPerMinutePerModelQuotaCheckFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableVertexAiPredictionLlmRequestsWithMultimodalityOutPerMinutePerModelQuotaCheckFeature__enabled_models: 'name: "EnableVertexAiPredictionLlmRequestsWithMultimodalityOutPerMinutePerModelQuotaCheckFeature__enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-experimental\"\n"
id: 0
'
	FairAllocationTokenMonitoring__enabled: 'name: "FairAllocationTokenMonitoring__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	ForceRedirectProjectAllowlist__enabled: 'name: "ForceRedirectProjectAllowlist__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiBillingModelNameMappingFeature__base_model_id_to_billing_model_name_map: 'name: "GeminiBillingModelNameMappingFeature__base_model_id_to_billing_model_name_map"
type: STRING
base_value: "gemini-2.5-flash-preview=gemini-2.5-flash,gemini-2.5-pro-preview=gemini-2.5-pro"
id: 0
'
	GeminiBillingModelNameMappingFeature__billing_model_name_to_billing_label_model_name_map: 'name: "GeminiBillingModelNameMappingFeature__billing_model_name_to_billing_label_model_name_map"
type: STRING
base_value: "gemini-2.5-flash-preview=gemini-2.5-flash,gemini-2.5-pro-preview=gemini-2.5-pro,gemini-2.5-flash-001=gemini-2.5-flash,gemini-2.5-pro-001=gemini-2.5-pro,gemini-2.0-flash-preview-image-generation=gemini-2.0-flash,gemini-2.5-flash-lite-preview=gemini-2.5-flash-lite,gemini-2.5-pro-ga=gemini-2.5-pro,gemini-2.5-flash-image=gemini-2.5-flash-ga,gemini-2.5-flash-image-ga=gemini-2.5-flash-ga,computer-use-preview=gemini-2.5-pro"
id: 0
'
	GeminiFlashLiveApiDsq__enabled: 'name: "GeminiFlashLiveApiDsq__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiFlashLiveApiDsq__sample_rate: 'name: "GeminiFlashLiveApiDsq__sample_rate"
type: DOUBLE
base_value: "0.1"
id: 0
'
	GeminiInternalToExternalModelNameMappingFeature__internal_to_external_model_id_map: 'name: "GeminiInternalToExternalModelNameMappingFeature__internal_to_external_model_id_map"
type: STRING
base_value: "gemini-2.0-thinking=gemini-2.0-flash-thinking,gemini-2.0-flash-preview=gemini-2.5-flash-preview,gemini-2.0-flash-exp=gemini-2.0-flash-preview-image-generation,gemini-2.5-flash-preview-native-audio-dialog=gemini-2.5-flash-native-audio-dialog,gemini-2.5-pro-preview-06-05=gemini-2.5-pro-preview,gemini-2.5-flash-latest=gemini-2.5-flash-ga,gemini-2.5-flash-preview-09-2025=gemini-2.5-flash-ga,gemini-2.5-pro-latest=gemini-2.5-pro-preview,gemini-live-2.5-flash=gemini-2.5-flash-native-audio-dialog,gemini-live-2.5-flash-preview-native-audio-09-2025=gemini-2.5-flash-native-audio-dialog,gemini-2.5-flash-lite-latest=gemini-2.5-flash-lite,gemini-2.5-flash-lite-preview-09-2025=gemini-2.5-flash-lite,gemini-2.5-flash-preview-tts=gemini-2.5-flash-tts,gemini-2.5-pro-preview-tts=gemini-2.5-pro-tts,gemini-2.5-flash-preview-image=gemini-2.5-flash-image,gemini-2.5-flash-lite-tts=gemini-2.5-flash-tts,gemini-2.5-flash-direct=gemini-2.5-flash-ga,gemini-2.5-flash-lite-direct=gemini-2.5-flash-lite,gemini-2.5-pro-direct=gemini-2.5-pro,gemini-2.5-flash-image-direct=gemini-2.5-flash-image-ga"
id: 0
'
	GenaiTokenBasedBillingFeature__enable_api_location_tracking_label: 'name: "GenaiTokenBasedBillingFeature__enable_api_location_tracking_label"
type: BOOL
base_value: "TRUE"
id: 0
'
	GenaiTokenBasedBillingFeature__models_enabled_token_based_billing: 'name: "GenaiTokenBasedBillingFeature__models_enabled_token_based_billing"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-2.0-pro\"\nelement: \"gemini-2.0-flash\"\nelement: \"gemini-2.0-flash-exp\"\nelement: \"gemini-2.0-flash-lite\"\nelement: \"gemini-2.0-flash-live\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-2.5-flash-preview\"\nelement: \"gemini-2.5-pro-preview\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-pro-preview-06-05\"\nelement: \"gemini-2.5-flash-live\"\nelement: \"gemini-embedding\"\nelement: \"gemini-2.0-embedding\"\nelement: \"gemini-2.5-embedding\"\nelement: \"gemini-2.5-flash-manual-test\"\nelement: \"gemini-experimental\"\nelement: \"text-embedding-large-001\"\nelement: \"gemini-2.5-flash-native-audio-dialog\"\nelement: \"gemini-2.5-flash-ga\"\nelement: \"gemini-2.5-flash-lite\"\nelement: \"gemini-live-2.5-flash\"\nelement: \"gemini-live-2.5-flash-preview-native-audio-09-2025\"\nelement: \"gemini-2.5-pro-ga\"\nelement: \"gemini-2.5-flash-lite-preview\"\nelement: \"gemini-2.5-flash-tts\"\nelement: \"gemini-2.5-pro-tts\"\nelement: \"gemini-2.5-flash-image\"\nelement: \"gemini-2.5-flash-preview-image\"\nelement: \"gemini-2.5-flash-preview-09-2025\"\nelement: \"gemini-2.5-flash-lite-preview-09-2025\"\nelement: \"gemini-2.5-flash-image-ga\"\nelement: \"gemini-3-pro-preview\"\nelement: \"gemini-3.0-pro-image-preview\"\n"
id: 0
'
	GlobalEndpointCloudMetrics__enabled: 'name: "GlobalEndpointCloudMetrics__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	HarpoonHttp2__enabled: 'name: "HarpoonHttp2__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	OpenMaasConfig__config: 'name: "OpenMaasConfig__config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.OpenMaaSConfig"
base_value: "model_configs {\n  publisher: \"google\"\n  model: \"openmaas-2.0-test\"\n  container_model: \"\"\n  enable_billing: true\n  base_version: \"001\"\n  model_rai_config {\n    csam {\n    }\n  }\n  clear_cached_tokens_in_response: true\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n}\nmodel_configs {\n  publisher: \"google\"\n  model: \"openmaas-2.0-dsq-test\"\n  container_model: \"\"\n  enable_billing: true\n  base_version: \"001\"\n  enable_cached_token_billing: true\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  cached_token_discount: 0.9\n}\nmodel_configs {\n  publisher: \"openai\"\n  model: \"gpt-oss-20b-maas\"\n  container_model: \"\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 32768\n    enable_forced_tool_calls: false\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n}\nmodel_configs {\n  publisher: \"openai\"\n  model: \"gpt-oss-120b-maas\"\n  container_model: \"\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 131072\n    enable_forced_tool_calls: false\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"us-west2\"\n  }\n}\nmodel_configs {\n  publisher: \"qwen\"\n  model: \"qwen3-235b-a22b-instruct-2507-maas\"\n  container_model: \"qwen\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 16384\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"us-east5\"\n  }\n  reject_us_central1_requests: true\n}\nmodel_configs {\n  publisher: \"qwen\"\n  
model: \"qwen3-coder-480b-a35b-instruct-maas\"\n  container_model: \"qwen\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 32768\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  clear_cached_tokens_in_response: true\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"asia-south2\"\n  }\n  inference_gateway_config {\n    model_name: \"qwen3-coder-480b-a35b-instruct-maas\"\n  }\n  reject_us_central1_requests: true\n}\nmodel_configs {\n  publisher: \"deepseek-ai\"\n  model: \"deepseek-v3.1-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 32768\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  inference_gateway_config {\n    model_name: \"deepseek-v3.1-maas\"\n  }\n  enable_thinking_chat_template_kwargs: \"thinking\"\n  enable_think_tag_prefixing: true\n  reject_us_central1_requests: true\n}\nmodel_configs {\n  publisher: \"deepseek-ai\"\n  model: \"deepseek-r1-0528-maas\"\n  container_model: \"deepseek-r1-0528\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 32768\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  migration_mappings {\n    region: \"us-central1\"\n    endpoint: \"projects/68174598814/locations/us-central1/endpoints/internal-oss-placeholder-1-instruct-maas\"\n  }\n  migration_mappings {\n    region: \"asia-southeast1\"\n    endpoint: \"projects/68174598814/locations/asia-southeast1/endpoints/internal-oss-placeholder-1-instruct-maas\"\n  }\n}\nmodel_configs {\n  publisher: \"qwen\"\n  model: \"qwen3-next-80b-a3b-instruct-maas\"\n  enable_billing: true\n  base_version: 
\"001\"\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"us-east5\"\n  }\n  global_endpoint_only_region_configs {\n    region: \"europe-west4\"\n  }\n}\nmodel_configs {\n  publisher: \"qwen\"\n  model: \"qwen3-next-80b-a3b-thinking-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"us-east5\"\n  }\n  global_endpoint_only_region_configs {\n    region: \"europe-west4\"\n  }\n}\nmodel_configs {\n  publisher: \"deepseek-ai\"\n  model: \"deepseek-ocr-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 8191\n    supports_multi_turn: false\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"us-central1\"\n  }\n}\nmodel_configs {\n  publisher: \"minimaxai\"\n  model: \"minimax-m2-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    inject_implicit_max_output_tokens: 4096\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"us-central1\"\n  }\n}\nmodel_configs {\n  publisher: \"moonshotai\"\n  model: \"kimi-k2-thinking-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    inject_implicit_max_output_tokens: 4096\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  
global_endpoint_only_region_configs {\n    region: \"us-east1\"\n  }\n  reject_us_central1_requests: true\n}\nmodel_configs {\n  publisher: \"deepseek-ai\"\n  model: \"deepseek-v3.2-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 32768\n    inject_implicit_max_output_tokens: 32768\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"asia-southeast1\"\n  }\n  reject_us_central1_requests: true\n}\nmodel_configs {\n  publisher: \"intfloat\"\n  model: \"multilingual-e5-small-maas\"\n  container_model: \"\"\n  enable_billing: true\n  base_version: \"001\"\n  model_functionality: MODEL_FUNCTIONALITY_EMBEDDINGS\n}\nmodel_configs {\n  publisher: \"intfloat\"\n  model: \"multilingual-e5-large-instruct-maas\"\n  container_model: \"\"\n  enable_billing: true\n  base_version: \"001\"\n  model_functionality: MODEL_FUNCTIONALITY_EMBEDDINGS\n}\nmodel_configs {\n  publisher: \"google\"\n  model: \"alpha-genome-001\"\n  base_version: \"001\"\n  model_functionality: MODEL_FUNCTIONALITY_ALPHA_GENOME\n}\nglobal_rai_config {\n  csam {\n    model_name: \"/vertex/safesearch-llm-safety-v2\"\n    threshold: 0.8\n    uniserve: true\n  }\n  adaptive_chunking: 1\n  adaptive_chunking: 1\n  adaptive_chunking: 16\n  adaptive_chunking: 16\n  adaptive_chunking: 32\n  adaptive_chunking: 32\n  adaptive_chunking: 48\n  adaptive_chunking: 48\n  adaptive_chunking: 64\n  min_streaming_response_count: 2\n}\nautopush_model_configs {\n  publisher: \"deepseek-ai\"\n  model: \"deepseek-r1-0528-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 32768\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  migration_mappings {\n    region: 
\"us-central1\"\n    endpoint: \"projects/313440725442/locations/us-central1/endpoints/internal-oss-placeholder-1-instruct-maas\"\n  }\n}\nautopush_model_configs {\n  publisher: \"minimaxai\"\n  model: \"minimax-m2-maas\"\n  enable_billing: true\n  base_version: \"001\"\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  global_endpoint_only_region_configs {\n    region: \"us-central1\"\n  }\n  max_request_timeout_seconds: 1800\n}\nstaging_model_configs {\n  publisher: \"qwen\"\n  model: \"qwen3-coder-480b-a35b-instruct-maas\"\n  container_model: \"qwen\"\n  enable_billing: true\n  base_version: \"001\"\n  request_parameter_config {\n    max_output_tokens: 32768\n  }\n  model_rai_config {\n    csam {\n    }\n  }\n  enable_cached_token_billing: false\n  clear_cached_tokens_in_response: true\n  model_functionality: MODEL_FUNCTIONALITY_CHAT_COMPLETIONS\n  inference_gateway_config {\n    model_name: \"qwen3-coder-480b-a35b-instruct-maas-with-inference-gateway\"\n  }\n}\n"
id: 0
'
	OpenMaasConfig__enabled: 'name: "OpenMaasConfig__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleBillingCacheMetricReporting__enabled: 'name: "PinnacleBillingCacheMetricReporting__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	PinnacleCsamShortTermDryRun__enabled: 'name: "PinnacleCsamShortTermDryRun__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleCsamShortTerm__enabled: 'name: "PinnacleCsamShortTerm__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleDisablePromptCaching__enabled: 'name: "PinnacleDisablePromptCaching__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleGlobalEndpointProjectAllowlist__enabled: 'name: "PinnacleGlobalEndpointProjectAllowlist__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleHighRequestQos__enabled: 'name: "PinnacleHighRequestQos__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnaclePtRetry__enabled: 'name: "PinnaclePtRetry__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleRetryTransientError__enabled: 'name: "PinnacleRetryTransientError__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleTokenizer__enabled: 'name: "PinnacleTokenizer__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PtTrialProjects__enable_paygo_billing: 'name: "PtTrialProjects__enable_paygo_billing"
type: BOOL
base_value: "FALSE"
id: 0
'
	PtTrialProjects__enable_paygo_billing_per_model: 'name: "PtTrialProjects__enable_paygo_billing_per_model"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.PtTiralProjectsAllowlist"
base_value: ""
id: 0
'
	PublisherCriticalityOverrides__criticality_overrides: 'name: "PublisherCriticalityOverrides__criticality_overrides"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.PublisherCriticalityOverrides"
base_value: ""
id: 0
'
	PublisherCriticalityOverrides__enabled: 'name: "PublisherCriticalityOverrides__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	SharedPerBaseQuotaDeprecation__enabled: 'name: "SharedPerBaseQuotaDeprecation__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	TtftUseRequestReceiveTime__enabled: 'name: "TtftUseRequestReceiveTime__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	V1pUnifiedRetriesConfigEnabled__enabled: 'name: "V1pUnifiedRetriesConfigEnabled__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 92
}
id: 0
'
	V1pUnifiedRetriesConfig__unified_retry_config_textproto: 'name: "V1pUnifiedRetriesConfig__unified_retry_config_textproto"
type: STRING
base_value: "\n## -- autogenerated do not remove comments -- ##\ndsq_retries_config {\n  config_id: \"DSQ Check Retries Config (Regional)\"\n  retry_strategy {\n    min_delay {\n      seconds: 5\n    }\n    max_delay {\n      seconds: 15\n    }\n    max_retries: 2\n    request_deadline_fraction: 1\n  }\n  match_criteria {\n    model_ids: \"gemini-[2-5].*\"\n    criticalities: CRITICAL\n    criticalities: SHEDDABLE_PLUS\n    request_sources: REQUEST_SOURCE_PAYGO\n    endpoint_type: ENDPOINT_TYPE_REGIONAL_ONLY\n  }\n}\ndsq_retries_config {\n  config_id: \"DSQ Check Retries Config (Global)\"\n  retry_strategy {\n    min_delay {\n      seconds: 5\n    }\n    max_delay {\n      seconds: 15\n    }\n    max_retries: 2\n    request_deadline_fraction: 1\n  }\n  match_criteria {\n    model_ids: \"gemini-[2-5].*\"\n    criticalities: CRITICAL_PLUS\n    request_sources: REQUEST_SOURCE_PROVISIONED_THROUGHPUT\n    endpoint_type: ENDPOINT_TYPE_GLOBAL_ONLY\n  }\n  match_criteria {\n    model_ids: \"gemini-[2-5].*\"\n    criticalities: CRITICAL\n    request_sources: REQUEST_SOURCE_PAYGO\n    endpoint_type: ENDPOINT_TYPE_GLOBAL_ONLY\n  }\n  match_criteria {\n    model_ids: \"gemini-[2-5].*\"\n    criticalities: SHEDDABLE_PLUS\n    request_sources: REQUEST_SOURCE_PAYGO\n    endpoint_type: ENDPOINT_TYPE_GLOBAL_ONLY\n  }\n}\ngenerate_multi_modal_retries_config {\n  config_id: \"GenerateMultiModal RPC retries (Regional)\"\n  error_codes: RESOURCE_EXHAUSTED\n  error_codes: UNAVAILABLE\n  error_codes: INTERNAL\n  retry_strategy {\n    min_delay {\n      seconds: 5\n    }\n    max_delay {\n      seconds: 15\n    }\n    max_retries: 2\n    request_deadline_fraction: 1\n  }\n  retry_thresholds {\n    threshold_type: PER_MODEL_RETRY_RATE\n    threshold: 0.2\n    threshold_duration {\n      seconds: 300\n    }\n  }\n  retry_thresholds {\n    threshold_type: PER_MODEL_RETRY_RATE_LONG_CONTEXT\n    threshold: 0.05\n    threshold_duration {\n      seconds: 300\n    }\n  }\n  
retry_threshold_fallback_behavior: RETRY_THRESHOLD_FALLBACK_BEHAVIOR_CLOSE\n  match_criteria {\n    model_ids: \"gemini-[2-5].*\"\n    criticalities: CRITICAL\n    criticalities: CRITICAL_PLUS\n    criticalities: SHEDDABLE_PLUS\n    endpoint_type: ENDPOINT_TYPE_REGIONAL_ONLY\n  }\n}\ngenerate_multi_modal_retries_config {\n  config_id: \"GenerateMultiModal RPC retries (Global)\"\n  error_codes: RESOURCE_EXHAUSTED\n  error_codes: UNAVAILABLE\n  error_codes: INTERNAL\n  retry_strategy {\n    min_delay {\n      seconds: 1\n    }\n    max_delay {\n      seconds: 5\n    }\n    max_retries: 2\n    request_deadline_fraction: 1\n  }\n  retry_thresholds {\n    threshold_type: PER_MODEL_RETRY_RATE\n    threshold: 1\n    threshold_duration {\n      seconds: 10\n    }\n  }\n  retry_thresholds {\n    threshold_type: PER_MODEL_RETRY_RATE_LONG_CONTEXT\n    threshold: 1\n    threshold_duration {\n      seconds: 10\n    }\n  }\n  retry_threshold_fallback_behavior: RETRY_THRESHOLD_FALLBACK_BEHAVIOR_OPEN\n  match_criteria {\n    model_ids: \"gemini-[2-5].*\"\n    criticalities: CRITICAL\n    criticalities: CRITICAL_PLUS\n    criticalities: SHEDDABLE_PLUS\n    endpoint_type: ENDPOINT_TYPE_GLOBAL_ONLY\n  }\n}\ncount_tokens_retries_config {\n  config_id: \"Count Tokens RPC Retries\"\n  error_codes: DEADLINE_EXCEEDED\n  error_codes: RESOURCE_EXHAUSTED\n  error_codes: FAILED_PRECONDITION\n  error_codes: INTERNAL\n  error_codes: UNAVAILABLE\n  error_codes: INVALID_ARGUMENT\n  retry_strategy {\n    min_delay {\n      nanos: 100000000\n    }\n    max_delay {\n      seconds: 3\n    }\n    max_retries: 3\n    request_deadline_fraction: 1\n  }\n  match_criteria {\n    criticalities: CRITICAL_PLUS\n    request_sources: REQUEST_SOURCE_PROVISIONED_THROUGHPUT\n  }\n  match_criteria {\n    criticalities: CRITICAL\n    criticalities: SHEDDABLE_PLUS\n    request_sources: REQUEST_SOURCE_PAYGO\n  }\n}\n## -- autogenerated do not remove comments -- ##\n"
id: 0
'
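The `retry_strategy` blocks above bound each retry delay between `min_delay` and `max_delay`, cap attempts at `max_retries: 2`, and allow the full remaining deadline to be spent (`request_deadline_fraction: 1`). The backoff curve between the two bounds is not stated anywhere in this dump; the sketch below assumes simple doubling from `min_delay`, clamped at `max_delay`, purely as an illustration of the schedule these fields imply.

```python
# Illustrative sketch of the bounded retry schedule implied by retry_strategy.
# The doubling policy is an assumption; the dump only fixes min/max delay
# and max_retries, not the backoff curve between them.

def retry_delays(min_delay_s: float, max_delay_s: float, max_retries: int) -> list[float]:
    """Per-attempt delays: start at min_delay, double each attempt, clamp at max_delay."""
    delays = []
    delay = min_delay_s
    for _ in range(max_retries):
        delays.append(min(delay, max_delay_s))
        delay *= 2
    return delays

# DSQ Check Retries Config (Regional/Global): min 5 s, max 15 s, 2 retries.
print(retry_delays(5, 15, 2))  # [5, 10]
# GenerateMultiModal (Global): min 1 s, max 5 s, 2 retries.
print(retry_delays(1, 5, 2))   # [1, 2]
```

Under this assumed curve no schedule ever exceeds `max_delay`, matching the clamp the config expresses.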
	GERetryDiffLoc__enabled: 'name: "GERetryDiffLoc__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GeCacheRoutingAlg1__enabled: 'name: "GeCacheRoutingAlg1__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 843
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 643
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 771
}
id: 0
'
	PinnacleOneHourPromptCachingReporting__enabled: 'name: "PinnacleOneHourPromptCachingReporting__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 866
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 783
}
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1121
}
id: 0
'
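Each BOOL flag in this dump carries a `base_value` plus zero or more `OVERRIDE` modifiers keyed by `condition_index`. The dump does not state the evaluation semantics; the sketch below assumes modifiers are scanned in declaration order and the last `OVERRIDE` whose condition matched wins, falling back to `base_value` otherwise. That ordering rule is an assumption for illustration, not a documented behavior of the experiments framework.

```python
# Hypothetical flag resolution: in-order scan of OVERRIDE modifiers, last
# matching condition wins, base_value as the fallback. Evaluation order is
# an assumption; the dump does not specify it.

def resolve_flag(base_value: str, modifiers: list[dict], matched: set[int]) -> str:
    value = base_value
    for m in modifiers:
        if m["value_operator"] == "OVERRIDE" and m["condition_index"] in matched:
            value = m["base_value"]
    return value

# Modifiers as listed for PinnacleOneHourPromptCachingReporting__enabled.
mods = [
    {"value_operator": "OVERRIDE", "base_value": "FALSE", "condition_index": 866},
    {"value_operator": "OVERRIDE", "base_value": "TRUE",  "condition_index": 783},
    {"value_operator": "OVERRIDE", "base_value": "FALSE", "condition_index": 0},
    {"value_operator": "OVERRIDE", "base_value": "TRUE",  "condition_index": 1121},
]
print(resolve_flag("FALSE", mods, {783}))  # TRUE
print(resolve_flag("FALSE", mods, set()))  # FALSE
```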
	PinnacleStructuredOutput__enabled: 'name: "PinnacleStructuredOutput__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 867
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 786
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 1123
}
id: 0
'
	PinnacleStructuredOutput__enabled_models: 'name: "PinnacleStructuredOutput__enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  value_operator: OVERRIDE
  base_value: ""
  condition_group {
  }
  condition_index: 867
}
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"claude-sonnet-4-5@20250929\"\n"
  condition_group {
  }
  condition_index: 786
}
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"claude-sonnet-4-5@20250929\"\n"
  condition_group {
  }
  condition_index: 1123
}
id: 0
'
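`PinnacleStructuredOutput__enabled_models` is a PROTO param whose `base_value` is an escaped `experiments.proto.StringListParam` textproto. Without that proto definition at hand, the `element` values can still be pulled out of a dump like this one with a plain regex — a convenience sketch for reading the snapshot, not how the server itself parses the param.

```python
import re

# Pull `element: "..."` entries out of a StringListParam textproto payload.
# Convenience regex for inspecting this dump; not the real proto parser, and
# it does not handle escaped quotes inside element values.
def string_list_elements(textproto: str) -> list[str]:
    return re.findall(r'element:\s*"([^"]*)"', textproto)

base_value = 'element: "claude-sonnet-4-5@20250929"\n'
print(string_list_elements(base_value))  # ['claude-sonnet-4-5@20250929']
```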
	PinnacleModel__enable_precharge: 'name: "PinnacleModel__enable_precharge"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleModel__enable_qs_overall_capacity: 'name: "PinnacleModel__enable_qs_overall_capacity"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleModel__enable_session_id_routing: 'name: "PinnacleModel__enable_session_id_routing"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleModel__enabled: 'name: "PinnacleModel__enabled"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.PinnacleModels"
base_value: "publisher_models {\n  key: \"publishers/ai21/models/jamba-1.5-large\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/jamba-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/ai21/models/jamba-1.5-mini\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/jamba-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/anthropic-claude2\"\n  value {\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-2p0\"\n  value {\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-5-haiku\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/claude-3-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n    trust_and_safety_configs_map {\n      key: \"asia-east1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3-twn\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"asia-southeast1\"\n      value {\n        configs {\n          classifier_name: 
CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3-sin\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-north1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west4\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n    
      signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-central1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-east1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-east5\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-south1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n         
 thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-west4\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    enable_one_hour_prompt_caching: true\n    enable_sending_request_id_header: true\n    enable_request_id_check_error: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-5-sonnet\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/claude-3-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n    trust_and_safety_configs_map {\n      key: \"asia-southeast1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: 
\"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-central1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-east5\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    enable_sending_request_id_header: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-5-sonnet-v2\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/claude-3-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: 
\"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n    trust_and_safety_configs_map {\n      key: \"asia-southeast1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3-sin\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-north1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    
trust_and_safety_configs_map {\n      key: \"europe-west4\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-central1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-east5\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    enable_sending_request_id_header: true\n    enable_request_id_check_error: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-5-sonnet-v2-staging\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/vertex-anthropic-test-pt.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    
enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_precharge_cache_tokens: true\n    enable_content_logging: true\n    enable_sending_request_id_header: true\n    enable_request_id_check_error: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-7-sonnet\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/claude-3-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n    trust_and_safety_configs_map {\n      key: \"asia-east1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3-twn\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"asia-southeast1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3-sin\"\n        }\n      }\n    }\n    trust_and_safety_configs_map 
{\n      key: \"europe-west1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west4\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-central1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-east5\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n       
   }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    enable_sending_request_id_header: true\n    enable_request_id_check_error: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-haiku\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/claude-3-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n    trust_and_safety_configs_map {\n      key: \"asia-southeast1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3-sin\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west4\"\n      value {\n        
configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-central1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-east5\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    enable_one_hour_prompt_caching: true\n    enable_sending_request_id_header: true\n    enable_request_id_check_error: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-opus\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/claude-3-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    
enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n    trust_and_safety_configs_map {\n      key: \"asia-southeast1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3-sin\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"europe-west4\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-central1\"\n      value {\n        
configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    trust_and_safety_configs_map {\n      key: \"us-east5\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          signal_name: \"/vertex/safesearch-pixel-v3\"\n        }\n      }\n    }\n    enable_sending_request_id_header: true\n    enable_request_id_check_error: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/claude-3-sonnet\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/claude-3-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_sla_reporting: true\n    enable_content_logging: true\n    trust_and_safety_configs_map {\n      key: \"asia-east1\"\n      value {\n        configs {\n          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE\n          thresholds {\n            threshold_name: THRESHOLD_NAME_CSAI\n            threshold_score: 0.4\n          }\n          thresholds {\n            threshold_name: THRESHOLD_NAME_PEDO\n            threshold_score: 0.9\n          }\n          failed_opened_deadline {\n            seconds: 1\n          }\n          
          signal_name: "/vertex/safesearch-pixel-v3-twn"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "asia-southeast1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-sin"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-central1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east5"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-3-sonnet-staging"
  value {
    provisioned_throughput_listing {
      service_id: "services/vertex-anthropic-test-pt.cloudpartnerservices.goog"
      service_level: "base"
    }
    enable_precharge_tokens: true
    enable_error_count_report: true
    retry_transient_error: true
    enable_cache_token_report: true
    enable_precharge_cache_tokens: true
    enable_explicit_caching_monitoring: true
    enable_content_logging: true
    enable_web_search: true
    check_vpc_sc_restriction: true
    web_search_tool_types: "web_search_20250305"
    enable_one_hour_prompt_caching: true
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-haiku-4-5"
  value {
    provisioned_throughput_listing {
      service_id: "services/claude-3-provisioned-throughput.cloudpartnerservices.goog"
      service_level: "base"
    }
    enable_precharge_tokens: true
    enable_error_count_report: true
    retry_transient_error: true
    enable_cache_token_report: true
    enable_sla_reporting: true
    enable_precharge_cache_tokens: true
    enable_explicit_caching_monitoring: true
    enable_content_logging: true
    enable_web_search: true
    check_vpc_sc_restriction: true
    web_search_tool_types: "web_search_20250305"
    trust_and_safety_configs_map {
      key: "asia-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-twn"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "asia-southeast1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-sin"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-north1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-central1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east5"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-south1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    enable_one_hour_prompt_caching: true
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-instant-1p2"
  value {
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-opus-4"
  value {
    provisioned_throughput_listing {
      service_id: "services/claude-3-provisioned-throughput.cloudpartnerservices.goog"
      service_level: "base"
    }
    enable_precharge_tokens: true
    enable_error_count_report: true
    retry_transient_error: true
    enable_cache_token_report: true
    enable_sla_reporting: true
    enable_precharge_cache_tokens: true
    enable_explicit_caching_monitoring: true
    enable_content_logging: true
    enable_web_search: true
    check_vpc_sc_restriction: true
    web_search_tool_types: "web_search_20250305"
    trust_and_safety_configs_map {
      key: "asia-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-twn"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "asia-southeast1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-sin"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-central1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east5"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    enable_one_hour_prompt_caching: true
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-opus-4-1"
  value {
    provisioned_throughput_listing {
      service_id: "services/claude-3-provisioned-throughput.cloudpartnerservices.goog"
      service_level: "base"
    }
    enable_precharge_tokens: true
    enable_error_count_report: true
    retry_transient_error: true
    enable_cache_token_report: true
    enable_sla_reporting: true
    enable_precharge_cache_tokens: true
    enable_explicit_caching_monitoring: true
    enable_content_logging: true
    enable_web_search: true
    check_vpc_sc_restriction: true
    web_search_tool_types: "web_search_20250305"
    trust_and_safety_configs_map {
      key: "europe-west1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east5"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    enable_one_hour_prompt_caching: true
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-opus-4-5"
  value {
    provisioned_throughput_listing {
      service_id: "services/claude-3-provisioned-throughput.cloudpartnerservices.goog"
      service_level: "base"
    }
    enable_precharge_tokens: true
    enable_error_count_report: true
    retry_transient_error: true
    enable_cache_token_report: true
    enable_sla_reporting: true
    enable_precharge_cache_tokens: true
    enable_explicit_caching_monitoring: true
    enable_content_logging: true
    enable_web_search: true
    check_vpc_sc_restriction: true
    web_search_tool_types: "web_search_20250305"
    trust_and_safety_configs_map {
      key: "asia-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-twn"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "asia-southeast1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-sin"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-north1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-central1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east5"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-south1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    enable_one_hour_prompt_caching: true
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-sonnet-4"
  value {
    provisioned_throughput_listing {
      service_id: "services/claude-3-provisioned-throughput.cloudpartnerservices.goog"
      service_level: "base"
    }
    enable_precharge_tokens: true
    enable_error_count_report: true
    retry_transient_error: true
    enable_cache_token_report: true
    enable_sla_reporting: true
    enable_precharge_cache_tokens: true
    enable_explicit_caching_monitoring: true
    enable_content_logging: true
    enable_web_search: true
    check_vpc_sc_restriction: true
    web_search_tool_types: "web_search_20250305"
    trust_and_safety_configs_map {
      key: "asia-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-twn"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "asia-southeast1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-sin"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-north1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-central1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east5"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-south1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    enable_one_hour_prompt_caching: true
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/claude-sonnet-4-5"
  value {
    provisioned_throughput_listing {
      service_id: "services/claude-3-provisioned-throughput.cloudpartnerservices.goog"
      service_level: "base"
    }
    enable_precharge_tokens: true
    enable_error_count_report: true
    retry_transient_error: true
    enable_cache_token_report: true
    enable_sla_reporting: true
    enable_precharge_cache_tokens: true
    enable_explicit_caching_monitoring: true
    enable_content_logging: true
    enable_web_search: true
    check_vpc_sc_restriction: true
    web_search_tool_types: "web_search_20250305"
    trust_and_safety_configs_map {
      key: "asia-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-twn"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "asia-southeast1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3-sin"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-north1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "europe-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-central1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-east5"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-south1"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    trust_and_safety_configs_map {
      key: "us-west4"
      value {
        configs {
          classifier_name: CLASSIFIER_NAME_CSAM_IMAGE
          thresholds {
            threshold_name: THRESHOLD_NAME_CSAI
            threshold_score: 0.4
          }
          thresholds {
            threshold_name: THRESHOLD_NAME_PEDO
            threshold_score: 0.9
          }
          failed_opened_deadline {
            seconds: 1
          }
          signal_name: "/vertex/safesearch-pixel-v3"
        }
      }
    }
    enable_one_hour_prompt_caching: true
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/count-tokens"
  value {
    enable_sla_reporting: true
    enable_content_logging: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/maas-infra"
  value {
  }
}
publisher_models {
  key: "publishers/anthropic/models/marketplace-publisher-model-138"
  value {
  }
}
publisher_models {
  key: "publishers/anthropic/models/spillover-api"
  value {
  }
}
publisher_models {
  key: "publishers/anthropic/models/spillover-api-tpu-001"
  value {
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/spillover-api-tpu-002"
  value {
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/spillover-api-tpu-003"
  value {
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/spillover-api-tpu-004"
  value {
    enable_sending_request_id_header: true
    enable_request_id_check_error: true
  }
}
publisher_models {
  key: "publishers/anthropic/models/spillover-api-tpu-005"
  value {
enable_sending_request_id_header: true\n    enable_request_id_check_error: true\n  }\n}\npublisher_models {\n  key: \"publishers/anthropic/models/spillover-api-tpu-006\"\n  value {\n  }\n}\npublisher_models {\n  key: \"publishers/dummy-anthropic/models/test-model-2\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/test-model-2-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n  }\n}\npublisher_models {\n  key: \"publishers/internal-test-google/models/test-model\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/test-model-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_precharge_tokens: true\n    enable_error_count_report: true\n    retry_transient_error: true\n    enable_cache_token_report: true\n    enable_sla_reporting: true\n    enable_precharge_cache_tokens: true\n    enable_explicit_caching_monitoring: true\n    enable_content_logging: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/codestral-2\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/codestral-2501\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/ministral-3b-2410\"\n  value {\n    
provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/mistral-large-2411\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/mistral-medium-3\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/mistral-nemo\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/mistral-ocr-2505\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/mistral-small-2503\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\npublisher_models {\n  key: \"publishers/mistralai/models/mistral-staging\"\n  value {\n    provisioned_throughput_listing {\n      service_id: \"services/mistral-provisioned-throughput.cloudpartnerservices.goog\"\n      service_level: \"base\"\n    }\n    enable_sla_reporting: true\n  }\n}\n"
id: 0
'
	PinnacleQuotaConsumptionMonitoring__enabled: 'name: "PinnacleQuotaConsumptionMonitoring__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleSla__enabled: 'name: "PinnacleSla__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	AllowNonDrzRouting__enabled: 'name: "AllowNonDrzRouting__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	CardolanWaitForGrootResponse__enabled: 'name: "CardolanWaitForGrootResponse__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 6
}
id: 0
'
	GlobalDagRouting__nondrz_target_location: 'name: "GlobalDagRouting__nondrz_target_location"
type: STRING
base_value: "sin"
id: 0
'
	GlobalDagRouting__prefer_local_region_for_gcs_request: 'name: "GlobalDagRouting__prefer_local_region_for_gcs_request"
type: BOOL
base_value: "FALSE"
id: 0
'
	GlobalDagRouting__retries_config: 'name: "GlobalDagRouting__retries_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.CardolanRetriesConfig"
base_value: "config_id: \"ge_retries_config\"\nmodel_ids: \"gemini-2.0-flash-001\"\nmodel_ids: \"gemini-2.0-flash-lite-001\"\nmodel_ids: \"gemini-2.5-pro-preview\"\nmodel_ids: \"gemini-2.0-flash-preview\"\nmodel_ids: \"gemini-[2-5]\\\\..*\"\nrequest_types: \"dedicated-critical_plus\"\nrequest_types: \"shared-critical\"\nrequest_types: \"shared-sheddable_plus\"\nerror_codes: RESOURCE_EXHAUSTED\nerror_codes: UNAVAILABLE\nerror_codes: INTERNAL\nretry_strategy {\n  min_delay {\n    seconds: 1\n  }\n  max_delay {\n    seconds: 5\n  }\n  max_retries: 2\n  request_deadline_fraction: 1\n}\nretry_thresholds {\n  threshold_type: PER_MODEL_RETRY_RATE\n  threshold: 1\n  threshold_duration {\n    seconds: 10\n  }\n}\nretry_thresholds {\n  threshold_type: PER_MODEL_RETRY_RATE_LONG_CONTEXT\n  threshold: 1\n  threshold_duration {\n    seconds: 10\n  }\n}\nretry_threshold_fallback_behavior: RETRY_THRESHOLD_FALLBACK_BEHAVIOR_OPEN\n"
id: 0
'
	V1pCountTokensRetries__retries_config: 'name: "V1pCountTokensRetries__retries_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.CardolanRetriesConfig"
base_value: ""
id: 0
'
	V1pCountTokensRetries__enabled: 'name: "V1pCountTokensRetries__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	AnthropicMaasBatchPrediction__enabled: 'name: "AnthropicMaasBatchPrediction__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenaiBillingEnabledBaseModelFeature__billing_enabled_base_models: 'name: "GenaiBillingEnabledBaseModelFeature__billing_enabled_base_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"gemini-2.5-flash-preview\"\nelement: \"gemini-2.5-pro-preview\"\n"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"text-bison\"\nelement: \"text-bison-32k\"\nelement: \"text-unicorn\"\nelement: \"chat-bison\"\nelement: \"chat-bison-32k\"\nelement: \"code-bison\"\nelement: \"code-bison-32k\"\nelement: \"codechat-bison\"\nelement: \"codechat-bison-32k\"\nelement: \"code-gecko\"\nelement: \"textembedding-gecko\"\nelement: \"text-embedding-large-001\"\nelement: \"text-bison-batch\"\nelement: \"gemini-embedding\"\nelement: \"gemini-2.0-embedding\"\nelement: \"gemini-2.5-embedding\"\nelement: \"gemini-pro\"\nelement: \"gemini-ultra\"\nelement: \"gemini-pro-vision\"\nelement: \"gemini-ultra-vision\"\nelement: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-flash\"\nelement: \"medpalm2\"\nelement: \"MedLM-medium\"\nelement: \"MedLM-medium-latest\"\nelement: \"MedLM-large\"\nelement: \"MedLM-Large-1.5-preview\"\nelement: \"MedLM-Large-1.5\"\nelement: \"gemini-2.0-flash\"\nelement: \"gemini-2.0-flash-lite\"\nelement: \"gemini-2.0-flash-live\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-unified\"\nelement: \"gemini-2.5-flash-preview\"\nelement: \"gemini-2.5-pro-preview\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash-live\"\nelement: \"gemini-2.5-flash-native-audio-dialog\"\nelement: \"gemini-2.5-flash-ga\"\nelement: \"gemini-2.5-flash-lite-preview\"\nelement: \"gemini-2.5-flash-lite\"\nelement: \"gemini-2.5-pro-ga\"\nelement: \"gemini-2.5-flash-tts\"\nelement: \"gemini-2.5-pro-tts\"\nelement: \"gemini-2.5-flash-image\"\nelement: \"gemini-2.5-flash-image-ga\"\nelement: \"computer-use-preview\"\n"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"llama3-70b-chat\"\nelement: \"llama3-8b-chat\"\nelement: \"gemini-experimental\"\nelement: \"gemini-2.0-flash-exp\"\nelement: \"text-bison\"\nelement: \"text-bison-32k\"\nelement: \"text-unicorn\"\nelement: \"chat-bison\"\nelement: \"chat-bison-32k\"\nelement: \"code-bison\"\nelement: \"code-bison-32k\"\nelement: \"codechat-bison\"\nelement: \"codechat-bison-32k\"\nelement: \"code-gecko\"\nelement: \"textembedding-gecko\"\nelement: \"text-embedding-large-001\"\nelement: \"text-bison-batch\"\nelement: \"gemini-embedding\"\nelement: \"gemini-2.0-embedding\"\nelement: \"gemini-2.5-embedding\"\nelement: \"gemini-pro\"\nelement: \"gemini-ultra\"\nelement: \"gemini-pro-vision\"\nelement: \"gemini-ultra-vision\"\nelement: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-flash\"\nelement: \"medpalm2\"\nelement: \"MedLM-medium\"\nelement: \"MedLM-medium-latest\"\nelement: \"MedLM-large\"\nelement: \"MedLM-Large-1.5-preview\"\nelement: \"MedLM-Large-1.5\"\nelement: \"gemini-2.0-flash\"\nelement: \"gemini-2.0-flash-lite\"\nelement: \"gemini-2.0-flash-live\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-unified\"\nelement: \"gemini-2.5-flash-preview\"\nelement: \"gemini-2.5-pro-preview\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash-live\"\nelement: \"gemini-2.5-flash-native-audio-dialog\"\nelement: \"gemini-2.5-flash-ga\"\nelement: \"gemini-2.5-flash-lite-preview\"\nelement: \"gemini-2.5-flash-lite\"\nelement: \"gemini-2.5-pro-ga\"\nelement: \"gemini-2.5-flash-tts\"\nelement: \"gemini-2.5-pro-tts\"\nelement: \"gemini-2.5-flash-image\"\nelement: \"gemini-2.5-flash-image-ga\"\nelement: \"computer-use-preview\"\n"
  condition_group {
  }
}
id: 0
'
	EnableAisExpModel__ais_routing_project: 'name: "EnableAisExpModel__ais_routing_project"
type: STRING
base_value: ""
id: 0
'
	EnableAisExpModel__ais_total_request_size_limit: 'name: "EnableAisExpModel__ais_total_request_size_limit"
type: INT
base_value: "104857600"
id: 0
'
	EnableAisExpModel__enabled: 'name: "EnableAisExpModel__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableAisExpModel__enabled_models: 'name: "EnableAisExpModel__enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	EnableAisExpModel__gcs_video_file_size_byte_limit: 'name: "EnableAisExpModel__gcs_video_file_size_byte_limit"
type: INT
base_value: "52428800"
id: 0
'
	RouteLlmCallsToRapidStack__enabled: 'name: "RouteLlmCallsToRapidStack__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  name: "Layer1"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer2"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer3"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer4"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer5"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer6"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer7"
  value_operator: IDENTITY
  condition_group {
  }
}
id: 0
'
	RouteLlmCallsToRapidStack__exclude_publisher_models: 'name: "RouteLlmCallsToRapidStack__exclude_publisher_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  name: "Layer1"
  value_operator: IDENTITY
  condition_group {
  }
}
id: 0
'
	RouteLlmCallsToRapidStack__publisher_model_substrings: 'name: "RouteLlmCallsToRapidStack__publisher_model_substrings"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  name: "Layer1"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer2"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer3"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer7"
  value_operator: IDENTITY
  condition_group {
  }
}
id: 0
'
	RouteLlmCallsToRapidStack__publisher_models: 'name: "RouteLlmCallsToRapidStack__publisher_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  name: "Layer4"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer5"
  value_operator: IDENTITY
  condition_group {
  }
}
modifier {
  name: "Layer6"
  value_operator: IDENTITY
  condition_group {
  }
}
id: 0
'
	RouteRequestsToGroot__allowed_traffic_categories: 'name: "RouteRequestsToGroot__allowed_traffic_categories"
type: PROTO
sub_type: "experiments.proto.Int32ListParam"
base_value: ""
id: 0
'
	RouteRequestsToGroot__enabled: 'name: "RouteRequestsToGroot__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	RouteRequestsToGroot__exclude_publisher_models: 'name: "RouteRequestsToGroot__exclude_publisher_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	RouteRequestsToGroot__override_to_sheddable_criticality_for_groot: 'name: "RouteRequestsToGroot__override_to_sheddable_criticality_for_groot"
type: BOOL
base_value: "FALSE"
id: 0
'
	RouteRequestsToGroot__publisher_model_substrings: 'name: "RouteRequestsToGroot__publisher_model_substrings"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	RouteRequestsToGroot__publisher_models: 'name: "RouteRequestsToGroot__publisher_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	VeoRegionalMigration__enabled: 'name: "VeoRegionalMigration__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	AnthropicBatchPredictionQuotaServer__models_using_quota_server: 'name: "AnthropicBatchPredictionQuotaServer__models_using_quota_server"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	AnthropicMaasBatchPrediction__check_per_project_per_base_model_output_quota: 'name: "AnthropicMaasBatchPrediction__check_per_project_per_base_model_output_quota"
type: BOOL
base_value: "FALSE"
id: 0
'
	AnthropicMaasBatchPrediction__check_per_project_per_base_model_quota: 'name: "AnthropicMaasBatchPrediction__check_per_project_per_base_model_quota"
type: BOOL
base_value: "FALSE"
id: 0
'
	AnthropicMaasBatchPrediction__check_per_user_per_base_model_quota: 'name: "AnthropicMaasBatchPrediction__check_per_user_per_base_model_quota"
type: BOOL
base_value: "FALSE"
id: 0
'
	AnthropicMaasBatchPrediction__ignore_batch_traffic_from_bouncer_or_qs: 'name: "AnthropicMaasBatchPrediction__ignore_batch_traffic_from_bouncer_or_qs"
type: BOOL
base_value: "TRUE"
id: 0
'
	ArthedainFeature__enabled: 'name: "ArthedainFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	AuxModelsDynamicPool__enabled: 'name: "AuxModelsDynamicPool__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	BatchPredictionUseEucForMediaRead__enabled: 'name: "BatchPredictionUseEucForMediaRead__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	BillingModuleLargeModel__enabled: 'name: "BillingModuleLargeModel__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 3
}
id: 0
'
	BlockFreeTierAccessGeminiExperimental__enabled: 'name: "BlockFreeTierAccessGeminiExperimental__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	CardolanGRootBrightLaunch__enabled: 'name: "CardolanGRootBrightLaunch__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 5
}
id: 0
'
	CardolanMethodRestriction__enabled: 'name: "CardolanMethodRestriction__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	CardolanRestriction__allowed_peer_users: 'name: "CardolanRestriction__allowed_peer_users"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	CardolanRestriction__enabled: 'name: "CardolanRestriction__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	ChatCompletionsRestriction__enabled: 'name: "ChatCompletionsRestriction__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	DeadlineOverrideExtension__deadline_reduction_ms: 'name: "DeadlineOverrideExtension__deadline_reduction_ms"
type: INT
base_value: "0"
id: 0
'
	DeadlineOverrideExtension__enabled: 'name: "DeadlineOverrideExtension__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	DeferGwsLogTillAfterFinish__enabled: 'name: "DeferGwsLogTillAfterFinish__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 7
}
id: 0
'
	DsqCapQuotaOnlyUseGlobal__enabled: 'name: "DsqCapQuotaOnlyUseGlobal__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	EnableAsyncGenerateContentGrootGlobalEndpoint__enabled: 'name: "EnableAsyncGenerateContentGrootGlobalEndpoint__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableBillingConfigInModelConfigResolverModule__enabled: 'name: "EnableBillingConfigInModelConfigResolverModule__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 795
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 641
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 765
}
id: 0
'
	EnableCplusInLiveApi__allowlist: 'name: "EnableCplusInLiveApi__allowlist"
type: PROTO
sub_type: "experiments.proto.Int64ListParam"
base_value: "element: 295331639341\nelement: 53908422466\n"
id: 0
'
	EnableCplusInLiveApi__enabled: 'name: "EnableCplusInLiveApi__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableExplicitCache__enabled: 'name: "EnableExplicitCache__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableExplicitCache__explicit_cache_enabled_models: 'name: "EnableExplicitCache__explicit_cache_enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-flash\"\nelement: \"gemini-2.0-flash-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-2.0-flash-preview\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash\"\n"
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-flash\"\nelement: \"gemini-2.0-flash-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-2.0-flash-preview\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-eval-gemini-1-5-flash-001\"\n"
  condition_group {
  }
  condition_index: 0
}
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-flash\"\nelement: \"gemini-2.0-flash-001\"\nelement: \"gemini-2.0-flash-lite-001\"\nelement: \"gemini-2.0-flash-preview\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash\"\n"
  condition_group {
  }
}
id: 0
'
	EnableExplicitCache__explicit_cache_project_allowlist_by_model: 'name: "EnableExplicitCache__explicit_cache_project_allowlist_by_model"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.ExplicitCachingProjectAllowlist"
base_value: "NO_PRINTER_FOR_PROTO"
modifier {
  value_operator: OVERRIDE
  base_value: "NO_PRINTER_FOR_PROTO"
  condition_group {
  }
}
id: 0
'
	EnableGlobalImplicitCache__enabled: 'name: "EnableGlobalImplicitCache__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableGlobalProjectConfigRead__enabled: 'name: "EnableGlobalProjectConfigRead__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableLlama3170bServoModel__enabled: 'name: "EnableLlama3170bServoModel__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableLlama318bServoModel__enabled: 'name: "EnableLlama318bServoModel__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableLoadBalancingDynamicConfig__enabled: 'name: "EnableLoadBalancingDynamicConfig__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableModelRoutingService__enabled: 'name: "EnableModelRoutingService__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnablePaygoDynamicPoolFromUta__enabled_models: 'name: "EnablePaygoDynamicPoolFromUta__enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	EnableRagGenerationPromptWithLlmPriorKnowledge__enabled: 'name: "EnableRagGenerationPromptWithLlmPriorKnowledge__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableRegionalQuotaServer__enabled: 'name: "EnableRegionalQuotaServer__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableTranslationLlmBilling__enabled: 'name: "EnableTranslationLlmBilling__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 809
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 642
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 769
}
id: 0
'
	EnableUniserveMirror__enabled: 'name: "EnableUniserveMirror__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	EnableUniserveMirror__uniserve_mirror_config: 'name: "EnableUniserveMirror__uniserve_mirror_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.TrafficMirrorToUniserveConfig"
base_value: "NO_PRINTER_FOR_PROTO"
id: 0
'
	EnableVertexAiPredictionLlmPerMinuteBaseModelQuotaCheckFeature__enabled: 'name: "EnableVertexAiPredictionLlmPerMinuteBaseModelQuotaCheckFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	ExplicitCacheV1__enabled: 'name: "ExplicitCacheV1__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	ExtendCardolanRequestDeadline__enabled: 'name: "ExtendCardolanRequestDeadline__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 28
}
id: 0
'
	ForceSkipGslbRouting__enabled: 'name: "ForceSkipGslbRouting__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GERoutesToDrzs__enabled: 'name: "GERoutesToDrzs__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GRootBranchViaBGRM__enabled: 'name: "GRootBranchViaBGRM__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 33
}
id: 0
'
	Gemini25ProPreviewDynamicPool__enabled: 'name: "Gemini25ProPreviewDynamicPool__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiEnablePaygoDynamicPool__enabled_models: 'name: "GeminiEnablePaygoDynamicPool__enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	GeminiMmDataHandlingInLvm__enabled_models: 'name: "GeminiMmDataHandlingInLvm__enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	GeminiPaygoDynamicPoolEnableMultimodal__enabled: 'name: "GeminiPaygoDynamicPoolEnableMultimodal__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiPaygoDynamicPoolEnableStreaming__enabled: 'name: "GeminiPaygoDynamicPoolEnableStreaming__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GeminiPaygoDynamicPoolEnabledTraffic__enabled_traffic_types: 'name: "GeminiPaygoDynamicPoolEnabledTraffic__enabled_traffic_types"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	GeminiTieredBillingFeature__enabled: 'name: "GeminiTieredBillingFeature__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GenAiBatchImplicitCaching__disable_metadata_model_list: 'name: "GenAiBatchImplicitCaching__disable_metadata_model_list"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  value_operator: OVERRIDE
  base_value: ""
  condition_group {
  }
  condition_index: 380
}
id: 0
'
	GenAiBatchImplicitCaching__enabled: 'name: "GenAiBatchImplicitCaching__enabled"
type: BOOL
base_value: "TRUE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 380
}
id: 0
'
	GenaiBillingV2EnabledBaseModels__billing_v2_enabled_base_models: 'name: "GenaiBillingV2EnabledBaseModels__billing_v2_enabled_base_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
id: 0
'
	GenaiFairAllocationQuota__enable_bouncer: 'name: "GenaiFairAllocationQuota__enable_bouncer"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenaiFairAllocationQuota__enable_quota_server: 'name: "GenaiFairAllocationQuota__enable_quota_server"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenaiFairAllocationQuota__enable_throttling: 'name: "GenaiFairAllocationQuota__enable_throttling"
type: BOOL
base_value: "FALSE"
id: 0
'
	GenerateContentInputTokensQuota__enabled: 'name: "GenerateContentInputTokensQuota__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalDagRouting__enable_context_length_for_routing_decision: 'name: "GlobalDagRouting__enable_context_length_for_routing_decision"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalDagRouting__enable_uta_routing: 'name: "GlobalDagRouting__enable_uta_routing"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalDagRouting__enable_uta_routing_decision_context_logging: 'name: "GlobalDagRouting__enable_uta_routing_decision_context_logging"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalDagRouting__enabled: 'name: "GlobalDagRouting__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalEndpointDsq__allow_regional_pt_report_global_dsq_quota: 'name: "GlobalEndpointDsq__allow_regional_pt_report_global_dsq_quota"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalEndpointDsq__enabled: 'name: "GlobalEndpointDsq__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalEndpointSuperquotaMetric__allow_superquota_check_for_global_endpoint: 'name: "GlobalEndpointSuperquotaMetric__allow_superquota_check_for_global_endpoint"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalEndpointSuperquotaMetric__disable_superquota_check_for_pt: 'name: "GlobalEndpointSuperquotaMetric__disable_superquota_check_for_pt"
type: BOOL
base_value: "FALSE"
id: 0
'
	GlobalEndpointSuperquotaMetric__enable_global_endpoint_quota_check: 'name: "GlobalEndpointSuperquotaMetric__enable_global_endpoint_quota_check"
type: BOOL
base_value: "TRUE"
id: 0
'
	GlobalRoutingSessionCache__enabled: 'name: "GlobalRoutingSessionCache__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	GroundingCitationFeature__enabled: 'name: "GroundingCitationFeature__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	GroundingCitationFeature__prune_web_title: 'name: "GroundingCitationFeature__prune_web_title"
type: BOOL
base_value: "FALSE"
id: 0
'
	GroundingCitationFeature__use_web_search_query_for_cited_uri: 'name: "GroundingCitationFeature__use_web_search_query_for_cited_uri"
type: BOOL
base_value: "FALSE"
id: 0
'
	ImplicitCacheFeature__enabled: 'name: "ImplicitCacheFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	ImplicitCacheFeature__implicit_cache_enabled_models: 'name: "ImplicitCacheFeature__implicit_cache_enabled_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-flash\"\nelement: \"gemini-2.0-flash\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-2.0-flash-lite\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash\"\n"
modifier {
  value_operator: OVERRIDE
  base_value: "element: \"gemini-1.5-pro\"\nelement: \"gemini-1.5-flash\"\nelement: \"gemini-2.0-flash\"\nelement: \"gemini-2.0-flash-preview-image-generation\"\nelement: \"gemini-2.0-flash-lite\"\nelement: \"gemini-2.5-pro\"\nelement: \"gemini-2.5-flash\"\nelement: \"gemini-experimental\"\nelement: \"gemini-eval-gemini-1-5-flash-001\"\n"
  condition_group {
  }
  condition_index: 0
}
id: 0
'
	InCountryProcessingEnabledForEmbedding__enabled: 'name: "InCountryProcessingEnabledForEmbedding__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	LiveApiSessionPerChargeToken__enabled: 'name: "LiveApiSessionPerChargeToken__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 42
}
id: 0
'
	Llama3MaasBilling__enabled: 'name: "Llama3MaasBilling__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	MaasLoraConsistentHashingRouting__enabled: 'name: "MaasLoraConsistentHashingRouting__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	MaasLoraSignedUrl__enabled: 'name: "MaasLoraSignedUrl__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	MaasLoraSignedUrl__url_ttl: 'name: "MaasLoraSignedUrl__url_ttl"
type: INT
base_value: "10"
id: 0
'
	ModelGardenProdDogfood__enabled: 'name: "ModelGardenProdDogfood__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	MultiregionLlmRouting__enabled: 'name: "MultiregionLlmRouting__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	PinServiceConfigId__enabled: 'name: "PinServiceConfigId__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleContentLogging__enabled: 'name: "PinnacleContentLogging__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PinnacleModelBqmlLowPriority__enabled: 'name: "PinnacleModelBqmlLowPriority__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	PinnacleRejectRequestsRestrictedByVpcSc__enabled: 'name: "PinnacleRejectRequestsRestrictedByVpcSc__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 66
}
id: 0
'
	PinnacleWebSearchDenylist__denylisted_projects: 'name: "PinnacleWebSearchDenylist__denylisted_projects"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: ""
modifier {
  value_operator: OVERRIDE
  base_value: ""
  condition_group {
  }
  condition_index: 388
}
id: 0
'
	PinnacleWebSearch__enabled: 'name: "PinnacleWebSearch__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PriorityPaygo__enabled: 'name: "PriorityPaygo__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "FALSE"
  condition_group {
  }
  condition_index: 819
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 649
}
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 788
}
id: 0
'
	PtConversion__enabled: 'name: "PtConversion__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PtFutureStartAllTerms__enabled: 'name: "PtFutureStartAllTerms__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PtGlobalFallbackQuotaCheck__enabled: 'name: "PtGlobalFallbackQuotaCheck__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PtSpilloverMetricLabel__enabled: 'name: "PtSpilloverMetricLabel__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	PublisherModelAllowlistMigration__enabled: 'name: "PublisherModelAllowlistMigration__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	QsAlwaysDedcuctForResponse__enabled: 'name: "QsAlwaysDedcuctForResponse__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	RewriteServoDeadlineExceededNonPt__enabled: 'name: "RewriteServoDeadlineExceededNonPt__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	RewriteUnavailableLlmResponse__enabled: 'name: "RewriteUnavailableLlmResponse__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	RhudaurForceGenaiMethodRestriction__enabled: 'name: "RhudaurForceGenaiMethodRestriction__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	SampleRequestToRouter__enabled: 'name: "SampleRequestToRouter__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	SampleRequestToRouter__router_models: 'name: "SampleRequestToRouter__router_models"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"smart-router-001\"\n"
id: 0
'
	SampleRequestToRouter__sample_model_ids: 'name: "SampleRequestToRouter__sample_model_ids"
type: PROTO
sub_type: "experiments.proto.StringListParam"
base_value: "element: \"gemini-1.5-flash-001\"\nelement: \"gemini-1.5-pro-001\"\n"
id: 0
'
	ShouldUseNewUtaLocationSelection__enabled: 'name: "ShouldUseNewUtaLocationSelection__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 82
}
id: 0
'
	ShouldUseUtaCacheBasedRouting__enabled: 'name: "ShouldUseUtaCacheBasedRouting__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 83
}
id: 0
'
	ThreadDetachMigration__enabled: 'name: "ThreadDetachMigration__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	TrafficDirector__enabled: 'name: "TrafficDirector__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	TunedModelUseBaseModelQCD__enabled: 'name: "TunedModelUseBaseModelQCD__enabled"
type: BOOL
base_value: "FALSE"
modifier {
  value_operator: OVERRIDE
  base_value: "TRUE"
  condition_group {
  }
  condition_index: 85
}
id: 0
'
	TurnOnPublisherModelServingSpecKey__enabled: 'name: "TurnOnPublisherModelServingSpecKey__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	UtaRoutingWithUniserveEndpointMatch__enabled: 'name: "UtaRoutingWithUniserveEndpointMatch__enabled"
type: BOOL
base_value: "TRUE"
id: 0
'
	V1pChatCompletionsTopLevelRetries__enabled: 'name: "V1pChatCompletionsTopLevelRetries__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	V1pChatCompletionsTopLevelRetries__retries_config: 'name: "V1pChatCompletionsTopLevelRetries__retries_config"
type: PROTO_BINARY_BASE64
sub_type: "cloud_ai_platform_dataplane_prediction_proto.CardolanRetriesConfig"
base_value: ""
id: 0
'
	VertexLlmManagedServingSpec__enabled: 'name: "VertexLlmManagedServingSpec__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'
	YzhTestFeature__enabled: 'name: "YzhTestFeature__enabled"
type: BOOL
base_value: "FALSE"
id: 0
'