

Introducing Vertex AI for Firebase

Build AI features with the Vertex AI Gemini API using Firebase's new client SDKs

Vertex AI is Google’s development platform for building scalable generative AI applications, ensuring developers and enterprises benefit from Google’s highest quality standards for privacy, scalability, reliability, and performance.

Until now, using Vertex AI required you to set up a backend in a language like Python, Java, Node.js, or Go, and then expose your AI features to your client apps through a service layer. Not every application requires or benefits from adding a service layer; many use cases can be implemented with a serverless paradigm.

Today, we’re excited to announce Vertex AI for Firebase, including a suite of client SDKs for your favorite languages (Swift, Kotlin, Java, Dart, and JavaScript), bringing serverless to AI.

The Vertex AI for Firebase client SDKs enable you to harness the capabilities of the Gemini family of models directly from your mobile and web apps. You can now easily and securely run inference across all Gemini models and versions hosted by Vertex AI, including Gemini 1.5 Pro and 1.5 Flash.

Here’s how Vertex AI for Firebase makes developing AI features easier, better, and more secure for mobile and web developers:

  1. Effortless onboarding
  2. Access to the Vertex AI Gemini API from your client
  3. Use Firebase App Check to protect the Gemini API from unauthorized clients
  4. Streamline file uploads for multimodal AI prompts via Cloud Storage for Firebase
  5. Seamless model and prompt updates with Firebase Remote Config

Effortless onboarding

Onboarding to use the Vertex AI Gemini API through Firebase is seamless and feels just like any other Firebase product. The Firebase console’s new “Build with the Gemini API” page streamlines getting started with the Vertex AI Gemini API. A guided assistant simplifies enrolling in the pay-as-you-go plan, activating the necessary Vertex AI services, and generating a Firebase configuration file for your app. With those steps done, you’re just a few lines of code away from using Gemini models in your app.
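For an Android app using the Kotlin SDK, those few lines might look roughly like the sketch below; the prompt, the function name, and the assumption that the firebase-vertexai Gradle dependency has already been added are purely illustrative:

import com.google.firebase.Firebase
import com.google.firebase.vertexai.vertexAI

// A minimal sketch of calling a Gemini model once onboarding is done.
// The prompt and function name are illustrative.
suspend fun generateWelcomeMessage(): String? {
    // Create a GenerativeModel backed by the Vertex AI Gemini API
    val model = Firebase.vertexAI.generativeModel("gemini-1.5-pro-preview-0409")

    // Run a simple text-only prompt and return the generated text
    val response = model.generateContent("Write a one-line welcome message for my app.")
    return response.text
}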

Access to the Vertex AI Gemini API from your client

Firebase provides a gateway to the Vertex AI Gemini API, allowing you to use the Gemini models directly from your app via the Kotlin/Java, Swift, Dart, and JavaScript SDKs.

These Firebase client SDKs give you granular control over the model’s behavior through parameters and safety settings that dictate response generation and prompting. And whether you’re crafting a single interaction or a multi-turn conversation (like chat), you can guide the model with system instructions even before processing the user prompt. You can also generate text responses from diverse multimodal prompts, incorporating text, images, PDFs, videos, and audio.

Using system instructions
// Initialize Vertex AI for Firebase
let vertex = VertexAI.vertexAI()

// Initialize the generative model
let model = vertex.generativeModel(
  modelName: "gemini-1.5-pro-preview-0409",
  systemInstruction: "You write inspirational, original, and impactful " +
    "quotes for posters that motivate people and offer insightful " +
    "perspectives on life."
)

// Provide an image
guard let image = UIImage(...) else { fatalError() }

// Generate text from a multimodal prompt (text and image)
let response = try await model.generateContent(image,
  "What quote should I put on this poster?")

if let text = response.text {
  print(text)
}
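The generation parameters and safety settings mentioned above are supplied when the model is created. Here is a rough Kotlin sketch of the generation-config side; the values are illustrative, not recommendations:

import com.google.firebase.Firebase
import com.google.firebase.vertexai.type.generationConfig
import com.google.firebase.vertexai.vertexAI

// Kotlin sketch: parameter values are illustrative only
val config = generationConfig {
    temperature = 0.7f
    topK = 32
    maxOutputTokens = 512
}

val configuredModel = Firebase.vertexAI.generativeModel(
    modelName = "gemini-1.5-pro-preview-0409",
    generationConfig = config
    // safety settings can be passed alongside the generation config as well
)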

For scenarios demanding immediate feedback, you can stream responses in real-time.

Streaming
val generativeModel = Firebase.vertexAI
    .generativeModel("gemini-1.5-pro-preview-0409")

// Provide a text prompt
val prompt = "Write a story about a magic backpack."

// To stream generated text output,
// call generateContentStream and pass in the prompt
var response = ""
generativeModel.generateContentStream(prompt)
    .collect { chunk ->
        print(chunk.text)
        response += chunk.text
    }

To enable further processing, the Gemini API can return responses as JSON, allowing the model to produce structured data that can be easily parsed into your objects. Additionally, function calling connects Gemini models to external systems, ensuring the generated content incorporates the most up-to-date information. You provide the model with function descriptions, and during interactions it may request the execution of a function to enhance its understanding of your query, leading to more comprehensive and informed answers.
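As a rough illustration, one simple way to work with structured output from a Kotlin client is to request JSON in the prompt and parse the response text; the prompt wording, helper function, and JSON schema below are made up for the example:

import com.google.firebase.vertexai.GenerativeModel
import org.json.JSONObject

// Kotlin sketch: ask for JSON in the prompt and parse the response text.
// Production code should handle malformed or non-JSON output.
suspend fun suggestPosterQuote(model: GenerativeModel): Pair<String, String>? {
    val prompt = """
        Suggest a quote for a motivational poster.
        Respond only with JSON of the form {"quote": "...", "author": "..."}.
    """.trimIndent()

    val text = model.generateContent(prompt).text ?: return null
    val json = JSONObject(text)
    return json.getString("quote") to json.getString("author")
}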

Use Firebase App Check to protect the Gemini API from unauthorized clients

When apps run on user devices, protecting them from abuse becomes challenging. Credentials such as API keys or tokens need to be stored on the client side, and client devices are not a secure place to keep them. Malicious actors can extract the key, leading to unexpected costs, data breaches, or quota overages, negatively impacting the user experience.

To solve this problem, the Vertex AI for Firebase SDKs are integrated with Firebase App Check. App Check verifies the authenticity of each call to the Vertex AI Gemini API, ensuring that only legitimate requests from genuine apps and devices are processed. This proactive defense mechanism prevents unauthorized access and safeguards against potential abuse, so you can confidently deploy your mobile and web apps knowing that your Gemini API access is protected.
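On Android, for example, enabling this protection is largely a matter of installing an App Check provider early in app startup, before the first Gemini call; here is a minimal Kotlin sketch using the Play Integrity provider (the function name is illustrative):

import com.google.firebase.appcheck.FirebaseAppCheck
import com.google.firebase.appcheck.playintegrity.PlayIntegrityAppCheckProviderFactory

// Kotlin sketch: install an App Check provider so that requests made through
// the Vertex AI for Firebase SDK carry an App Check token
fun enableAppCheck() {
    FirebaseAppCheck.getInstance().installAppCheckProviderFactory(
        PlayIntegrityAppCheckProviderFactory.getInstance()
    )
}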

Streamline file uploads via Cloud Storage for Firebase

Cloud Storage for Firebase offers an efficient and flexible way to upload files for use in your multimodal prompts, even under less reliable network conditions. Your app can upload user files (like images, videos, and PDFs) directly to a Cloud Storage bucket, and you can then easily reference these files in your multimodal prompts. Additionally, Firebase Security Rules provide granular control over file access, ensuring only authorized users can interact with the uploaded content.

Using images
const firebaseConfig = {
  // Get your Firebase configuration from the Firebase console
};

const firebaseApp = initializeApp(firebaseConfig);

// Initialize the Vertex AI service
const vertexAI = getVertexAI(firebaseApp);

// Initialize the generative model
const model = getGenerativeModel(vertexAI, {
  model: "gemini-1.5-pro-preview-0409"
});

const prompt = "What's in this picture?";

// Explicitly include the MIME type and Cloud Storage URL
const imagePart = { fileData: {
  mimeType: 'image/jpeg',
  fileUri: "gs://bucket-name/path/image.jpg"
}};

const result = await model.generateContent([prompt, imagePart]);

console.log(result.response.text());
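For the upload half of that flow, a rough Kotlin sketch might look like the following; the storage path, bucket layout, and function name are illustrative:

import android.net.Uri
import com.google.firebase.storage.FirebaseStorage
import kotlinx.coroutines.tasks.await

// Kotlin sketch: upload a user-selected file to Cloud Storage for Firebase and
// build the gs:// URI to reference from a multimodal prompt
suspend fun uploadForPrompt(localUri: Uri): String {
    val storagePath = "user-uploads/photo.jpg"   // illustrative path
    val ref = FirebaseStorage.getInstance().reference.child(storagePath)

    // Upload the local file; await() comes from kotlinx-coroutines-play-services
    ref.putFile(localUri).await()

    // gs:// URI that can be used as the fileUri of a file data part
    return "gs://${ref.bucket}/$storagePath"
}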

Seamless model and prompt updates with Firebase Remote Config

Fine-tuning the right prompt for your specific use cases takes time and effort, often involving trial and error. Unexpected scenarios can arise, leading to unwanted user experiences, and you may need to update your prompts to keep things running smoothly. Additionally, with the rapid pace of innovation in AI models and features, new model versions are released multiple times per year. You want the flexibility to update the prompts and model versions in your apps without forcing users to download a full update.

Firebase Remote Config is the perfect tool for this situation. It’s a cloud service that lets you adjust your app’s behavior on the fly, without requiring users to download an app update. With Remote Config, you set up default values within your app to manage your model and prompts. Then, you can use the Firebase console to change these defaults for all users, or target specific groups to experiment and conduct A/B testing.

Integrating with Remote Config
// Initialize Firebase
await Firebase.initializeApp();

// Get Remote Config instance
final remoteConfig = FirebaseRemoteConfig.instance;

// Get prompt text and Gemini model from Remote Config
final prompt = remoteConfig.getString('promptText');
final geminiModel = remoteConfig.getString('geminiModel');

// Initialize the generative model
final model = FirebaseVertexAI.instance
    .generativeModel(model: geminiModel);

// Reference an image in Cloud Storage (MIME type and gs:// URI)
final imagePart = FileData('image/jpeg', 'gs://bucket-name/path/image.jpg');

// Generate text from the multimodal prompt (text and image)
final result = await model.generateContent([
  Content.multi([TextPart(prompt), imagePart]),
]);

print(result.text);
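The same pattern applies on Android; below is a rough Kotlin sketch that sets in-app defaults, fetches updated values, and creates the model from the remotely managed name (the parameter keys mirror the Dart example above, and the default values are illustrative):

import com.google.firebase.Firebase
import com.google.firebase.remoteconfig.FirebaseRemoteConfig
import com.google.firebase.vertexai.GenerativeModel
import com.google.firebase.vertexai.vertexAI
import kotlinx.coroutines.tasks.await

// Kotlin sketch: in-app defaults plus a fetch, keyed the same way as the
// Dart example above
suspend fun modelFromRemoteConfig(): GenerativeModel {
    val remoteConfig = FirebaseRemoteConfig.getInstance()

    // Defaults used until a fetched config is activated
    remoteConfig.setDefaultsAsync(
        mapOf(
            "promptText" to "Write a story about a magic backpack.",
            "geminiModel" to "gemini-1.5-pro-preview-0409"
        )
    ).await()

    // Pull the latest values published from the Firebase console
    remoteConfig.fetchAndActivate().await()

    // Create the model from the remotely managed model name
    return Firebase.vertexAI.generativeModel(remoteConfig.getString("geminiModel"))
}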

Get started now

Here’s where to get started:

  • Begin building your vision now with our public preview
  • Be ready to release your app to production this Fall, when Vertex AI for Firebase is planned to reach general availability.
  • Check out our quick starts and read our documentation

Your feedback is invaluable. Please report bugs, request features, or contribute code directly to our Firebase SDKs’ repositories. We also encourage you to participate in Firebase’s UserVoice to share your ideas and vote on existing ones.

We can’t wait to see what you build with Vertex AI for Firebase!

- "漢字路" 한글한자자동변환 서비스는 교육부 고전문헌국역지원사업의 지원으로 구축되었습니다.
- "漢字路" 한글한자자동변환 서비스는 전통문화연구회 "울산대학교한국어처리연구실 옥철영(IT융합전공)교수팀"에서 개발한 한글한자자동변환기를 바탕하여 지속적으로 공동 연구 개발하고 있는 서비스입니다.
- 현재 고유명사(인명, 지명등)을 비롯한 여러 변환오류가 있으며 이를 해결하고자 많은 연구 개발을 진행하고자 하고 있습니다. 이를 인지하시고 다른 곳에서 인용시 한자 변환 결과를 한번 더 검토하시고 사용해 주시기 바랍니다.
- 변환오류 및 건의,문의사항은 juntong@juntong.or.kr로 메일로 보내주시면 감사하겠습니다. .
Copyright ⓒ 2020 By '전통문화연구회(傳統文化硏究會)' All Rights reserved.
 한국   대만   중국   일본