How Google Cloud is helping Culture Amp and others be AI-assisted

It is providing the infrastructure and solutions that make it easier for organisations to build and use generative AI at scale.

Being heard and feeling valued are among the top reasons why employees stay. As such, most companies conduct workplace surveys to measure their staff’s engagement and enthusiasm at work. However, extracting insights from those surveys and acting on them in a timely manner can be challenging.

This is why Culture Amp has enhanced its employee experience platform with generative AI capabilities, which will be available later this year. Powered by Google Cloud’s machine learning platform Vertex AI, Culture Amp’s platform will be able to summarise up to tens of thousands of employee survey comments into topics and actionable insights in near real time. This process typically takes HR administrators at medium-to-large organisations up to hundreds of hours to complete.
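Culture Amp has not published the details of its integration, but a minimal sketch of the underlying idea, using the Vertex AI Python SDK’s PaLM 2 text model to condense survey comments into topics and actions, might look like the following (the project ID, prompt wording, and sample comments are illustrative assumptions):

```python
# Minimal sketch: summarising survey comments with the Vertex AI SDK.
# The project ID, prompt wording, and sample comments are illustrative only;
# Culture Amp's actual pipeline is not public.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # hypothetical project

comments = [
    "More flexible working hours would help with childcare.",
    "I rarely hear back after performance reviews.",
    "The new onboarding buddy system has been great.",
]

prompt = (
    "Summarise the following anonymous employee survey comments into "
    "key topics and suggested actions:\n\n" + "\n".join(f"- {c}" for c in comments)
)

model = TextGenerationModel.from_pretrained("text-bison")  # PaLM 2 for Text
response = model.predict(prompt, temperature=0.2, max_output_tokens=512)
print(response.text)
```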


HR administrators can also retain full oversight of the employee feedback that the summarised insights produced by Vertex AI trace back to. This allows them to correct the results for potential bias and organisational context while maintaining the anonymity of employees’ responses.

“Capabilities like summation of comments allow our customers to respond to employee feedback at scale in as close to real time as possible. This bolsters employee confidence in their employers to act on workplace issues in a timely and decisive fashion,” says Doug English, Culture Amp’s co-founder and chief technology officer.

He continues: “The training and modelling capabilities that Vertex AI provides help us ensure such solutions lead to a much more intuitive and responsive employee experience, without negating HR’s expertise in negotiating complex workforce dynamics or undermining people’s trust in their employers.”

Google Cloud’s commitment to responsible AI is another reason why Culture Amp selected Vertex AI as the foundation for the generative AI features on its employee experience platform. Besides giving users control over the desired insights, Vertex AI ensures customer data remains with Culture Amp, which helps protect data privacy, English shares with DigitalEdge at the Google Cloud Next 2023 conference in San Francisco last week. He adds that Vertex AI’s open architecture will also support Culture Amp’s efforts to involve customers in co-creating future generative AI functions for its platform.

A platform to develop generative AI apps

At the recent Google Cloud Next 2023 event, CEO Thomas Kurian shared the company's approach to helping organisations build generative AI applications and adopt AI assistants more easily and quickly.

Vertex AI is one way Google Cloud is enabling organisations to build generative AI apps. “Our Vertex AI platform offers access to more than 100 foundation models [including third-party and popular open source models like Meta’s Llama 2 and Falcon LLM] for customers to choose from. It also provides services like digital watermarking and grounding,” says Kurian.

Powered by Google DeepMind’s SynthID, digital watermarking embeds the watermark directly into the pixels of the image, making it invisible to the human eye and difficult to tamper with. It allows organisations to create and identify AI-generated images responsibly at scale.
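SynthID itself is proprietary and far more robust than anything sketched here, but the general idea of a pixel-level watermark the eye cannot see can be illustrated with a toy least-significant-bit scheme (this is emphatically not how SynthID works; it is only a conceptual illustration):

```python
# Toy illustration of a pixel-level watermark (NOT SynthID, which uses a far
# more robust, proprietary technique): hide one watermark bit in the least
# significant bit of each pixel, leaving the image visually unchanged.
import numpy as np

def embed(image: np.ndarray, bits: np.ndarray) -> np.ndarray:
    flat = image.flatten().copy()
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs
    return flat.reshape(image.shape)

def extract(image: np.ndarray, n_bits: int) -> np.ndarray:
    return image.flatten()[:n_bits] & 1

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in image
mark = rng.integers(0, 2, size=128, dtype=np.uint8)         # watermark bits

stamped = embed(img, mark)
assert np.array_equal(extract(stamped, mark.size), mark)
print("max pixel change:", int(np.abs(stamped.astype(int) - img.astype(int)).max()))  # 1
```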

Meanwhile, the grounding service can help prevent hallucination, where the large language models (LLMs) that typically power generative AI produce false information. Because grounding gives LLMs access to enterprise data, it helps models deliver more accurate responses.
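Vertex AI’s managed grounding service wires the retrieval up for you (for example against a Vertex AI Search data store), but the underlying pattern is retrieval-augmented prompting: fetch the relevant enterprise passages and instruct the model to answer only from them. A rough sketch of that pattern is shown below; the retrieve_documents() helper, project details, and sample policy text are hypothetical stand-ins, not Google Cloud APIs:

```python
# Rough sketch of the idea behind grounding: retrieve relevant enterprise
# documents and include them in the prompt so the model answers from them
# rather than inventing facts. Vertex AI's managed grounding service does the
# retrieval for you; retrieve_documents() below is a hypothetical stand-in.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # hypothetical project

def retrieve_documents(question: str) -> list[str]:
    # Hypothetical placeholder for an enterprise search / vector lookup.
    return ["Policy doc: Annual leave is 21 days, pro-rated for new joiners."]

def grounded_answer(question: str) -> str:
    context = "\n".join(retrieve_documents(question))
    prompt = (
        "Answer strictly using the context below. If the context does not "
        "contain the answer, say you do not know.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    model = TextGenerationModel.from_pretrained("text-bison")
    return model.predict(prompt, temperature=0.0).text

print(grounded_answer("How many days of annual leave do new employees get?"))
```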

Besides those, Google Cloud announced that its PaLM 2 LLM for Text and Chat now supports 38 languages including Simplified Chinese, Traditional Chinese, Indonesian, Thai, and Vietnamese. This will enable Southeast Asian organisations to build generative AI applications on Vertex AI in local languages while grounding responses with their own enterprise data or private corpus.

Google Cloud is planning to host PaLM 2 for Text and Chat in its Singapore cloud region later this year. PaLM 2 for Text and Chat will also support a 32,000-token context window (enough to fit an 85-page document in a single prompt), enabling longer question-and-answer chats and the summarisation and analysis of large documents such as research papers, books, and legal briefs.
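As a back-of-the-envelope check of the 85-page figure, assuming roughly 1.3 tokens per English word and about 290 words per typewritten page (both rough averages, not official numbers):

```python
# Back-of-the-envelope check of the 32,000-token context window claim.
# Assumed averages (not official figures): ~1.3 tokens per English word,
# ~290 words per typewritten page.
context_tokens = 32_000
tokens_per_word = 1.3
words_per_page = 290

words_that_fit = context_tokens / tokens_per_word   # about 24,600 words
pages_that_fit = words_that_fit / words_per_page    # about 85 pages
print(f"{pages_that_fit:.0f} pages fit in a single prompt")
```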

Additionally, Google Cloud has improved the quality of Codey (its first-party model for generating and fixing software code) by up to 25% in major supported languages for code generation and code chat. Enterprises can access Codey through Vertex AI’s Model Garden, alongside adapter tuning capabilities. Google Cloud also plans to host Codey in its Singapore cloud region later this year.
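A minimal sketch of calling Codey through the Vertex AI SDK is shown below; the code-bison and codechat-bison model IDs are the publicly documented ones for Codey, while the project, location, and prompts are placeholders:

```python
# Minimal sketch: code generation and code chat with Codey via the Vertex AI
# SDK. Project and location are placeholders.
import vertexai
from vertexai.language_models import CodeGenerationModel, CodeChatModel

vertexai.init(project="my-gcp-project", location="us-central1")

# Code generation: describe what you want as a natural-language prefix.
gen_model = CodeGenerationModel.from_pretrained("code-bison")
result = gen_model.predict(
    prefix="Write a Python function that validates an email address with a regex.",
    max_output_tokens=256,
)
print(result.text)

# Code chat: an interactive session for fixing or explaining code.
chat_model = CodeChatModel.from_pretrained("codechat-bison")
chat = chat_model.start_chat()
print(chat.send_message("Why does list.sort() return None in Python?").text)
```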

An AI-optimised infrastructure

Google Cloud also provides AI-optimised infrastructure for companies to train and serve models, with a global network of 38 cloud regions including Indonesia and Singapore. To further help organisations in Singapore run demanding AI workloads more cost-effectively and at scale, the cloud giant plans to bring its Cloud TPU v5e to its Singapore cloud region later this year.

With Cloud TPU v5e, customers can use a single Cloud Tensor Processing Unit (TPU) platform to run both large-scale AI training and inferencing. Cloud TPU v5e delivers up to two times higher training performance per dollar and up to 2.5 times higher inference performance per dollar for LLMs and generative AI models compared to Cloud TPU v4, making it possible for organisations to train and deploy larger, more complex AI models.

Making generative AI accessible

To make generative AI accessible to more users, Duet AI, the company’s always-on AI-powered collaborator, is now deeply integrated into Google Workspace and Google Cloud.

Thanks to Duet AI in Google Meet, meeting attendees will always look and sound their best with studio look, studio lighting, and studio sound. With the launch of automatic translated captions in 18 languages – including Simplified Chinese, Traditional Chinese, Indonesian, Thai, and Vietnamese – Google Meet will automatically detect when another language is spoken and display the translation in real time.

Moreover, Duet AI in Google Meet can capture notes, action items, and video snippets in real time with the new “take notes for me” feature and send a summary to attendees after the meeting. It can even help latecomers get up to speed with a “summary so far,” which gives a quick snapshot of everything they have missed.

For users who cannot make a meeting but have inputs to share, the “attend for me” feature allows Duet AI to join the meeting on their behalf, delivering their message and ensuring they get the recap.

Duet AI can also provide AI assistance in other Google Cloud products and services. This includes offering expert assistance across the entire software development lifecycle, helping developers stay in a flow state longer and be more productive by minimising context switching. In addition to code completion and code generation, it can assist with code refactoring and with building application programming interfaces (APIs) using simple natural-language prompts.

Besides that, Duet AI in Looker enables fast and simple conversational queries that empower non-technical users to “talk” with their business data, similar to how they “ask Google” a question, get answers, and refine results into visuals and reports.

For organisations that are worried about data privacy, Kurian asserts that no other user will see their data and Google does not use their data to train its models without their permission. “Duet AI was designed using Google’s comprehensive approach to help protect customers' security and privacy, as well as our AI principles. With Duet AI, your data is your data. Your code, your inputs to Duet AI, and your recommendations generated by Duet AI will not be used to train any shared models nor used to develop any products,” he claims.

According to Kurian, we are now in the next phase of the cloud, with AI as the next big focus. “[The next phase of the cloud is ensuring that AI] assists users. [We’re doing so] by providing a world-class infrastructure and platform to build AI, offering always-on collaborative Duet AI to help users use AI in Workspace and Google Cloud, and investing in your success with our broad ecosystem of partners that have the expertise to deliver solutions [effectively],” he says.
