
Only plain objects can be passed to Client Components from Server Components #1501

Open
Iven2132 opened this issue May 6, 2024 · 11 comments
Iven2132 commented May 6, 2024

Description

The server-action example in the AI SDK docs gives me the error "Warning: Only plain objects can be passed to Client Components from Server Components".
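For context, React refuses to serialize anything that isn't a plain, JSON-like value across the server/client boundary. Here is a minimal sketch of the idea (my illustration, not React's actual code):

```typescript
// Rough sketch (NOT React's actual implementation) of the "plain object"
// check behind this warning: an object only crosses the Server -> Client
// boundary if its prototype is Object.prototype (or null). Class instances,
// stream objects, Maps, functions, etc. are rejected and trigger the warning.
function isPlainObject(value: unknown): boolean {
  if (typeof value !== "object" || value === null) return false;
  const proto = Object.getPrototypeOf(value);
  return proto === Object.prototype || proto === null;
}

class NotPlain {}

console.log(isPlainObject({ role: "user", content: "hi" })); // true
console.log(isPlainObject(new NotPlain())); // false
console.log(isPlainObject(Object.create(null))); // true (null prototype)
```

So the warning usually means something non-serializable (a class instance, a raw stream object, a function) ended up in a value handed from a server action to the client.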

Code example

lib/chat.ts

"use server";

import { createStreamableValue } from "ai/rsc";
import { CoreMessage, streamText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";

const together = createOpenAI({
  apiKey: "myapi",
  baseURL: "https://api.together.xyz/v1",
});

export async function continueConversation(messages: CoreMessage[]) {

  const result = await streamText({
    model: together.completion("mistralai/Mixtral-8x7B-Instruct-v0.1"),
    messages,
  });

  const stream = createStreamableValue(result.textStream);
  return stream.value;
}

page.tsx
'use client'

import { type CoreMessage } from 'ai'
import { useState } from 'react'
import { continueConversation } from '@/lib/chat';
import { readStreamableValue } from 'ai/rsc'

export default function Chat() {
  const [messages, setMessages] = useState<CoreMessage[]>([])
  const [input, setInput] = useState('')
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map((m, i) => (
        <div key={i} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content as string}
        </div>
      ))}

      <form
        action={async () => {
          const newMessages: CoreMessage[] = [
            ...messages,
            { content: input, role: 'user' }
          ]

          setMessages(newMessages)
          setInput('')

          const result = await continueConversation(newMessages)

          for await (const content of readStreamableValue(result)) {
            setMessages([
              ...newMessages,
              {
                role: 'assistant',
                content: content as string
              }
            ])
          }
        }}
      >
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={e => setInput(e.target.value)}
        />
      </form>
    </div>
  )
}

Additional context

Using the latest version of the AI SDK, 3.1.1.

shaded-blue commented May 6, 2024

I encountered the same issue after migrating to 3.1.x. I'm not sure whether the SDK has simply gotten stricter and is no longer tolerating existing bad practices in my code, whether I migrated incorrectly, or whether this is just a bug.

My migration was as simple as the straightforward swap from render to streamUI and the unstable_onGet/SetState handlers in the AI provider, and I tried both the direct { openai } import and createProvider from the ai-sdk for the new provider.

For me, the issue appears even when just returning a plain text response with nothing else provided as an option (no tool calls, etc.).

Loading chats seems to work for text responses, but the initial send throws the error after textStream finishes, according to the server console.

To be clear, it's the same error @Iven2132 reported: to paraphrase, 'Only plain objects can be passed @ stringify' or something like that (I ended up rolling back to 3.0.x and didn't record the exact error).

Like I'm sure Iven is, I'm happy to test any suggestions. If I get time, I'll put together a minimal reproduction and hopefully shed additional light.

lgrammel (Collaborator) commented May 7, 2024

@Iven2132 @shaded-blue which Next.js versions are you using? I just tested the example on the latest Next.js and it worked for me.

Iven2132 (Author) commented May 7, 2024

> @Iven2132 @shaded-blue which Next.js versions are you using? I just tested the example on the latest Next.js and it worked for me.

Using Next.js 14.2.3, AI SDK 3.1.1, and @ai-sdk/openai 0.0.9.

lgrammel (Collaborator) commented May 7, 2024

@Iven2132 Hm, I'm using the same. Can you share your layout.tsx content as well?

Iven2132 (Author) commented May 7, 2024

> @Iven2132 Hm, I'm using the same. Can you share your layout.tsx content as well?

Yes:


import type { Metadata } from "next";
import { Inter } from "next/font/google";
import "./globals.css";

const inter = Inter({ subsets: ["latin"] });

export const metadata: Metadata = {
  title: "Create Next App",
  description: "Generated by create next app",
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}

lgrammel (Collaborator) commented May 7, 2024

Strange. Here is my working setup:

actions.tsx

'use server';

import { createStreamableValue } from 'ai/rsc';
import { CoreMessage, streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function continueConversation(messages: CoreMessage[]) {
  const result = await streamText({
    model: openai('gpt-4-turbo'),
    messages,
  });

  const stream = createStreamableValue(result.textStream);
  return stream.value;
}

layout.tsx

import type { Metadata } from 'next';
import { Inter } from 'next/font/google';
import './globals.css';

const inter = Inter({ subsets: ['latin'] });

export const metadata: Metadata = {
  title: 'Create Next App',
  description: 'Generated by create next app',
};

export default function RootLayout({
  children,
}: Readonly<{
  children: React.ReactNode;
}>) {
  return (
    <html lang="en">
      <body className={inter.className}>{children}</body>
    </html>
  );
}

page.tsx

'use client';

import { type CoreMessage } from 'ai';
import { useState } from 'react';
import { continueConversation } from './actions';
import { readStreamableValue } from 'ai/rsc';

export default function Chat() {
  const [messages, setMessages] = useState<CoreMessage[]>([]);
  const [input, setInput] = useState('');
  return (
    <div className="flex flex-col w-full max-w-md py-24 mx-auto stretch">
      {messages.map((m, i) => (
        <div key={i} className="whitespace-pre-wrap">
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content as string}
        </div>
      ))}

      <form
        action={async () => {
          const newMessages: CoreMessage[] = [
            ...messages,
            { content: input, role: 'user' },
          ];

          setMessages(newMessages);
          setInput('');

          const result = await continueConversation(newMessages);

          for await (const content of readStreamableValue(result)) {
            setMessages([
              ...newMessages,
              {
                role: 'assistant',
                content: content as string,
              },
            ]);
          }
        }}
      >
        <input
          className="fixed bottom-0 w-full max-w-md p-2 mb-8 border border-gray-300 rounded shadow-xl"
          value={input}
          placeholder="Say something..."
          onChange={e => setInput(e.target.value)}
        />
      </form>
    </div>
  );
}

next: 14.2.3, react: 18.2.0, ai: 3.1.1

Can you try the above? It should be very similar to what you have.

Iven2132 (Author) commented May 7, 2024

> Can you try the above? It should be very similar to what you have.

Can you share your project, like a GitHub repo?

lgrammel (Collaborator) commented May 7, 2024

I've modified examples/next-ai-rsc locally with the changes above.

lgrammel (Collaborator) commented May 7, 2024

Here's the branch: https://github.com/vercel/ai/tree/lg/issue-1501/examples/next-ai-rsc

Iven2132 (Author) commented May 7, 2024

> Here's the branch: https://github.com/vercel/ai/tree/lg/issue-1501/examples/next-ai-rsc

The error only appears when I use mistralai/Mixtral-8x7B-Instruct-v0.1 via together.ai; the code works fine with gpt-3.5.

Can you please try the models from together.ai?
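Since the failure seems to depend on the provider, one way to narrow it down is to check what the provider's textStream actually yields before handing it to createStreamableValue. The helper below is hypothetical (assertStringChunks is my name, not an AI SDK API); it fails fast on any non-string chunk:

```typescript
// Hypothetical debugging helper: re-yield chunks from a provider's textStream,
// throwing as soon as a chunk is not a plain string (a non-string chunk would
// make the streamed value non-serializable for the client).
async function* assertStringChunks(
  stream: AsyncIterable<unknown>
): AsyncGenerator<string> {
  for await (const chunk of stream) {
    if (typeof chunk !== "string") {
      throw new Error(
        `Non-string chunk from provider: ${Object.prototype.toString.call(chunk)}`
      );
    }
    yield chunk;
  }
}

// Quick self-check with a fake stream standing in for result.textStream:
async function* fakeTextStream() {
  yield "Hello, ";
  yield "world";
}

(async () => {
  const chunks: string[] = [];
  for await (const c of assertStringChunks(fakeTextStream())) chunks.push(c);
  console.log(chunks.join("")); // "Hello, world"
})();
```

In the server action you would wrap the stream as createStreamableValue(assertStringChunks(result.textStream)); if the Together-backed model is emitting something other than strings, this should surface it in the server log.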

Iven2132 (Author) commented May 9, 2024

@lgrammel Let me know if you can reproduce it.
