}>\r\n \u003CPhotoDetail id={params.id} isModal={false} />\r\n \u003C/Suspense>\r\n \u003C/HydrateClient>\r\n );\r\n}\r\n\r\n```\r\n\r\nClient Component Hook:\r\n\r\n```tsx\r\nexport function useGetPhotoById(id: string) {\r\n return trpc.photos.getPhotoById.useSuspenseQuery(id, {\r\n staleTime: 1000 * 60 * 5,\r\n gcTime: 1000 * 60 * 30,\r\n refetchOnMount: false,\r\n refetchOnWindowFocus: false,\r\n });\r\n}\r\n\r\n```\n\n### Additional information\n\n# Expected Behavior\r\n\r\nThe component should transition smoothly from Suspense fallback directly to the data without showing an empty screen in between.\r\n\r\n# Actual Behavior\r\n\r\nThe loading sequence is:\r\n\r\n1. Suspense fallback (skeleton)\r\n2. Empty screen\r\n3. Actual data\r\n\r\n# Additional Information\r\n\r\n- Using Next.js App Router\r\n- Following RSC patterns from tRPC documentation\r\n- Issue occurs consistently on initial load\r\n- Using latest tRPC RC version (11.0.0-rc.467)\r\n- Using latest React Query version (5.40.0)\r\n\r\nWould appreciate guidance on how to achieve a smooth transition from skeleton to data without the empty screen flash.\n\n### 👨👧👦 Contributing\n\n- [X] 🙋♂️ Yes, I'd be down to file a PR fixing this bug!",[3028],{"name":3029,"color":3030},"🐛 bug: unconfirmed","e99695",6266,"Suspense fallback shows empty screen before data when using prefetch in Server Components","https://github.com/trpc/trpc/issues/6266",0.6418011,{"description":3036,"labels":3037,"number":3038,"owner":3019,"repository":3039,"state":3020,"title":3040,"updated_at":3041,"url":3042,"score":3043},"Hey there! We're writing an app that needs server-side events (it requires proxying events to an LLM like OpenAI). 
We're currently using trpc-openapi and love it; wondering if there's any way to keep our same architecture and add in SSE?\r\n\r\nThis is within a NextJS app.",[],429,"trpc-openapi","Support for server-side events?","2023-12-16T06:28:18Z","https://github.com/trpc/trpc-openapi/issues/429",0.70577997,{"description":3045,"labels":3046,"number":3050,"owner":3019,"repository":3019,"state":3051,"title":3052,"updated_at":3053,"url":3054,"score":3055},"### Area of Improvement\n\nWhen users use valibot, the tRPC documentation guides them to wrap schemas with `@decs/typeschema`, but in reality this is redundant. The typeschema library is very complex, and if the project is built with the goal of creating a Cloudflare Worker, it will also cause a series of problems. I think the typeschema should be removed from the documentation, and instead, the user should be told to provide a validate function:\r\n\r\n```tsx\r\n// valibot-trpc.ts\r\nimport type { BaseSchema } from 'valibot'\r\nimport * as v from 'valibot'\r\n\r\nexport const validate = \u003CT extends BaseSchema>(schema: T) => \r\n (input: unknown) => v.parse(schema, input)\r\n\r\n// router.ts\r\nprotectedProcedure.input( validate(v.object( {} ) ) ).query( ( { input } ) => { ... 
} )\r\n```\n\n### Link to related docs\n\n_No response_\n\n### Additional information\n\n\u003Cimg width=\"1239\" alt=\"image\" src=\"https://github.com/trpc/trpc/assets/1720349/53fe9b5c-fa7e-4486-ba04-c0d62d9ace94\">\r\n\n\n### 👨👧👦 Contributing\n\n- [x] 🙋♂️ Yes, I'd be down to file a PR implementing the suggested changes!",[3047],{"name":3048,"color":3049},"📚 documentation / examples","0075ca",5460,"closed","docs: valibot + @decs/typeschema is unnecessary","2025-03-20T15:42:14Z","https://github.com/trpc/trpc/issues/5460",0.6610475,{"description":3057,"labels":3058,"number":3059,"owner":3019,"repository":3019,"state":3051,"title":3060,"updated_at":3061,"url":3062,"score":3063},"### Describe the feature you'd like to request\n\nI'm trying to chain a number of `experimental_standaloneMiddleware`s using `.use()` and trying to process `ctx` gradually. I can't get the types right.\n\n### Describe the solution you'd like to see\n\n```ts\r\nconst mw1 = experimental_standaloneMiddleware().create(...); // returns { ctx: { a, b } }\r\n\t\t\t\t\t\t\t\t\t\t\t\t// We should be able to access partial `ctx` of the procedure.\r\nconst mw2 = experimental_standaloneMiddleware\u003C{ ctx: { a: string } }>().create(({ ctx }) => ctx.a )\r\n```\r\n\r\nThis currently errors. Potentially related to #5047.\n\n### Describe alternate solutions\n\nIf there is any other way to achieve this right now please let me know. 
I don't think regular middleware works in these situations either.\n\n### Additional information\n\nUsing T3 stack, Next.js\n\n### 👨👧👦 Contributing\n\n- [ ] 🙋♂️ Yes, I'd be down to file a PR implementing this feature!",[],5120,"feat: Use `ctx` values created by `experimental_standaloneMiddleware` inside another middleware after chaining them","2025-03-20T15:42:06Z","https://github.com/trpc/trpc/issues/5120",0.6746516,{"description":3065,"labels":3066,"number":3067,"owner":3019,"repository":3019,"state":3051,"title":3068,"updated_at":3069,"url":3070,"score":3071},"### Describe the feature you'd like to request\r\n\r\nFollowing #3482 & coming over from https://github.com/typescript-eslint/typescript-eslint/issues/6760: typescript-eslint is soon going to release a new v6 major version with reworked configurations. You can read up on it here: [typescript-eslint.io/blog/announcing-typescript-eslint-v6-beta](https://typescript-eslint.io/blog/announcing-typescript-eslint-v6-beta)\r\n\r\nSpecifically, the [configurations are reworked](https://typescript-eslint.io/blog/announcing-typescript-eslint-v6-beta#reworked-configuration-names) so that the recommended equivalents to what trpc uses now are:\r\n\r\n* `'plugin:@typescript-eslint/recommended-type-checked'`\r\n* `'plugin:@typescript-eslint/stylistic-type-checked'`\r\n\r\n### Describe the solution you'd like to see\r\n\r\nI can send a PR updating to them!\r\n\r\n### Describe alternate solutions\r\n\r\nI suppose trpc could stay on typescript-eslint@v5, or use its own bespoke lint configs... neither option seems very appealing to me. 
😄 \r\n\r\n### Additional information\r\n\r\n_No response_\r\n\r\n### 👨👧👦 Contributing\r\n\r\n- [X] 🙋♂️ Yes, I'd be down to file a PR implementing this feature!",[],4540,"feat: Use typescript-eslint@v6 & its reworked configs","2025-03-20T15:41:45Z","https://github.com/trpc/trpc/issues/4540",0.6749168,{"description":3073,"labels":3074,"number":3076,"owner":3019,"repository":3019,"state":3051,"title":3077,"updated_at":3078,"url":3079,"score":3080},"### Provide environment information\n\n```\r\n System:\r\n OS: macOS 13.5.2\r\n CPU: (10) arm64 Apple M1 Max\r\n Memory: 271.91 MB / 32.00 GB\r\n Shell: 5.9 - /bin/zsh\r\n Binaries:\r\n Node: 18.16.0 - ~/.nvm/versions/node/v18.16.0/bin/node\r\n Yarn: 3.6.3 - ~/.nvm/versions/node/v18.16.0/bin/yarn\r\n npm: 9.5.1 - ~/.nvm/versions/node/v18.16.0/bin/npm\r\n Watchman: 2023.09.04.00 - /opt/homebrew/bin/watchman\r\n Browsers:\r\n Chrome: 116.0.5845.187\r\n Safari: 16.6\r\n```\n\n### Describe the bug\n\nWe've seen sporadic error reports in our React Native app of `Cannot read property 'message' of undefined` during construction of a `TRPCClientError`.\r\n\r\nHere's a screenshot from Sentry:\r\n\r\n\u003Cimg width=\"600\" alt=\"image\" src=\"https://github.com/trpc/trpc/assets/712727/c6320116-84b4-4e15-bf9d-67bb4bcd06a2\">\r\n\n\n### Link to reproduction\n\nN/A\n\n### To reproduce\n\nThis occurs sporadically and thus far we've been unable to reproduce this bug reliably in development.\n\n### Additional information\n\n_No response_\n\n### 👨👧👦 Contributing\n\n- [X] 🙋♂️ Yes, I'd be down to file a PR fixing this bug!",[3075],{"name":3029,"color":3030},4794,"bug: sporadic `Cannot read property 'message' of undefined` when constructing `TRPCClientError`","2025-03-20T15:41:55Z","https://github.com/trpc/trpc/issues/4794",0.6767649,{"description":3082,"labels":3083,"number":3084,"owner":3019,"repository":3019,"state":3051,"title":3085,"updated_at":3086,"url":3087,"score":3088},"### Describe the feature you'd like to request\r\n\r\nWhen 
using batched requests, the response time is limited by the slowest request of the batch. This feature request is to investigate the feasibility of out-of-order response streaming so that the fastest responses come in first.\r\n\r\n### Describe the solution you'd like to see\r\n\r\nCurrently a batch response looks like this:\r\n```json\r\n[{\"result\":{\"data\":{\"json\":{\"foo\": 1}}}},{\"result\":{\"data\":{\"json\":{\"bar\": 2}}}}]\r\n```\r\n\r\nThough it could be rearranged into\r\n```json\r\n[\r\n{\"result\":{\"data\":{\"json\":{\"foo\": 1}}}},\r\n{\"result\":{\"data\":{\"json\":{\"bar\": 2}}}}\r\n]\r\n```\r\nwhich remains a valid JSON, but can now be parsed line by line.\r\n\r\nIf we take this a step further, the following will, for all intents and purposes, be equivalent:\r\n```json\r\n{\r\n\"1\":{\"result\":{\"data\":{\"json\":{\"bar\": 2}}}},\r\n\"0\":{\"result\":{\"data\":{\"json\":{\"foo\": 1}}}}\r\n}\r\n```\r\n\r\nWhich could be treated as a stream parsed line-by-line on the client:\r\n\r\n```js\r\n\tawait response.body\r\n\t\t.pipeThrough(new TextDecoderStream('utf-8'))\r\n\t\t.pipeThrough(new ChunksToLinesStream())\r\n\t\t.pipeThrough(new LineToObjectStream())\r\n\t\t.pipeTo(actionnableSink)\r\n```\r\ncode example from \r\nhttps://github.com/Sheraff/partial-json-stream/blob/1bea84df3d0ff7c6b5177443a00b9945ad8b0aff/js/script.js#L26-L30\r\n\r\nSince tRPC in batch mode doesn't rely on HTTP status codes (except for the `207`), but uses its own internal error system, adding the downside of a stream — that you can't know the final status of your response before you're done sending it — doesn't seem like a big issue.\r\n\r\nThis would require for the response to be\r\n- serialized request by request, instead of everything together at the end\r\n- flushed to a streamed response after every request finishes\r\n\r\nAnd for the client to\r\n- handle the response as a stream, decode it and chunk it into individual lines\r\n- parse out the index `\"1\":` and the 
optional trailing comma `,`\r\n- deserialize line by line, and update the react-query cache as soon as a response is ready\r\n\r\nThe performance implication of using JS streams instead of letting the browsers use their native methods should be considered. The example I'm referring to here (https://github.com/Sheraff/partial-json-stream) was written by me so I don't think it's close to optimal, but it is much slower than a simple `await response.json()` for all but the biggest jsons. However this doesn't take into account the possible slowness server-side (for example, when some requests do filesystem or database stuff).\r\n\r\n### Describe alternate solutions\r\n\r\nThe alternative is to leave it as is. **It already works wondrously well**. In any case, even though this feature request still proposes that we send a valid json, the (potential) resulting solution should still be opt-in, probably in the form of an alternative to the `httpBatchLink`.\r\n\r\n### Additional information\r\n\r\nI'd be down to work on this if there is interest, and if someone can point me to the right files in the trpc codebase (like where does the awaiting, serialization and deserialization take place).\r\n\r\n### 👨👧👦 Contributing\r\n\r\n- [X] 🙋♂️ Yes, I'd be down to file a PR implementing this feature!\n\n\u003Csub>[TRP-35](https://linear.app/trpc/issue/TRP-35/feat-out-of-order-batch-response-streaming)\u003C/sub>",[],4343,"feat: out-of-order batch response streaming","2025-03-20T15:41:40Z","https://github.com/trpc/trpc/issues/4343",0.6771008,{"description":3090,"labels":3091,"number":3096,"owner":3019,"repository":3019,"state":3051,"title":3097,"updated_at":3098,"url":3099,"score":3100},"### Area of Improvement\n\nWhen users click on the dropdown for selecting a version, the list is not displayed in the correct order. 
This can create confusion for users trying to choose a specific version.\n\n### Link to related docs\n\nhttps://trpc.io/\n\n### Additional information\n\n\r\n\n\n### 👨👧👦 Contributing\n\n- [X] 🙋♂️ Yes, I'd be down to file a PR implementing the suggested changes!",[3092,3093],{"name":3048,"color":3049},{"name":3094,"color":3095},"✅ accepted-PRs-welcome","0052cc",5351,"docs: Incorrect Order of Version List in Dropdown","2025-03-20T15:42:12Z","https://github.com/trpc/trpc/issues/5351",0.6779001,{"description":3102,"labels":3103,"number":3105,"owner":3019,"repository":3019,"state":3051,"title":3106,"updated_at":3107,"url":3108,"score":3109},"### Provide environment information\r\n\r\n```\r\n System:\r\n OS: Linux 6.7 Arch Linux\r\n CPU: (24) x64 AMD Ryzen 9 3900X 12-Core Processor\r\n Memory: 54.47 GB / 62.71 GB\r\n Container: Yes\r\n Shell: 5.9 - /bin/zsh\r\n Binaries:\r\n Node: 21.6.2 - /usr/bin/node\r\n Yarn: 4.0.2 - /usr/bin/yarn\r\n npm: 10.4.0 - /usr/bin/npm\r\n Watchman: 4.9.0 - /usr/bin/watchman\r\n Browsers:\r\n Chromium: 122.0.6261.69\r\n```\r\n\r\n### Describe the bug\r\n\r\nAfter sending 16kB of WebSocket messages via the `wsLink` using fastify on the backend, the backend can no longer process messages from that client.\r\n\r\nThis issue comes from both the hacky way of handling backpressure in `@fastify/websocket` (reported there: https://github.com/fastify/fastify-websocket/issues/289) and the improper way of accessing the `WebSocket` object in `@trpc/server`.\r\n\r\nhttps://github.com/trpc/trpc/blob/466e6ca1de358a2ce99a6a57d5711bb11a772371/packages/server/src/adapters/fastify/fastifyTRPCPlugin.ts#L66\r\n\r\nCreating an empty handler like this causes `Duplex` objects to be created while they aren't actually used, which leads to the underlying `WebSocket`s getting paused. 
`@fastify/websocket` has a way of handling that that is kind of broken and only works if using the recommended way of accessing the objects from the handler (that is left empty here).\r\n\r\n### Link to reproduction\r\n\r\nhttps://github.com/mat-sz/trpc-wslink-bug-reproduction\r\n\r\n### To reproduce\r\n\r\n1. Clone the repository.\r\n2. `yarn install`\r\n3. `yarn build`\r\n4. `yarn dev`\r\n5. Go to http://localhost:3000/\r\n6. Click the button and observe received/sent counts.\r\n\r\nThe button sends 8192 bytes worth of text, after two clicks, the further attempts result in no response (no errors either).\r\n\r\n### Additional information\r\n\r\nThe solution is to export the `connection` handler separately from `adapters/ws.ts` and use that instead of `applyWSSHandler`. I can submit a pull request here since I have a working version of this.\r\n\r\n### 👨👧👦 Contributing\r\n\r\n- [X] 🙋♂️ Yes, I'd be down to file a PR fixing this bug!",[3104],{"name":3029,"color":3030},5530,"bug: When using wsLink and the fastify adapter, after 16kB of messages sent to the server no further messages are processed.","2025-03-20T15:42:17Z","https://github.com/trpc/trpc/issues/5530",0.6827433,["Reactive",3111],{},["Set"],["ShallowReactive",3114],{"$fTRc1wZytZ_XrK4EfJfei_Sz-An4H4Yy6syhVxH_PVJc":-1,"$fSIGy6Cb2wyofu3dDIl7X3N8Ov1gyW4DShc46EwumqYc":-1},"/trpc/trpc/5837"]