-
I need to make specific data both real-time and not, and I have not found anything in the docs about what to do when the same data should be real-time and globally accessible in some cases, without affecting the data that is real-time.

On the first screen, "User profile", I stream in real time the posts, likes and comments of the currently authenticated user (i.e. the ones they actually made), filtered in PowerSync by the user id from the token parameters. The rules compare the post's author id with the currently authenticated user id, take those post ids, and in the data queries return all posts with those ids plus their likes and comments. This part works perfectly fine, but it becomes tricky when I combine it with the second screen.

On the second screen, "Feed", I need to get all posts globally by all users, paginated with infinite scroll and fetched as the user scrolls. New posts created by other users should not appear in the "Feed" in real time. However, that only applies when someone else posts something: when I am the one who posts, the newly created post should appear in real time at the top of all the posts that were already fetched. In addition, likes should not change in the "Feed" when someone else likes a post, but when I like a post, whether from the "Feed" view or from anywhere else, it should change in real time, but only for me, and be synced.

Overall, I have no idea how to combine all of this and make it work in sync rules. The first screen is implemented separately and I have no problem with it, but I got stuck when I began to implement the second screen. Here are the current sync rules I am using. So far I have only implemented streaming of the data that the currently authenticated user owns, so I need help with the rest.

```yaml
bucket_definitions:
  user_profiles:
    parameters:
      select id as profile_id from profiles where id = token_parameters.user_id
    data:
      - select * from profiles where id = bucket.profile_id

  # Makes all posts, likes and comments real-time where the publisher's user id
  # equals the currently authenticated user id from the token parameters.
  user_posts:
    parameters:
      select id as post_id from posts where user_id = token_parameters.user_id
    data:
      - select * from posts where posts.id = bucket.post_id
      - select * from comments where comments.post_id = bucket.post_id
      - select * from likes where likes.post_id = bucket.post_id
```
-
The sync rules you have for the user's own posts look good. Building the feed of other users' posts is indeed a trickier problem. How much data are you working with? You mentioned "all posts globally by all users" - will the data volumes be low enough that you can sync and display everything (or all recent posts), or do you want to display personalized recommendations, similar to Reddit or Twitter?

You can split this into two separate problems - syncing the data, and displaying the data on the client. There are lots of specifics you'd have to figure out, but I'll try to give a broad overview of some available options.

**Syncing data**

If you're working with all posts, with PowerSync you can support something like "all posts created within the last 7 days". Not directly in sync rules, but you can add a column such as "created_within_7_days", which you periodically update, and filter according to that in sync rules.

PowerSync does not currently support syncing a more dynamic set of data, e.g. syncing more data as the user scrolls down. Sync rules are designed to specify upfront which data should be synced, so that everything is available offline. If you do need to load more data dynamically as the user scrolls down, a custom solution that loads the data via an API and caches it in a local table may be more suitable for this part.

If you're doing personalized feeds for each user instead, you'd need to build the feeds upfront, and can implement sync rules like this:

```yaml
parameters: select post_id from user_feed_posts where user_id = token_parameters.user_id
# data can be the same as for user_posts
```
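To make that more concrete, here is a rough sketch of what the two options above could look like as bucket definitions added alongside your existing ones. The `created_within_7_days` flag on `posts` and the `user_feed_posts` table are assumptions based on the suggestions above (they don't exist in your schema yet), and the exact filter expression may need adjusting depending on how the flag is stored:

```yaml
bucket_definitions:
  # Option A: one global bucket containing only recent posts, relying on a
  # periodically updated created_within_7_days flag on the posts table.
  recent_posts:
    data:
      - select * from posts where created_within_7_days = true

  # Option B: a per-user feed built upfront into a user_feed_posts table
  # (one bucket per feed post, so the bucket limit mentioned below applies).
  user_feed:
    parameters: select post_id from user_feed_posts where user_id = token_parameters.user_id
    data:
      - select * from posts where posts.id = bucket.post_id
      - select * from comments where comments.post_id = bucket.post_id
      - select * from likes where likes.post_id = bucket.post_id
```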
Just note the limit of 1000 buckets per user. If the feeds are more dynamic (you don't want to build them upfront for each user), loading them via an API may once again be more applicable.

**Displaying data**

So the main problem you have here is that some data needs to be always up to date, while other data should only be refreshed at certain times (e.g. when the user loads the page?), but all of it is merged into one feed. You can do this either on the database level or on the app level. The exact approach differs a bit depending on the approach taken for syncing data.

If you're syncing all posts continuously, you can create a local-only table persisting the current feed, updating it when needed, for example:

```sql
INSERT INTO local_feed (post_id, like_count)
SELECT id, like_count FROM posts
```
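As an illustration of "updating it when needed", a minimal, hypothetical refresh step could rebuild the `local_feed` snapshot when the user explicitly reloads the feed (assuming the `local_feed` table and `like_count` column from the example above, and ideally run inside a single transaction so the feed never appears empty):

```sql
-- Hypothetical refresh of the local_feed snapshot, e.g. on pull-to-refresh.
-- Clears the previous snapshot and copies the latest synced posts.
DELETE FROM local_feed;

INSERT INTO local_feed (post_id, like_count)
SELECT id, like_count
FROM posts
ORDER BY created_at DESC
LIMIT 100;
```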
You can then use a single query that combines the static feed with the user's own posts, e.g.:

```sql
-- Not checked for accuracy. Needs to be adjusted to account for likes.
SELECT posts.* FROM posts WHERE user_id = ?
UNION
SELECT posts.* FROM posts JOIN local_feed ON posts.id = local_feed.post_id
ORDER BY created_at DESC
```

If you're going with a custom cache implementation that loads the feed into a local table, the same type of query would work. Handling likes would make the query more complicated, but you can use the static data (as above) for the global like count, and add 1 if the user liked the post (see the sketch below).

Instead of using the database for the static feed, you could also do something similar in memory, and merge the feed with the user's own posts in app-level code. You could have a watched query for the user's posts and likes, and only load the rest of the data once.
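Here is a rough, unverified sketch of that like handling, assuming the `likes` table from the sync rules (with `post_id` and `user_id` columns), a `like_count` column in the hypothetical `local_feed` snapshot, and each `?` bound to the authenticated user's id:

```sql
-- Rough sketch only: static feed snapshot, plus live data for the current
-- user's own posts and likes.
SELECT
  posts.*,
  -- Static global count from the snapshot, plus 1 if the current user has
  -- liked the post. May double-count if the snapshot already included the
  -- user's like; adjust as needed.
  local_feed.like_count
    + EXISTS (
        SELECT 1 FROM likes
        WHERE likes.post_id = posts.id AND likes.user_id = ?
      ) AS display_like_count
FROM posts
JOIN local_feed ON posts.id = local_feed.post_id
WHERE posts.user_id != ?  -- the user's own posts come from the live branch below

UNION ALL

SELECT
  posts.*,
  (SELECT COUNT(*) FROM likes WHERE likes.post_id = posts.id) AS display_like_count
FROM posts
WHERE posts.user_id = ?

ORDER BY created_at DESC
```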
Also posted on Discord, but repeating here: when you query data on the client, you query from the tables, not the buckets. The buckets are only used in the sync process, not in client-side queries. So it's as simple as defining two different queries on the client: one that filters according to user_id, and one that doesn't.
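For example, a minimal sketch assuming the posts table from the sync rules above, with `?` bound to the authenticated user's id or a pagination offset:

```sql
-- "User profile" screen: only the authenticated user's posts,
-- kept up to date with a watched query.
SELECT * FROM posts WHERE user_id = ? ORDER BY created_at DESC;

-- "Feed" screen: all synced posts regardless of author,
-- fetched page by page as the user scrolls.
SELECT * FROM posts ORDER BY created_at DESC LIMIT 20 OFFSET ?;
```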