API PROPOSAL: Remove start() #415
Comments
How would the service be able to handle Android Auto connections? In the current implementation, if there is nothing currently playing, Android Auto will call the onGetRoot and onLoadChildren methods, but nothing will be returned because the isolate isn't actually around. It feels like the background task should be able to respond (maybe spin up the isolate) at any time, without the need for an item to be played. It would also be useful to keep the media notification around without the foreground service and wakelock. I think there's another issue for that though. |
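For context, here is a rough sketch of what serving those browse requests from the Dart side could look like, written against the getChildren callback of the plugin's newer AudioHandler API as I understand it. The handler name, catalog map and media ids are made up; treat it as an illustration, not the plugin's actual implementation.

```dart
import 'package:audio_service/audio_service.dart';

// Hypothetical handler: browse requests from Android Auto are answered by the
// handler itself, so they can be served even when nothing is playing yet.
class MyAudioHandler extends BaseAudioHandler {
  // A made-up in-memory "library" standing in for a real media catalog.
  static final _library = <String, List<MediaItem>>{
    'root': [
      MediaItem(id: 'podcasts', title: 'Podcasts', playable: false),
      MediaItem(id: 'live', title: 'Live streams', playable: false),
    ],
    'podcasts': [
      MediaItem(
        id: 'ep1',
        album: 'My Podcast',
        title: 'Episode 1',
        duration: const Duration(minutes: 30),
      ),
    ],
    'live': [
      MediaItem(id: 'stream1', title: 'Morning show'),
    ],
  };

  @override
  Future<List<MediaItem>> getChildren(String parentMediaId,
      [Map<String, dynamic>? options]) async {
    // Android Auto's onLoadChildren ends up here on the Dart side.
    // 'root' is a placeholder for whatever root id the plugin passes.
    return _library[parentMediaId] ?? const <MediaItem>[];
  }
}
```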
Hi @keaganhilliard, Android Auto was definitely one of the motivating use cases for this change. The lifecycle will be:
|
Maybe the new media controls in Android 11 will also be a motivating case, since they use the onGetRoot and onLoadChildren methods too. https://developer.android.com/preview/features/media-controls |
Yes, there are also lots of other cool things in media2 that I would like to eventually use. The support library doesn't expose any of it yet, so I'll have to wait, but at least getting |
Awesome! That makes a lot of sense and it should solve a lot of the pain points I'm finding with the service. Thanks for all of the hard work. I'm happy to jump in and implement something to help out 😄 |
@keaganhilliard I'm going to create a branch for it so maybe it will be possible to collaborate. I actually attempted to do this a couple of months ago but ran into a bug in the Flutter engine, so I put it on hold. When I get a chance (hopefully soon) I will try again and get the ball rolling. If the Flutter engine bug still exists, I should actually report it to the Flutter team ASAP. |
Actually, it seems I had that wrong. media2 supersedes the support libraries and is also an unbundled library, so it's technically safe for me to upgrade now. It may make sense to upgrade to media2 first, and then see how the current proposal can be built on top of that, because media2 has a different lifecycle than the old APIs and that may affect how I implement the proposal. |
That sounds good! Another thought I had was that we will need to start supporting the search functions if we want to be able to publish apps with Android Auto available. We will need to expose |
I have one thought about this. My current implementation depends on being able to start different tasks depending on what type of stream I'm about to play, since different types require different behaviours. For example, one for podcasts and another for live streaming. Would this API change remove the option of having different BackgroundAudioTask implementations?
|
@keaganhilliard sounds good to me! @snaeji Technically you "could" pass in different entrypoints by managing the connection yourself. However, the intention will be to have a 1-to-1 correspondence between the background isolate and the service, so it would be better to move this decision point into the background isolate itself. Within the background isolate, you could create a "player" abstraction with two implementations and delegate all requests to that, and have a custom action to switch between them. This would avoid having to shut down and restart the isolate which is an expensive operation. |
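For illustration, here is a rough sketch of the "player" abstraction described above, kept entirely inside the background isolate, with a custom action used to switch implementations. The Player classes and the 'switchPlayer' action name are made up; only the idea of delegating from one background task to an interchangeable player object comes from the comment above.

```dart
// Two interchangeable player implementations behind one interface, so the
// isolate never has to be restarted to change behaviour.
abstract class Player {
  Future<void> play(Uri uri);
  Future<void> pause();
  Future<void> stop();
}

class PodcastPlayer implements Player {
  @override
  Future<void> play(Uri uri) async {/* seekable playback, position saving, ... */}
  @override
  Future<void> pause() async {}
  @override
  Future<void> stop() async {}
}

class LiveStreamPlayer implements Player {
  @override
  Future<void> play(Uri uri) async {/* live stream: no seeking, no saved position */}
  @override
  Future<void> pause() async {}
  @override
  Future<void> stop() async {}
}

class MyBackgroundTask /* extends the plugin's background task class */ {
  Player _player = PodcastPlayer();

  // Wired up from the task's custom-action callback (e.g. onCustomAction),
  // so the UI can ask the isolate to swap players without restarting it.
  Future<void> onCustomAction(String name, dynamic arguments) async {
    if (name == 'switchPlayer') {
      await _player.stop();
      _player = (arguments == 'live') ? LiveStreamPlayer() : PodcastPlayer();
    }
  }
}
```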
Sounds good @ryanheise this logic can of course just be placed in the isolate 👍 |
This is reason enough for me! I currently have an overly complex system, where the UI and background isolate rely on the same data store... but the data store (hivedb) only works single-threaded. So the UI has to switch between using the data store and querying the audio_service, at just the right times (too soon, and there's nothing to query because the background service isn't running; too late, and everything fails because of multiple access to the data store). |
I think this is going to be more important as the new media resumption features are rolled out in Android 11. I'm already noticing some odd things (multiple dead notifications) with audio_service on Android 11. I'll file an issue once I can make a replication project, but I figured I would drop a note here. |
I'm just downloading the Android 11 update now to see what you mean. The options |
I noticed the dead notification thing too while briefly testing the Android 11 GSI, with my own app (so the issue's not in the example). |
I haven't been able to reproduce it yet. Can either of you share what steps are required to make it happen? A dead notification can also be caused by an uncaught exception in |
@ryanheise Created an issue (#462) to track this separately. Outlined how I was able to replicate the issue. Hope that helps! |
Any updates on this? I'm not in a hurry, just curious ;) |
Sort of. I don't have any updates for you to get excited about, but I basically have 2 local dev branches: one is a really old branch with my first attempt to implement this (which ran into a bug in the Flutter engine), and the other is my attempt to convert the library to media2. I figured it would be a good idea to make the switch to media2 first and then build everything on top of that, but it turns out media2 has very poor documentation, no examples, and it seems no other open source projects are using it yet. I'm getting there, but honestly I plan to put that effort aside and for the moment see how far I can push the old app-compat media library.

Now, one of the issues with my original attempt to implement the current proposal was that I hit a deadlock in the Flutter engine when trying to dispose of it in the activity's

HOWEVER, there is another idea that has been bubbling in my head, actually since the inception of the project, that I also want to try out, which may help to avoid this sort of issue and bring with it a whole lot of other benefits. When I describe it, you might wonder why I hadn't tried this earlier, or you might wonder why you didn't think of it yourself or question it ;-)

Basically what I want to try to do is use a "single" Flutter engine for everything, and avoid this whole isolate business. If this is possible, it will result in lower startup overheads, lower memory overheads, less hassle, a simpler iOS implementation, and no deadlock when I try to shut down the second engine.

How could this potentially work?

Scenario 1: The user launches the app. The activity is started and the FlutterEngine is created. Now we immediately "bind" to the service which runs entirely in this engine. When the app "starts" the service, no new FlutterEngine is created; the existing service running in the main FlutterEngine just changes state to started. If the user swipes away and destroys the activity while the service is running, we convert the existing FlutterEngine into a headless FlutterEngine and keep running. If the user launches the activity again while the service is still started, that activity reconnects to the same FlutterEngine and runs

Scenario 2: The app's process is no longer alive, but the media notification is still present. The user clicks the play button, the OS reactivates the media session and starts the service. We detect that there's no FlutterEngine, so we create one and from this point carry on as if we were in the middle of Scenario 1. The issue here is how to supply the entrypoint to the FlutterEngine.

I might run into some show stoppers here but I think it's worth a try. Conceptually it should be possible given the way FlutterEngine is supposed to work, but if there is anything I need from FlutterEngine that is not currently there, I'll need to submit a feature request.

As for why I hadn't tried to do it yet, the background execution APIs were quite immature when I started developing this thing, so I went with what was the most straightforward use of those APIs. Since then, I just never had the time, as it was always more important to work on stability and missing features.

Having said all of the above, I think it's quite important to consider that the fact that you can currently very easily communicate with the background audio task from another FlutterEngine (e.g. one provided by android_alarm_manager) is very convenient, and it's a use case I wouldn't want to throw out.
So if all of the above works, I still plan to keep the code that allows one FlutterEngine to communicate to another FlutterEngine. |
Ah, I see. The documentation for

I have just pushed an update fixing the docs. |
Would anyone object if I removed

In other words, the following code would be changed:

```java
PlaybackStateCompat.Builder stateBuilder = new PlaybackStateCompat.Builder()
        .setActions(PlaybackStateCompat.ACTION_PLAY_PAUSE | actionBits)
        .setState(getPlaybackState(), position, speed, updateTime)
        .setBufferedPosition(bufferedPosition);
```

so that

```java
long allActions = PlaybackStateCompat.ACTION_STOP | PlaybackStateCompat.ACTION_PAUSE
        | PlaybackStateCompat.ACTION_PLAY | PlaybackStateCompat.ACTION_REWIND
        | PlaybackStateCompat.ACTION_SKIP_TO_PREVIOUS
        | PlaybackStateCompat.ACTION_SKIP_TO_NEXT
        | PlaybackStateCompat.ACTION_FAST_FORWARD
        | PlaybackStateCompat.ACTION_SET_RATING
        | PlaybackStateCompat.ACTION_SEEK_TO | PlaybackStateCompat.ACTION_PLAY_PAUSE
        | PlaybackStateCompat.ACTION_PLAY_FROM_MEDIA_ID
        | PlaybackStateCompat.ACTION_PLAY_FROM_SEARCH
        | PlaybackStateCompat.ACTION_SKIP_TO_QUEUE_ITEM
        | PlaybackStateCompat.ACTION_PLAY_FROM_URI | PlaybackStateCompat.ACTION_PREPARE
        | PlaybackStateCompat.ACTION_PREPARE_FROM_MEDIA_ID
        | PlaybackStateCompat.ACTION_PREPARE_FROM_SEARCH
        | PlaybackStateCompat.ACTION_PREPARE_FROM_URI
        | PlaybackStateCompat.ACTION_SET_REPEAT_MODE
        | PlaybackStateCompat.ACTION_SET_SHUFFLE_MODE
        | PlaybackStateCompat.ACTION_SET_CAPTIONING_ENABLED;
```
|
Why? For example, I don't use play or prepare from URI in my app, and the same could be said of any other action. |
Ah, I see why you suggest this.
If Android did this in media2, it should be safe for us to do the same. |
I'm not using |
Actually,

```java
mSessionCompat.setFlags(MediaSessionCompat.FLAG_HANDLES_QUEUE_COMMANDS);
```

But that's actually quite surprising. I would go two steps further and do this:

```java
mediaSession.setFlags(MediaSessionCompat.FLAG_HANDLES_MEDIA_BUTTONS
        | MediaSessionCompat.FLAG_HANDLES_TRANSPORT_CONTROLS
        | MediaSessionCompat.FLAG_HANDLES_QUEUE_COMMANDS);
```

As for what media2 does with

https://developer.android.com/about/versions/10/features#media-notifications |
MediaSessionCompat.FLAG_HANDLES_MEDIA_BUTTONS and MediaSessionCompat.FLAG_HANDLES_TRANSPORT_CONTROLS are enabled by default from API 26 and have been deprecated. I suppose the compat libraries set them automatically for older APIs. |
Thanks for that, I've confirmed they are indeed on by default, and so by copying media2's approach I can now get rid of the

That leaves the

So there's still a reason to keep

Or, alternatively I could hack the iOS implementation to emulate the Android behaviour, and remove

Not really sure which way I prefer... |
Why not just check whether there's a duration in the media item and enable it automatically? If we do that, it should be clearly documented, so there are no surprises for developers. Also, I'm not really aware: is there a seek bar on web somewhere? |
That's what I meant by this:
Testing it further, there are some subtle differences between enabling the seek action and setting a null duration. If the duration is set but the seek action is not, then you get a horizontal line in place of the seek bar, with the duration shown on it, but with no draggable seek handle. So these could be set independently with different effects as desired. Yes, the web implementation includes support for the seek bar. |
I've just pushed some commits which automatically enable the Android queue and also automatically enable all action bits except for seek. I spent a lot of time thinking about whether to treat seek as a special case like this, but in the end I think it is a special case, because it is the only action on Android where disabling it would have an effect on the media-style notification, so it's only fair to allow the user to configure it. It's also easier to migrate to, since it's effectively the same as the old behaviour, and less complicated in terms of considering all the implications (but I can revisit it in the future). I'm going to start doing a final review of the documentation and preparing a pre-release. |
Yeah, I thought about it more too; actually it should be explicit, as you concluded as well. It feels like a notification control. With #633 I believe it should be possible to move it to the controls API, but for now keeping it in the system actions seems OK. |
Can you merge #735 before you make a pre-release? The isolate currently isn't ended, since streams are not working in it. |
Do I need to do something else besides this to enable the seek bar now?

```dart
systemActions: const {
  MediaAction.seek,
},
```

I see the seek bar on the iPhone simulator but not the Android emulator. |
For the seek bar to show, the other requirement is that the current media item must have a duration. And one other subtle thing is that it may not show in the compact notification view so you may need to expand the notification. Does the example work for you on the Android emulator? |
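For anyone following along, a rough sketch of the two requirements mentioned above, assuming the 0.18-style AudioHandler API (a broadcast MediaItem with a duration, plus MediaAction.seek in systemActions). The handler name, track URL and metadata are made up:

```dart
import 'package:audio_service/audio_service.dart';

class MyAudioHandler extends BaseAudioHandler {
  void broadcastExampleState() {
    // 1. The current media item needs a duration for the seek bar to appear.
    mediaItem.add(MediaItem(
      id: 'https://example.com/track.mp3', // made-up URL
      album: 'Example album',
      title: 'Example track',
      duration: const Duration(minutes: 3, seconds: 30),
    ));

    // 2. The seek action must be enabled in the broadcast playback state.
    playbackState.add(playbackState.value.copyWith(
      playing: true,
      processingState: AudioProcessingState.ready,
      systemActions: const {MediaAction.seek},
      updatePosition: Duration.zero,
    ));
  }
}
```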
0.18.0-beta.0 has dropped. Phew! |
OK, now let me know if I've missed anything ;-)
|
@ryanheise Congratulations on getting the beta out! Regarding the seek bar, I also don't see the seek bar on the example project. (Video below)

Screen.Recording.2021-07-13.at.08.56.54.mov

A separate issue is that the audio is a little choppy (Android emulator only) when I first start playing something. It could be that both issues are related to my emulator setup. I can open a separate issue for one or both of these things if you think they might be issues larger than just my emulator setup. |
I should be able to push this to our beta application soon for testing (it's probably not ready for production, but better than nothing), thanks! |
I'm already using the pre-beta version of the one-isolate branch in production and it seems to be working well for my use case. I'll be testing the beta version too this week. |
Does this mean that there's no need to check for |
That's correct. |
Migration is working well for me; the only issue is that AudioService.position doesn't seem to fire a value when initially listened to. This causes my seek bar to show no value when the player screen is accessed while paused, since no value is fired. I'm using rxdart similar to the example to combine streams (the other streams have values, it's just position): https://github.com/UnicornsOnLSD/finamp/blob/master/lib/services/progressStateStream.dart My seek bar code can be seen here: https://github.com/UnicornsOnLSD/finamp/blob/master/lib/components/PlayerScreen/ProgressSlider.dart This is likely because v17 stored position as a ValueStream, while v18 stores it as a Stream. Also, one last thing: is it now possible to store classes in a MediaItem's extras? Before, it wasn't possible as Dart couldn't pass classes through isolates. I tried it in v18 and it managed to get the value over to the AudioHandler, but an exception is thrown at some point and the player notification doesn't show up. Here is the exception (BaseItemDto is the class):
Edit: Didn't see the comment on the extras field; I guess it still isn't supported. |
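As an illustration of one possible workaround for the position issue above (not necessarily the recommended fix): since v18's AudioService.position is a plain Stream, it can be seeded with an initial value using rxdart's startWith before being combined with the other streams. A rough sketch, assuming rxdart is available and using a made-up ProgressState type:

```dart
import 'package:audio_service/audio_service.dart';
import 'package:rxdart/rxdart.dart';

/// A simple value type holding what a seek bar needs (hypothetical).
class ProgressState {
  final Duration position;
  final Duration? duration;
  ProgressState(this.position, this.duration);
}

/// Combine the (seeded) position stream with the current media item so the
/// seek bar has a value to show even before the first position event fires.
Stream<ProgressState> progressStateStream(AudioHandler handler) =>
    Rx.combineLatest2<Duration, MediaItem?, ProgressState>(
      // startWith comes from rxdart and emits Duration.zero immediately.
      AudioService.position.startWith(Duration.zero),
      handler.mediaItem,
      (position, mediaItem) => ProgressState(position, mediaItem?.duration),
    );
```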
0.18.0 is released! A BIG thank you again to the helpers mentioned in the CHANGELOG below and everyone else who gave feedback over the past year.
|
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs, or use StackOverflow if you need help with audio_service. |
For historical reasons, and the fact that higher priority issues have tended to get in the way, I have never addressed this until now. But this is an important API improvement that I will be turning my attention to now that we have achieved feature parity between Android and iOS.
The background audio task is intended to be a self-sufficient isolate that can run even when the UI is absent. It should therefore contain all audio playback logic, and should also be self-sufficient in terms of being able to access the library of media items that it wishes to play. This background task was originally intended to be the source of truth in an app regarding audio content and state.
The problem is that the UI can't access this data until the service is started. When the app starts, it would like to be able to display the queue and any other state information, and it should be able to do this even before the service is started.
In Android, a service actually can run even before it is started, as long as something is bound to it, and that is typically the UI. So what's currently missing from this plugin is the ability to communicate with the service before it has been started. Under the hood, the service is only started when onPlay() is called, and so we don't actually need an explicit API to start the audio service. We can have this happen transparently under the hood.

In iOS, there are no special requirements on how this is done, so the lowest common denominator is to do it in a way that fits into Android's service model.
Once this change is implemented, the background audio task would be created on the first connection, it will be (internally) started via onPlay, and it will be destroyed when BOTH no active connections exist AND the service has been stopped (implicitly either via onPause or onStop). Thus, calling AudioService.stop() would not necessarily cause the background audio task to disappear as long as the UI is still present, and as long as the UI is still present, it will be able to query the background audio task for any data from this "single source of truth".

This could take a few weeks to implement (on the optimistic end), and so in the meantime I would like to pin this issue to collect feedback and suggestions regarding the API change.
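To make the proposed lifecycle concrete, here is a rough sketch of what client code could look like once start() is gone, the connection happens up front, and the service is started transparently by play(). The names follow the shape this eventually took in the 0.18 API (AudioService.init, BaseAudioHandler, AudioServiceConfig) as I understand it; the handler and channel name are made up, and this is an illustration of the proposal rather than the final design.

```dart
import 'package:audio_service/audio_service.dart';
import 'package:flutter/widgets.dart';

late AudioHandler audioHandler;

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  // Connect once at startup. The handler (the old "background audio task")
  // exists from this point on, so the UI can query the queue and other state
  // before anything has been played.
  audioHandler = await AudioService.init(
    builder: () => MyAudioHandler(),
    config: const AudioServiceConfig(
      androidNotificationChannelName: 'Playback', // made-up channel name
    ),
  );
  // runApp(...) would go here; later, a play button simply calls:
  //   audioHandler.play();
  // There is no explicit start() call: the Android service is started under
  // the hood on play() and wound down again after pause()/stop() once the UI
  // is no longer bound to it.
}

class MyAudioHandler extends BaseAudioHandler {
  @override
  Future<void> play() async {/* start the player */}
  @override
  Future<void> pause() async {/* pause the player */}
  @override
  Future<void> stop() async {/* release resources */}
}
```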