
add the live stream example #68

Merged
merged 1 commit into sarxos:master on Mar 15, 2013

Conversation

@He-Pin commented Mar 15, 2013

Hi sarxos, this example contains a server and a client, and both work fine. The server side accepts the connection, encodes the video stream, and sends it out inside framed messages. The client side receives the messages from the server and decodes them: first the frame decoder, then the H.264 decoder.

I decode the images from the stream and display them in a window on the client side. You can open as many clients as you like; the stream stops when there are no clients and starts streaming again as soon as at least one client is connected. For performance reasons the H.264 encoder is not attached to Netty's channel pipeline, and the frame encoder could be moved out of the channel pipeline as well.
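Roughly, the server-side capture loop looks like the sketch below. The class and method names here (StreamLoopSketch, encodeH264) are placeholders for illustration, not the actual classes in this pull request; the real encoding is done with Xuggler.

```java
import java.awt.image.BufferedImage;

import org.jboss.netty.buffer.ChannelBuffers;
import org.jboss.netty.channel.Channel;

import com.github.sarxos.webcam.Webcam;

// Placeholder sketch of the server-side loop: grab frames from the webcam,
// encode them to H.264 outside the Netty pipeline, and write the packets
// to the channel, where the framing encoder prepends the length field.
public class StreamLoopSketch implements Runnable {

    private final Channel channel; // one connected client
    private final Webcam webcam = Webcam.getDefault();

    public StreamLoopSketch(Channel channel) {
        this.channel = channel;
    }

    @Override
    public void run() {
        webcam.open();
        while (channel.isConnected()) {
            // grab one frame from the webcam
            BufferedImage image = webcam.getImage();
            // encode it to an H.264 packet outside the channel pipeline
            byte[] packet = encodeH264(image);
            // the frame encoder in the pipeline adds the length field
            channel.write(ChannelBuffers.wrappedBuffer(packet));
        }
        webcam.close();
    }

    private byte[] encodeH264(BufferedImage image) {
        // placeholder: in the example this is the Xuggler-based encoder
        throw new UnsupportedOperationException("encoder omitted in this sketch");
    }
}
```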

Thanks for your great project, really. Feel free to modify the example.

sarxos added a commit that referenced this pull request Mar 15, 2013
add the live stream example
@sarxos merged commit 27eb6d3 into sarxos:master on Mar 15, 2013
@sarxos (Owner) commented Mar 15, 2013

Thank you :) I really appreciate your help. Tomorrow I will Mavenize it.

@He-Pin (Author) commented Mar 16, 2013

@sarxos glad to help~

@sarxos (Owner) commented Mar 17, 2013

Hi,

I mavenized your example and tested it a little bit - really good job!

Feel free to send pull requests if you would like to change anything :)

@He-Pin (Author) commented Mar 18, 2013

@sarxos Thanks, I will check it out to see how the pom.xml is written. I am glad to help, and I will contribute more examples if I come up with any ideas :)

@sarxos (Owner) commented Mar 18, 2013

Hi,

Just FYI, the pom.xml is connected to the parent project (Maven projects can be organized in a tree structure). If you would like a standalone project, just remove <parent> and replace it with your own <groupId> and <version>.
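For example, a standalone pom.xml could start out roughly like this; the coordinates below are placeholders, not the actual ones used in the repository:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <!-- placeholder coordinates: use your own instead of the <parent> section -->
  <groupId>com.example</groupId>
  <artifactId>webcam-live-streaming-example</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <dependencies>
    <!-- webcam-capture, netty and xuggler dependencies go here -->
  </dependencies>
</project>
```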

Eclipse supports Maven projects by default - you have to choose Import / Other / Existing Maven Project.

@He-Pin (Author) commented Mar 18, 2013

My home PC runs Eclipse a little slowly, so for simplicity I just wrote it as a plain Java project.
I found that IDEA 12 is easy to use.
Thanks, and I will update the pull request once Netty 3.6.4 is released, because there is an issue with the current release.

@rizaldim

Hey, first of all, thanks sarxos for the project and hepin1989 for the example. I need to know: is it possible to use this example with an Android phone as the client? How?

@sarxos (Owner) commented Jun 23, 2013

Hi, unfortunately it is not possible to use Webcam Capture on Android because it requires some of the AWT classes (especially BufferedImage). But on Android you could try to get a Bitmap from the native camera and then transcode it with the jcodec project. I think I saw people discussing this somewhere in their issues list.

@He-Pin (Author) commented Jun 23, 2013

On the Android side some of the required classes are missing, so please make use of the Android API instead.

@He-Pin (Author) commented Jun 24, 2013

@Rizaldi On the Android side you should try ffmpeg for decoding the live stream. I have implemented it that way; you should give it a try.

@He-Pin (Author) commented Jun 24, 2013

@sarxos In this example I transport the image as H.264 packets, not as the original BufferedImage bytes, so I think that to try this out he should just decode the H.264 packets into a Bitmap, exactly as you said. And by the way, thanks for your great work on the lib. :P

@rizaldim

@sarxos Actually I plan to use this example on the server side (webcam at the server) and then display its capture on the Android device.
@hepin1989 How do I transport the image in H.264 packets?

@He-Pin (Author) commented Jun 24, 2013

@Rizaldi Using an ffmpeg Android JNI binding.

@sarxos (Owner) commented Jun 24, 2013

@hepin1989, indeed, you are right :) By the way, I'm planning to port Webcam Capture to Android as you suggested in one of the issues, but for now I do not have enough time to do that :(

@He-Pin (Author) commented Jun 24, 2013

@sarxos Hah, I am quite busy too. I am in the capital of China now; the parent company is in Beijing. They are planning a project codenamed MDM and asked some Android developers to write the push server... what a big joke.
For the porting, I think you may need to write some native code, and there is one nice project,
https://github.com/yixia/VitamioBundle
http://www.vitamio.org/
which you may make use of if you plan to do that.

By the way, take care. I am learning the JVM now, and then Scala and Akka :P

@rizaldim

@hepin1989 Do you mean that the server side of your example doesn't need any modification, and I just need to decode the received H.264 packets into a Bitmap on Android using ffmpeg?

@He-Pin (Author) commented Jun 25, 2013

@Rizaldi Yes, you are right. Just note that the example uses a TCP connection, so you may need to have a look at the codecs in the Netty pipeline.

@rizaldim

@hepin1989 Ok. I have read your code and I am still not clear about a few things. Why do you add a LengthFieldPrepender on the server side and a LengthFieldBasedFrameDecoder on the client side as channel handlers? Is it to send back information about how many bytes were sent/received? Pardon me for asking so many questions; it's my first time using the Xuggler and Netty libraries, and I'm still trying to understand how your code examples work.

@He-Pin (Author) commented Jun 26, 2013

@Rizaldi That is just the TLV message pattern, which stands for Type-Length-Value. Here, because the type of every message is an H.264 packet, it is really just LV: the L indicates the length of one complete message and the V is the bytes themselves.

I am not good at explaining things in English, sorry.

@He-Pin (Author) commented Jun 26, 2013

By the way, it is needed because TCP is a stream-oriented connection, so the message boundaries have to be re-established by the application. See the sketch below.
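A minimal sketch of what that framing amounts to in a Netty 3.x pipeline; the handler names and the 1 MB maximum frame size are illustrative, not necessarily the values the example itself uses:

```java
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.handler.codec.frame.LengthFieldBasedFrameDecoder;
import org.jboss.netty.handler.codec.frame.LengthFieldPrepender;

public class FramingSketch {

    // Server side: prepend a 4-byte length field ("L") in front of every
    // H.264 packet ("V") before it goes out on the wire.
    public static ChannelPipeline serverPipeline() {
        ChannelPipeline p = Channels.pipeline();
        p.addLast("frameEncoder", new LengthFieldPrepender(4));
        // the handler that writes encoded H.264 packets would follow here
        return p;
    }

    // Client side: buffer bytes until the 4 length bytes plus the announced
    // payload have arrived, strip the length field, and hand exactly one
    // complete H.264 packet to the next handler. This undoes the arbitrary
    // fragmentation of a stream-oriented TCP connection.
    public static ChannelPipeline clientPipeline() {
        ChannelPipeline p = Channels.pipeline();
        p.addLast("frameDecoder",
                new LengthFieldBasedFrameDecoder(1024 * 1024, 0, 4, 0, 4));
        // the H.264 decoding / display handler would follow here
        return p;
    }
}
```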
