Question on frame rate #599
Hi @neilyoung 😄

Indeed, the quantum uncertainty may be to blame, and it may actually be caused by different quantum states of electrons in the CPU and the corresponding wave function :) But in your specific case this is much simpler to explain. You see, when you take a look at how the image is captured, the call blocks until the camera delivers a frame, so a single thread grabbing in a loop gets the full ~30 FPS. Now add the next thread to the solution (the one from your example), so we have two threads. Because both compete for the same device, every frame goes to only one of them, and each thread ends up with roughly half the rate.

You may ask: if this is true, why the hell do I see 30 FPS displayed in the preview? This is because the preview displays the FPS of the camera, not of the thread which captures the image. If you sum the FPS from all threads, you will get ca. 30 FPS.

Solution? Yes, there is one, and it's fairly simple. You have to make the image acquisition happen in a single thread and distribute the frames from there. I have to think about a way for you to make this work with getImageBytes.
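The frame-splitting effect described above can be reproduced without any camera hardware. The sketch below is my own toy model (not the webcam-capture API): the "camera" is a fair lock that takes about 33 ms to deliver one frame, so all callers together see ~30 FPS, and with two grabbing threads each one ends up with roughly half of that.

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.locks.ReentrantLock;

public class FpsSplitDemo {

    // Toy camera: a fair lock held for ~33 ms per frame, so the "device"
    // delivers at most ~30 frames per second in total, across all callers.
    private static final ReentrantLock CAMERA = new ReentrantLock(true);

    static void grabFrame() throws InterruptedException {
        CAMERA.lock();
        try {
            Thread.sleep(33);
        } finally {
            CAMERA.unlock();
        }
    }

    // Run the given number of concurrent grabbers for roughly one second and
    // return how many frames each of them managed to grab.
    static int[] measure(int threads) throws InterruptedException {
        final long deadline = System.currentTimeMillis() + 1000;
        final AtomicInteger[] counts = new AtomicInteger[threads];
        final Thread[] workers = new Thread[threads];
        for (int i = 0; i < threads; i++) {
            final AtomicInteger count = counts[i] = new AtomicInteger();
            workers[i] = new Thread(() -> {
                try {
                    while (System.currentTimeMillis() < deadline) {
                        grabFrame();
                        count.incrementAndGet();
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            workers[i].start();
        }
        final int[] result = new int[threads];
        for (int i = 0; i < threads; i++) {
            workers[i].join();
            result[i] = counts[i].get();
        }
        return result;
    }

    public static void main(String[] args) throws InterruptedException {
        // Each thread counts roughly half the device rate; the sum stays ~30.
        for (int frames : measure(2)) {
            System.out.println(frames + " frames in one second");
        }
    }
}
```

Running the same measurement with a single thread gives the full device rate: the camera never changes speed, only how its frames are shared.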
@sarxos Cool :) Thanks for the detailed explanations. Could have found that by myself... Well, I experimented with async=true and getImage instead of getImageBytes. Now the frame rate is literally infinite (1000 fps). I suppose this is because getImage is not blocking anymore. But one step back regarding "async":
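A likely explanation for the "1000 fps" reading: once the getter stops blocking, it just returns the most recent cached frame, so timing individual calls measures the loop speed rather than the camera. Counting distinct frames gives the real rate. Below is a toy model of this effect (my own sketch, not the library API; `latestFrameId` is a made-up stand-in for the cached frame):

```java
public class EffectiveFpsDemo {

    // Stand-in for the camera's cached frame: a background thread bumps the
    // id roughly 30 times per second.
    private static volatile long latestFrameId = 0;

    // Poll the non-blocking "getter" for one second and report both the raw
    // call rate and the number of distinct frames actually seen.
    static long[] measure() throws InterruptedException {
        final Thread camera = new Thread(() -> {
            try {
                while (true) {
                    Thread.sleep(33);
                    latestFrameId++;
                }
            } catch (InterruptedException e) {
                // stop producing frames
            }
        });
        camera.setDaemon(true);
        camera.start();

        final long deadline = System.currentTimeMillis() + 1000;
        long calls = 0;
        long distinct = 0;
        long lastSeen = -1;
        while (System.currentTimeMillis() < deadline) {
            final long frame = latestFrameId; // returns instantly, like a cached getter
            calls++;
            if (frame != lastSeen) {
                distinct++;
                lastSeen = frame;
            }
        }
        camera.interrupt();
        return new long[] { calls, distinct };
    }

    public static void main(String[] args) throws InterruptedException {
        final long[] result = measure();
        System.out.println("calls per second  : " + result[0]); // huge, meaningless
        System.out.println("frames per second : " + result[1]); // close to the real rate
    }
}
```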
BTW, it would be extremely helpful to have a faster getImageBytes (a requirement for image processing). If you have a solution, awesome.
Hi @neilyoung,

First of all, in regards to your question: it's already as fast as possible. The fastest way to use it is to have a pre-created direct ByteBuffer and pass it to getImageBytes(buffer):

```java
import java.awt.Dimension;
import java.nio.ByteBuffer;

import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.WebcamResolution;

public class DirectByteBufferExample {

    public static void main(String[] args) {

        final Dimension resolution = WebcamResolution.VGA.getSize();
        final ByteBuffer buffer = ByteBuffer.allocateDirect(resolution.width * resolution.height * 3);
        final Webcam webcam = Webcam.getDefault();
        webcam.setViewSize(resolution);
        webcam.open();

        for (int i = 0; i < 100; i++) {
            long t1 = System.currentTimeMillis();
            webcam.getImageBytes(buffer);
            long t2 = System.currentTimeMillis();
            System.out.println(1000 / (t2 - t1 + 1));
            buffer.rewind();
            // do something with buffer, perform image analysis, etc
        }
    }
}
```

But in this case you have to either forget about having multiple threads or come up with an efficient synchronization mechanism.

Please note that different drivers may implement different methods to access RAW image bytes. The default driver accesses direct (native) memory itself and is therefore very fast. Other drivers, however, may implement different mechanisms and thus be slower than the default implementation :(

In regards to our earlier conversation: this took me a while (~3.5 h), but I finally came up with a very nice solution to the parallelism issue with getImageBytes().
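To illustrate what working with such a pre-created direct buffer looks like, here is a small sketch of mine (not part of the library). It assumes the packed, row-major RGB layout implied by the `width * height * 3` sizing: the bytes for pixel (x, y) start at `(y * width + x) * 3`.

```java
import java.nio.ByteBuffer;

public class RgbBufferDemo {

    // Byte offset of pixel (x, y), assuming packed, row-major RGB data.
    static int offset(int x, int y, int width) {
        return (y * width + x) * 3;
    }

    public static void main(String[] args) {
        final int width = 4;
        final int height = 2;
        final ByteBuffer buffer = ByteBuffer.allocateDirect(width * height * 3);

        // Pretend the driver filled the frame: mark pixel (2, 1) as pure red.
        final int off = offset(2, 1, width);
        buffer.put(off, (byte) 0xFF);     // R
        buffer.put(off + 1, (byte) 0x00); // G
        buffer.put(off + 2, (byte) 0x00); // B

        // Absolute get(index) does not move the buffer position, so this kind
        // of access needs no rewind() between reads.
        final int red = buffer.get(off) & 0xFF;
        System.out.println("red channel of pixel (2,1) = " + red);
    }
}
```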
IMPORTANT! You will need the newest snapshot JAR for this to work.

Let the code speak for itself:

```java
import java.awt.Dimension;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.ComponentSampleModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.awt.image.WritableRaster;
import java.nio.BufferUnderflowException;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.Set;
import java.util.concurrent.Exchanger;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicReference;

import javax.swing.JFrame;

import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.WebcamPanel;
import com.github.sarxos.webcam.WebcamPanel.ImageSupplier;
import com.github.sarxos.webcam.WebcamResolution;

/**
 * This example demonstrates how to implement an exchange mechanism which will
 * make {@link Webcam#getImageBytes()} run in parallel without causing FPS drop.
 *
 * @author Bartosz Firyn (sarxos)
 */
public class ParallelGetImageBytesExample {

    private static class ByteBufferExchanger extends Exchanger<ByteBuffer> implements AutoCloseable {

        private final AsyncWebcamBuffer owner;

        public ByteBufferExchanger(final AsyncWebcamBuffer owner) {
            this.owner = owner;
        }

        /**
         * Await a new {@link ByteBuffer} to be ready.
         */
        public void await() {
            awaitAndGet();
        }

        /**
         * Await a new {@link ByteBuffer} to be available and return it.
         *
         * @return The {@link ByteBuffer}
         */
        public ByteBuffer awaitAndGet() {
            try {
                return exchange(null);
            } catch (InterruptedException e) {
                throw new IllegalStateException(e);
            }
        }

        /**
         * To be used only from {@link AsyncWebcamBuffer}. Please do not invoke
         * this method from other classes.
         *
         * @param bb the {@link ByteBuffer} to exchange
         */
        public void ready(ByteBuffer bb) {
            try {
                exchange(bb, 0, TimeUnit.SECONDS);
            } catch (InterruptedException | TimeoutException e) {
                throw new IllegalStateException(e);
            }
        }

        @Override
        public void close() {
            owner.dispose(this);
        }
    }

    private static class AsyncWebcamBuffer extends Thread {

        private final Webcam webcam;
        private AtomicReference<ByteBuffer> buffer = new AtomicReference<ByteBuffer>();
        private Set<ByteBufferExchanger> exchangers = Collections.synchronizedSet(new LinkedHashSet<ByteBufferExchanger>());
        private final int length;

        public AsyncWebcamBuffer(Webcam webcam) {
            this.webcam = webcam;
            this.length = getLength(webcam.getViewSize());
            this.setDaemon(true);
            this.start();
        }

        public int getLength(Dimension size) {
            return size.width * size.height * 3;
        }

        public int length() {
            return length;
        }

        public Webcam getWebcam() {
            return webcam;
        }

        @Override
        public void run() {
            while (webcam.isOpen()) {

                // get buffer from webcam (this is a direct byte buffer located
                // in off-heap memory)

                final ByteBuffer bb = webcam.getImageBytes();
                bb.rewind();

                buffer.set(bb);

                // notify all exchangers

                for (ByteBufferExchanger exchanger : exchangers) {
                    exchanger.ready(bb);
                }
            }
        }

        /**
         * Be careful when using this reference! It's not synchronized, so you
         * have to take special care to synchronize and maintain the position
         * in the buffer to avoid {@link BufferUnderflowException}.
         *
         * @return Non-synchronized {@link ByteBuffer}
         */
        public ByteBuffer getByteBuffer() {
            return buffer.get();
        }

        /**
         * @return New {@link ByteBufferExchanger}
         */
        public ByteBufferExchanger exchanger() {
            final ByteBufferExchanger exchanger = new ByteBufferExchanger(this);
            exchangers.add(exchanger);
            return exchanger;
        }

        public void dispose(ByteBufferExchanger exchanger) {
            exchangers.remove(exchanger);
        }

        /**
         * Rewrite {@link ByteBuffer} data to the provided byte[] array.
         *
         * @param bytes the byte[] array to rewrite {@link ByteBuffer} into
         */
        public void read(byte[] bytes) {
            final ByteBuffer buffer = getByteBuffer();

            // all operations on the buffer need to be synchronized

            synchronized (buffer) {
                buffer.rewind();
                buffer.get(bytes);
                buffer.rewind();
            }
        }

        /**
         * Rewrite {@link ByteBuffer} to a newly created byte[] array and
         * return it.
         *
         * @return Newly created byte[] array with data from {@link ByteBuffer}
         */
        public byte[] read() {
            final byte[] bytes = new byte[length];
            final ByteBuffer buffer = getByteBuffer();

            // all operations on the buffer need to be synchronized

            synchronized (buffer) {
                buffer.rewind();
                buffer.get(bytes);
                buffer.rewind();
            }

            return bytes;
        }

        public boolean isReady() {
            return buffer.get() != null;
        }
    }

    private static class WebcamPanelImageSupplier implements ImageSupplier {

        private final int[] imageOffset = new int[] { 0 };
        private final int[] bandOffsets = new int[] { 0, 1, 2 };
        private final int[] bits = { 8, 8, 8 };
        private final int dataType = DataBuffer.TYPE_BYTE;

        private final Dimension size;
        private final AsyncWebcamBuffer buffer;
        private final ComponentSampleModel sampleModel;
        private final ColorSpace colorSpace;
        private final ComponentColorModel colorModel;

        public WebcamPanelImageSupplier(AsyncWebcamBuffer buffer) {
            this.buffer = buffer;
            this.size = buffer.getWebcam().getViewSize();
            this.sampleModel = new ComponentSampleModel(dataType, size.width, size.height, 3, size.width * 3, bandOffsets);
            this.colorSpace = ColorSpace.getInstance(ColorSpace.CS_sRGB);
            this.colorModel = new ComponentColorModel(colorSpace, bits, false, false, Transparency.OPAQUE, dataType);
        }

        @Override
        public BufferedImage get() {

            if (!buffer.isReady()) {
                return null;
            }

            final byte[] bytes = new byte[size.width * size.height * 3];
            final byte[][] data = new byte[][] { bytes };

            buffer.read(bytes);

            final DataBufferByte dataBuffer = new DataBufferByte(data, bytes.length, imageOffset);
            final WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, null);

            return new BufferedImage(colorModel, raster, false, null);
        }
    }

    public static void main(String[] args) throws InterruptedException {

        final Dimension size = WebcamResolution.VGA.getSize();
        final Webcam webcam = Webcam.getDefault();
        webcam.setViewSize(size);
        webcam.open();

        final AsyncWebcamBuffer buffer = new AsyncWebcamBuffer(webcam);
        final ImageSupplier supplier = new WebcamPanelImageSupplier(buffer);

        final WebcamPanel panel = new WebcamPanel(webcam, size, true, supplier);
        panel.setFPSDisplayed(true);
        panel.setDisplayDebugInfo(true);
        panel.setImageSizeDisplayed(true);
        panel.setMirrored(true);

        final JFrame window = new JFrame("Test webcam panel");
        window.add(panel);
        window.setResizable(true);
        window.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        window.pack();
        window.setVisible(true);

        // this thread will get the underlying ByteBuffer and perform a
        // synchronized op to rewrite it into a byte[] array

        final Thread t1 = new Thread() {

            @Override
            public void run() {

                // make sure to close the exchanger, otherwise you will end up
                // with a memory leak

                try (final ByteBufferExchanger exchanger = buffer.exchanger()) {
                    while (webcam.isOpen()) {

                        long t1 = System.currentTimeMillis();
                        final ByteBuffer bb = exchanger.awaitAndGet();
                        long t2 = System.currentTimeMillis();

                        System.out.println(getName() + " : " + 1000 / (t2 - t1 + 1));

                        final byte[] bytes = new byte[buffer.length()];

                        // make sure to synchronize, or you will end up with
                        // inconsistent data in the byte[] array

                        synchronized (bb) {
                            bb.rewind();
                            bb.get(bytes);
                            bb.rewind();
                        }

                        // do processing on the byte[] array
                    }
                }
            }
        };
        t1.start();

        // this thread will await the underlying ByteBuffer to be ready and
        // perform a synchronized op to rewrite it into a new byte[] array

        final Thread t2 = new Thread() {

            @Override
            public void run() {
                try (final ByteBufferExchanger exchanger = buffer.exchanger()) {
                    while (webcam.isOpen()) {

                        long t1 = System.currentTimeMillis();
                        exchanger.await();
                        long t2 = System.currentTimeMillis();

                        System.out.println(getName() + " : " + 1000 / (t2 - t1 + 1));

                        final byte[] bytes = buffer.read();

                        // do processing on the byte[] array
                    }
                }
            }
        };
        t2.start();

        // this thread will await the underlying ByteBuffer to be ready and
        // perform a synchronized op to rewrite it into a pre-created byte[] array

        final Thread t3 = new Thread() {

            @Override
            public void run() {
                try (final ByteBufferExchanger exchanger = buffer.exchanger()) {

                    final byte[] bytes = new byte[buffer.length()];

                    while (webcam.isOpen()) {

                        long t1 = System.currentTimeMillis();
                        exchanger.await();
                        long t2 = System.currentTimeMillis();

                        System.out.println(getName() + " : " + 1000 / (t2 - t1 + 1));

                        buffer.read(bytes);

                        // do processing on the byte[] array
                    }
                }
            }
        };
        t3.start();
    }
}
```
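The design above hinges on `java.util.concurrent.Exchanger`: `exchange()` blocks until both sides have arrived and then swaps the two values, which is how the capture thread hands a frame to a consumer without extra copying. A minimal standalone illustration of that handoff:

```java
import java.util.concurrent.Exchanger;

public class ExchangerDemo {

    // Hand one value from a producer thread to the calling thread through an
    // Exchanger; returns what the consumer side received.
    static String handOff() throws InterruptedException {
        final Exchanger<String> exchanger = new Exchanger<>();

        final Thread producer = new Thread(() -> {
            try {
                // Blocks here until the consumer calls exchange() too.
                exchanger.exchange("frame-1");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();

        // The consumer passes null, just like awaitAndGet() does; the call
        // blocks until the producer arrives, then returns the producer's value.
        final String frame = exchanger.exchange(null);
        producer.join();
        return frame;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("consumer received: " + handOff());
    }
}
```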
Getting this currently...

```
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
```

OK, could fix this, but now:

Sorry, my bad. OK, it runs fine for a while, but then it crashes.

@neilyoung, my first guess is that it's caused by GC, but I will have to test this later to be sure.
Hi @neilyoung,

After checking the code I found that this will happen when there is no consumer waiting for the image on the other side of the exchanger. That is, when the webcam produced an image and tried to hand it over while nobody was there to receive it, the zero-timeout exchange fails. The solution for this differs in terms of what you would like to achieve, because you can either:

1. Drop the frame when no consumer is waiting for it,
2. Wait some time for a consumer and fail if none arrives, or
3. Make sure that no frame is ever lost by any consumer.
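The failure mode is easy to reproduce in isolation: a zero-timeout `exchange()` throws `TimeoutException` immediately when no partner thread is already waiting, which is exactly what `ready(...)` wraps in an `IllegalStateException`. A minimal sketch (my own, using only `java.util.concurrent`):

```java
import java.util.concurrent.Exchanger;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class NoConsumerDemo {

    // Try to hand the frame over without waiting; report whether a consumer
    // was already there to take it.
    static boolean offerFrame(Exchanger<String> exchanger, String frame) {
        try {
            exchanger.exchange(frame, 0, TimeUnit.SECONDS);
            return true; // a consumer was already waiting
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        } catch (TimeoutException e) {
            return false; // nobody was waiting, the frame is "dropped"
        }
    }

    public static void main(String[] args) {
        // No consumer thread exists here, so the zero-timeout handoff fails.
        final Exchanger<String> exchanger = new Exchanger<>();
        System.out.println("delivered = " + offerFrame(exchanger, "frame-1"));
    }
}
```

With no consumer running, this prints `delivered = false`.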
To implement 1, simply change the code to:

```java
public void ready(ByteBuffer bb) {
    try {
        exchange(bb, 0, TimeUnit.SECONDS);
    } catch (InterruptedException | TimeoutException e) {
        // do nothing, simply drop the frame
    }
}
```

To implement 2, simply change the code to:

```java
public void ready(ByteBuffer bb) {
    try {
        // wait max 10 seconds for the frame to be consumed
        exchange(bb, 10, TimeUnit.SECONDS);
    } catch (InterruptedException | TimeoutException e) {
        throw new IllegalStateException(e);
    }
}
```

Or a mix of 1 and 2 (this is what I did in the example in the source code, but waiting 500 ms instead of 10 seconds):

```java
public void ready(ByteBuffer bb) {
    try {
        exchange(bb, 500, TimeUnit.MILLISECONDS);
    } catch (InterruptedException | TimeoutException e) {
        // do nothing, the frame is dropped
    }
}
```

There is no easy solution for 3, but it's doable. If you are interested in concurrency patterns in Java, you can take a look at this repository. It contains very good examples of how Java concurrent structures can be used in practice:

https://github.com/LeonardoZ/java-concurrency-patterns
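For completeness, here is one possible shape of option 3. This is my own sketch, not code from the library or the repository above: give every consumer its own bounded queue and have the producer `put()` a copy of the frame into each of them. `put()` blocks when a queue is full, so no frame is ever lost, at the price of the slowest consumer throttling the producer.

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;

public class NoFrameLostSketch {

    private final List<BlockingQueue<byte[]>> queues = new CopyOnWriteArrayList<>();

    // Each consumer gets its own bounded queue to take() frames from.
    public BlockingQueue<byte[]> subscribe() {
        final BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(4);
        queues.add(queue);
        return queue;
    }

    // The producer copies the frame into every consumer queue; put() blocks
    // instead of dropping when a consumer lags behind.
    public void publish(byte[] frame) throws InterruptedException {
        for (BlockingQueue<byte[]> queue : queues) {
            queue.put(frame.clone()); // copy, so consumers never share a buffer
        }
    }

    public static void main(String[] args) throws InterruptedException {
        final NoFrameLostSketch hub = new NoFrameLostSketch();
        final BlockingQueue<byte[]> consumer = hub.subscribe();

        hub.publish(new byte[] { 1, 2, 3 });
        hub.publish(new byte[] { 4, 5, 6 });

        // Both frames are queued; nothing was dropped.
        System.out.println("frames queued: " + consumer.size());
    }
}
```

Unlike the Exchanger approach, this buffers heap copies of every frame, so it trades memory and GC pressure for the no-loss guarantee.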
@sarxos Thanks for your patience. Will try to check that out later.
I was using the CalculateFPSExample.java app in order to test the frame rate. As expected, with my HD cam it is always about 30 fps.

Then I combined it with the preview display and was surprised that the frame rate is only about 15 fps now. The same happens if I do a

```java
webcam.open(false)
```

without a preview. Any idea why? For me this is a bit too much "Heisenbergsche Unschärfe" (Heisenberg uncertainty, if you know what I mean) :)

My sample app: