
Question on frame rate #599

Open
neilyoung opened this issue Dec 10, 2017 · 11 comments

@neilyoung

I was using the CalculateFPSExample.java app in order to test the frame rate. As expected with my HD cam, it is always about 30 fps.

Then I combined it with the preview display and was surprised that the frame rate is now only about 15 fps. The same happens if I do a webcam.open(false) without a preview.

Any idea why? For me this is a bit too much "Heisenberg uncertainty" (if you know what I mean) :)

My sample app:

public static void main(String[] args) {
        final Webcam webcam = Webcam.getDefault();
        webcam.setViewSize(WebcamResolution.VGA.getSize());
        webcam.open();

//        Uncommenting this reduces the framerate by 50 %

//        final WebcamPanel panel = new WebcamPanel(webcam);
//        panel.setFPSDisplayed(true);
//        panel.setImageSizeDisplayed(true);
//        panel.setMirrored(true);
//        JFrame window = new JFrame("Test webcam panel");
//        window.setLayout(new FlowLayout());
//        window.add(panel);
//
//        window.setResizable(true);
//        window.setDefaultCloseOperation(WindowConstants.EXIT_ON_CLOSE);
//        window.pack();
//        window.setVisible(true);

        new Thread(new Runnable() {
            @Override
            public void run() {
                long t1 = 0;
                long t2 = 0;

                int p = 100;
                int r = 10;

                for (int k = 0; k < p; k++) {
                    t1 = System.currentTimeMillis();
                    for (int i = 0; i < r; i++) {
                        webcam.getImageBytes();
                    }
                    t2 = System.currentTimeMillis();
                    System.out.println("FPS " + k + ": " + (1000 * r / (t2 - t1)));
                }
            }
        }).start();
}
@sarxos
Owner

sarxos commented Dec 11, 2017

Hi @neilyoung 😄

Indeed, quantum uncertainty may be to blame, and it may actually be caused by the different quantum states of electrons in the CPU, due to the corresponding wave function:

[image: a wave function equation]

But in your specific case this is much simpler to explain. If you take a look at how WebcamPanel is implemented, you will notice a Thread running in the background which constantly invokes getImage() and renders the result on the screen. This happens at the maximum FPS rate, to keep the preview as smooth as possible.

Now add another thread to the mix (the one from your example), so we have two threads. Because getImage() is blocking by default, these two threads fight for access time. Since both have exactly the same priority, the JVM gives 50% of the access time to the first one and 50% to the second one. Do you see the connection? 100% of the access time is 30 FPS (the maximum your camera can deliver), so 50% is 15 FPS, and if you add one more thread this drops to 33%, which is 10 FPS:

FPS thread 2: 5: 9
FPS thread 1: 5: 9
FPS thread 2: 6: 10
FPS thread 1: 6: 10
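The 50/50 split can be reproduced with plain JDK classes. The sketch below is only an illustration (all names and numbers are made up; this is not webcam-capture code): it simulates a camera whose blocking capture path delivers a frame every ~33 ms behind a fair lock, and lets two threads compete for it.

```java
import java.util.concurrent.locks.ReentrantLock;

public class FpsSplitDemo {

	// simulated camera: a single blocking capture path, ~30 FPS max;
	// the lock is fair so competing threads alternate
	private static final ReentrantLock CAMERA = new ReentrantLock(true);

	static void captureFrame() {
		CAMERA.lock();
		try {
			Thread.sleep(33); // one frame takes ~33 ms, i.e. ~30 FPS total
		} catch (InterruptedException e) {
			Thread.currentThread().interrupt();
		} finally {
			CAMERA.unlock();
		}
	}

	/** Run the given number of competing threads for durationMillis and return per-thread frame counts. */
	public static long[] run(int threads, long durationMillis) {
		final long[] counts = new long[threads];
		final Thread[] workers = new Thread[threads];
		final long deadline = System.currentTimeMillis() + durationMillis;
		for (int t = 0; t < threads; t++) {
			final int id = t;
			workers[t] = new Thread(() -> {
				while (System.currentTimeMillis() < deadline) {
					captureFrame();
					counts[id]++;
				}
			});
			workers[t].start();
		}
		for (Thread w : workers) {
			try {
				w.join();
			} catch (InterruptedException e) {
				Thread.currentThread().interrupt();
			}
		}
		return counts;
	}

	public static void main(String[] args) {
		// two threads share the ~30 frames available per second, roughly half each
		final long[] counts = run(2, 1000);
		for (int i = 0; i < counts.length; i++) {
			System.out.println("thread " + i + ": " + counts[i] + " frames");
		}
	}
}
```

With two threads, each ends up with roughly 15 "frames" per second, and the per-thread counts sum to the camera's ~30 FPS ceiling.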

You may ask: if this is true, why do I see 30 FPS displayed in the preview? This is because the preview displays the FPS of the camera, not of the thread which captures the image. If you sum the FPS from all threads, you will get ca. 30 FPS.

Solution? Yes, there is one, and it's fairly simple. You have to make getImage() non-blocking. This can be done with webcam.open(true), which opens the webcam in asynchronous (non-blocking) mode. But please note that this only works for getImage(), not for getImageBytes().

I have to think about a way to make getImageBytes() non-blocking and will come back to you later.

@neilyoung
Author

@sarxos Cool :) Thanks for the detailed explanations. I could have found that myself...

Well, I experimented with async=true and getImage instead of getImageBytes. Now the measured frame rate is literally infinite (1000 fps). I suppose this is because getImage is no longer blocking.

But one step back regarding "async":

  • The app above produces 30 fps measured with async=false and NO preview
  • As reported and explained this goes down to 15 fps WITH preview
  • The app produces 15 fps measured with async=true with and without preview

@neilyoung
Author

BTW, it would be extremely helpful to have a faster getImageBytes (a requirement for image processing). If you had a solution - awesome.

@sarxos
Owner

sarxos commented Dec 13, 2017

Hi @neilyoung,

First of all, in regards to your question:

would be extremely helpful to have a faster getImageBytes

It's already as fast as possible. The fastest way to use it is to have a pre-created ByteBuffer and use it as the target into which data from the native webcam buffer is copied:

import java.awt.Dimension;
import java.nio.ByteBuffer;

import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.WebcamResolution;


public class DirectByteBufferExample {

	public static void main(String[] args) {

		final Dimension resolution = WebcamResolution.VGA.getSize();
		final ByteBuffer buffer = ByteBuffer.allocateDirect(resolution.width * resolution.height * 3);

		final Webcam webcam = Webcam.getDefault();
		webcam.setViewSize(resolution);
		webcam.open();

		for (int i = 0; i < 100; i++) {

			long t1 = System.currentTimeMillis();
			webcam.getImageBytes(buffer);
			long t2 = System.currentTimeMillis();

			System.out.println(1000 / (t2 - t1 + 1));

			buffer.rewind();

			// do something with buffer, perform image analysis, etc
		}
	}
}

But in this case you have to either forget about having multiple threads or come up with an efficient synchronization mechanism.

Please note that different drivers may implement different methods to access the RAW image bytes. The default driver accesses direct (native) memory itself and is therefore very fast. Other drivers, however, may implement different mechanisms and thus be slower than the default implementation :(
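As a side note, the synchronized rewind/get/rewind pattern this approach relies on can be exercised with a plain direct ByteBuffer; the buffer size and contents below are made-up stand-ins for a real frame:

```java
import java.nio.ByteBuffer;

public class SharedBufferReadDemo {

	/** Copy the whole buffer into bytes, leaving the buffer position at 0 for the next reader. */
	static void read(ByteBuffer buffer, byte[] bytes) {
		// all operations on the shared buffer must be synchronized, otherwise a
		// concurrent reader can move the position and cause a BufferUnderflowException
		synchronized (buffer) {
			buffer.rewind();
			buffer.get(bytes);
			buffer.rewind();
		}
	}

	public static void main(String[] args) {
		// a tiny stand-in for one frame (a real VGA frame would be 640 * 480 * 3 bytes)
		final ByteBuffer frame = ByteBuffer.allocateDirect(16);
		for (int i = 0; i < 16; i++) {
			frame.put((byte) i);
		}
		frame.rewind();

		// two readers copy the same frame concurrently; each gets a consistent copy
		final byte[][] copies = new byte[2][16];
		final Thread[] readers = new Thread[2];
		for (int t = 0; t < 2; t++) {
			final int id = t;
			readers[t] = new Thread(() -> {
				for (int i = 0; i < 1000; i++) {
					read(frame, copies[id]);
				}
			});
			readers[t].start();
		}
		for (Thread r : readers) {
			try {
				r.join();
			} catch (InterruptedException e) {
				Thread.currentThread().interrupt();
			}
		}
		System.out.println("first byte of copy 0: " + copies[0][0]);
	}
}
```

Without the synchronized block, the two readers would interleave their rewind/get calls on the shared position and crash with a BufferUnderflowException.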

In regards to our earlier conversation: this took me a while (~3.5 h), but I finally came up with a very nice solution to the parallelism issue with Webcam.getImageBytes(). The example is built on top of AtomicReference, Exchanger&lt;ByteBuffer&gt; and a new interface from WebcamPanel, namely ImageSupplier. The main idea is not to use getImage() together with getImageBytes(), but to switch completely to the latter. I tested this on my Ubuntu laptop and was able to get ~30 FPS from every thread (tested with 3), together with one extra thread from WebcamPanel which renders the video feed on the screen.

Thread-5 : 31
Thread-4 : 30
Thread-6 : 32
Thread-5 : 29
Thread-6 : 28
Thread-4 : 30
Thread-4 : 25
Thread-5 : 25
Thread-6 : 24
Thread-5 : 37
Thread-6 : 37
Thread-4 : 38
Thread-5 : 27
Thread-4 : 28
Thread-6 : 27

IMPORTANT! You will need the newest snapshot JAR for this to work.

Let the code speak for itself:

import java.awt.Dimension;
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.ComponentSampleModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.awt.image.WritableRaster;
import java.nio.BufferUnderflowException;
import java.nio.ByteBuffer;
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.Set;
import java.util.concurrent.Exchanger;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.concurrent.atomic.AtomicReference;

import javax.swing.JFrame;

import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.WebcamPanel;
import com.github.sarxos.webcam.WebcamPanel.ImageSupplier;
import com.github.sarxos.webcam.WebcamResolution;


/**
 * This example demonstrates how to implement an exchange mechanism which allows
 * {@link Webcam#getImageBytes()} to run in parallel without causing an FPS drop.
 *
 * @author Bartosz Firyn (sarxos)
 */
public class ParallelGetImageBytesExample {

	private static class ByteBufferExchanger extends Exchanger<ByteBuffer> implements AutoCloseable {

		private final AsyncWebcamBuffer owner;

		public ByteBufferExchanger(final AsyncWebcamBuffer owner) {
			this.owner = owner;
		}

		/**
		 * Wait for a new {@link ByteBuffer} to be ready.
		 */
		public void await() {
			awaitAndGet();
		}

		/**
		 * Wait for a new {@link ByteBuffer} to be available and return it.
		 *
		 * @return The {@link ByteBuffer}
		 */
		public ByteBuffer awaitAndGet() {
			try {
				return exchange(null);
			} catch (InterruptedException e) {
				throw new IllegalStateException(e);
			}
		}

		/**
		 * To be used only from {@link AsyncWebcamBuffer}. Please do not invoke this method from
		 * other classes.
		 *
		 * @param bb the {@link ByteBuffer} to exchange
		 */
		public void ready(ByteBuffer bb) {
			try {
				exchange(bb, 0, TimeUnit.SECONDS);
			} catch (InterruptedException | TimeoutException e) {
				throw new IllegalStateException(e);
			}
		}

		@Override
		public void close() {
			owner.dispose(this);
		}
	}

	private static class AsyncWebcamBuffer extends Thread {

		private final Webcam webcam;
		private AtomicReference<ByteBuffer> buffer = new AtomicReference<ByteBuffer>();
		private Set<ByteBufferExchanger> exchangers = Collections.synchronizedSet(new LinkedHashSet<ByteBufferExchanger>());
		private final int length;

		public AsyncWebcamBuffer(Webcam webcam) {
			this.webcam = webcam;
			this.length = getLength(webcam.getViewSize());
			this.setDaemon(true);
			this.start();
		}

		public int getLength(Dimension size) {
			return size.width * size.height * 3;
		}

		public int length() {
			return length;
		}

		public Webcam getWebcam() {
			return webcam;
		}

		@Override
		public void run() {
			while (webcam.isOpen()) {

				// get buffer from webcam (this is direct byte buffer located in off-heap memory)

				final ByteBuffer bb = webcam.getImageBytes();
				bb.rewind();

				buffer.set(bb);

				// notify all exchangers

				for (ByteBufferExchanger exchanger : exchangers) {
					exchanger.ready(bb);
				}
			}
		}

		/**
		 * Be careful when using this reference! It's not synchronized, so you have to take special
		 * care to synchronize it and maintain the position in the buffer to avoid
		 * {@link BufferUnderflowException}.
		 *
		 * @return Non synchronized {@link ByteBuffer}
		 */
		public ByteBuffer getByteBuffer() {
			return buffer.get();
		}

		/**
		 * @return New {@link ByteBufferExchanger}
		 */
		public ByteBufferExchanger exchanger() {
			final ByteBufferExchanger exchanger = new ByteBufferExchanger(this);
			exchangers.add(exchanger);
			return exchanger;
		}

		public void dispose(ByteBufferExchanger exchanger) {
			exchangers.remove(exchanger);
		}

		/**
		 * Rewrite {@link ByteBuffer} data to the provided byte[] array.
		 *
		 * @param bytes the byte[] array to rewrite {@link ByteBuffer} into
		 */
		public void read(byte[] bytes) {
			final ByteBuffer buffer = getByteBuffer();
			// all operations on buffer need to be synchronized
			synchronized (buffer) {
				buffer.rewind();
				buffer.get(bytes);
				buffer.rewind();
			}
		}

		/**
		 * Rewrite {@link ByteBuffer} to newly created byte[] array and return it.
		 *
		 * @return Newly created byte[] array with data from {@link ByteBuffer}
		 */
		public byte[] read() {
			final byte[] bytes = new byte[length];
			final ByteBuffer buffer = getByteBuffer();
			// all operations on buffer need to be synchronized
			synchronized (buffer) {
				buffer.rewind();
				buffer.get(bytes);
				buffer.rewind();
			}
			return bytes;
		}

		public boolean isReady() {
			return buffer.get() != null;
		}
	}

	private static class WebcamPanelImageSupplier implements ImageSupplier {

		private final int[] imageOffset = new int[] { 0 };
		private final int[] bandOffsets = new int[] { 0, 1, 2 };
		private final int[] bits = { 8, 8, 8 };
		private final int dataType = DataBuffer.TYPE_BYTE;
		private final Dimension size;
		private final AsyncWebcamBuffer buffer;
		private final ComponentSampleModel sampleModel;
		private final ColorSpace colorSpace;
		private final ComponentColorModel colorModel;

		public WebcamPanelImageSupplier(AsyncWebcamBuffer buffer) {
			this.buffer = buffer;
			this.size = buffer.getWebcam().getViewSize();
			this.sampleModel = new ComponentSampleModel(dataType, size.width, size.height, 3, size.width * 3, bandOffsets);
			this.colorSpace = ColorSpace.getInstance(ColorSpace.CS_sRGB);
			this.colorModel = new ComponentColorModel(colorSpace, bits, false, false, Transparency.OPAQUE, dataType);
		}

		@Override
		public BufferedImage get() {

			if (!buffer.isReady()) {
				return null;
			}

			final byte[] bytes = new byte[size.width * size.height * 3];
			final byte[][] data = new byte[][] { bytes };

			buffer.read(bytes);

			final DataBufferByte dataBuffer = new DataBufferByte(data, bytes.length, imageOffset);
			final WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, null);
			final BufferedImage image = new BufferedImage(colorModel, raster, false, null);

			return image;
		}
	}

	public static void main(String[] args) throws InterruptedException {

		final Dimension size = WebcamResolution.VGA.getSize();

		final Webcam webcam = Webcam.getDefault();
		webcam.setViewSize(size);
		webcam.open();

		final AsyncWebcamBuffer buffer = new AsyncWebcamBuffer(webcam);
		final ImageSupplier supplier = new WebcamPanelImageSupplier(buffer);

		final WebcamPanel panel = new WebcamPanel(webcam, size, true, supplier);
		panel.setFPSDisplayed(true);
		panel.setDisplayDebugInfo(true);
		panel.setImageSizeDisplayed(true);
		panel.setMirrored(true);

		final JFrame window = new JFrame("Test webcam panel");
		window.add(panel);
		window.setResizable(true);
		window.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
		window.pack();
		window.setVisible(true);

		// this thread will get the underlying ByteBuffer and perform a synchronized op
		// to rewrite it into a byte[] array

		final Thread t1 = new Thread() {

			@Override
			public void run() {

				// make sure to close the exchanger, or you will end up with a memory leak

				try (final ByteBufferExchanger exchanger = buffer.exchanger()) {

					while (webcam.isOpen()) {

						long t1 = System.currentTimeMillis();
						final ByteBuffer bb = exchanger.awaitAndGet();
						long t2 = System.currentTimeMillis();

						System.out.println(getName() + " : " + 1000 / (t2 - t1 + 1));

						final byte[] bytes = new byte[buffer.length()];

						// make sure to synchronize, or you may end up with a BufferUnderflowException

						synchronized (bb) {
							bb.rewind();
							bb.get(bytes);
							bb.rewind();
						}

						// do processing on bytes[] array
					}
				}
			}
		};
		t1.start();

		// this thread will wait for the underlying ByteBuffer to be ready and perform
		// a synchronized op to rewrite it into a new byte[] array

		final Thread t2 = new Thread() {

			@Override
			public void run() {

				try (final ByteBufferExchanger exchanger = buffer.exchanger()) {
					while (webcam.isOpen()) {

						long t1 = System.currentTimeMillis();
						exchanger.await();
						long t2 = System.currentTimeMillis();

						System.out.println(getName() + " : " + 1000 / (t2 - t1 + 1));

						final byte[] bytes = buffer.read();

						// do processing on bytes[] array
					}
				}
			}
		};
		t2.start();

		// this thread will wait for the underlying ByteBuffer to be ready and perform
		// a synchronized op to rewrite it into a pre-created byte[] array

		final Thread t3 = new Thread() {

			@Override
			public void run() {

				try (final ByteBufferExchanger exchanger = buffer.exchanger()) {

					final byte[] bytes = new byte[buffer.length()];

					while (webcam.isOpen()) {

						long t1 = System.currentTimeMillis();
						exchanger.await();
						long t2 = System.currentTimeMillis();

						System.out.println(getName() + " : " + 1000 / (t2 - t1 + 1));

						buffer.read(bytes);

						// do processing on bytes[] array
					}
				}
			}
		};
		t3.start();
	}
}
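As an aside, the raster plumbing used by WebcamPanelImageSupplier (ComponentSampleModel, DataBufferByte, ComponentColorModel) can be exercised in isolation with plain java.awt.image classes. The sketch below is illustrative only; the 2x2 test frame is a made-up stand-in for real webcam data:

```java
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.ComponentSampleModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.awt.image.WritableRaster;

public class RgbBytesToImageDemo {

	/** Wrap a packed RGB byte array (3 bytes per pixel, row-major) in a BufferedImage without copying. */
	public static BufferedImage toImage(byte[] bytes, int width, int height) {
		final int dataType = DataBuffer.TYPE_BYTE;
		// 3 interleaved bands (R, G, B), scanline stride of width * 3 bytes
		final ComponentSampleModel sampleModel =
			new ComponentSampleModel(dataType, width, height, 3, width * 3, new int[] { 0, 1, 2 });
		final ComponentColorModel colorModel = new ComponentColorModel(
			ColorSpace.getInstance(ColorSpace.CS_sRGB),
			new int[] { 8, 8, 8 }, false, false, Transparency.OPAQUE, dataType);
		final DataBufferByte dataBuffer = new DataBufferByte(bytes, bytes.length);
		final WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, null);
		return new BufferedImage(colorModel, raster, false, null);
	}

	public static void main(String[] args) {
		// a 2x2 frame: red, green / blue, white
		final byte[] bytes = {
			(byte) 0xFF, 0, 0,   0, (byte) 0xFF, 0,
			0, 0, (byte) 0xFF,   (byte) 0xFF, (byte) 0xFF, (byte) 0xFF
		};
		final BufferedImage image = toImage(bytes, 2, 2);
		System.out.printf("(0,0) = %08X%n", image.getRGB(0, 0));
	}
}
```

Because the raster wraps the byte array directly, no per-frame pixel copy is needed beyond the one read from the webcam buffer.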

@neilyoung
Author

neilyoung commented Dec 13, 2017

Getting this currently...

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at com.github.sarxos.webcam.Webcam.&lt;clinit&gt;(Webcam.java:101)
at com.me.HelloWorld.main(HelloWorld.java:237)
Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more

@neilyoung
Author

OK, I could fix this, but now:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" com.github.sarxos.webcam.WebcamException: java.util.concurrent.ExecutionException: com.github.sarxos.webcam.WebcamException: Cannot execute task
	at com.github.sarxos.webcam.WebcamDiscoveryService.getWebcams(WebcamDiscoveryService.java:124)
	at com.github.sarxos.webcam.Webcam.getWebcams(Webcam.java:893)
	at com.github.sarxos.webcam.Webcam.getDefault(Webcam.java:956)
	at com.github.sarxos.webcam.Webcam.getDefault(Webcam.java:933)
	at com.github.sarxos.webcam.Webcam.getDefault(Webcam.java:911)
	at com.accuware.HelloWorld.main(HelloWorld.java:237)
Caused by: java.util.concurrent.ExecutionException: com.github.sarxos.webcam.WebcamException: Cannot execute task
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at com.github.sarxos.webcam.WebcamDiscoveryService.getWebcams(WebcamDiscoveryService.java:116)
	... 5 more
Caused by: com.github.sarxos.webcam.WebcamException: Cannot execute task
	at com.github.sarxos.webcam.WebcamProcessor$AtomicProcessor.process(WebcamProcessor.java:72)
	at com.github.sarxos.webcam.WebcamProcessor.process(WebcamProcessor.java:140)
	at com.github.sarxos.webcam.WebcamTask.process(WebcamTask.java:46)
	at com.github.sarxos.webcam.ds.buildin.WebcamDefaultDriver$WebcamNewGrabberTask.newGrabber(WebcamDefaultDriver.java:45)
	at com.github.sarxos.webcam.ds.buildin.WebcamDefaultDriver.getDevices(WebcamDefaultDriver.java:117)
	at com.github.sarxos.webcam.WebcamDiscoveryService$WebcamsDiscovery.call(WebcamDiscoveryService.java:36)
	at com.github.sarxos.webcam.WebcamDiscoveryService$WebcamsDiscovery.call(WebcamDiscoveryService.java:26)
	at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
	at java.util.concurrent.FutureTask.run(FutureTask.java)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoClassDefFoundError: org/bridj/cpp/CPPObject
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at com.github.sarxos.webcam.ds.buildin.WebcamDefaultDriver$WebcamNewGrabberTask.handle(WebcamDefaultDriver.java:55)
	at com.github.sarxos.webcam.WebcamProcessor$AtomicProcessor.run(WebcamProcessor.java:81)
	... 3 more
Caused by: java.lang.ClassNotFoundException: org.bridj.cpp.CPPObject
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 17 more
Disconnected from the target VM, address: '127.0.0.1:56517', transport: 'socket'


@sarxos
Owner

sarxos commented Dec 13, 2017

The java.lang.ClassNotFoundException: org.bridj.cpp.CPPObject is caused by BridJ missing from the classpath. Just add the BridJ JAR and you should be fine.

@neilyoung
Author

Sorry, my bad.

OK, it runs fine for a while but then crashes.

Thread-6 : 33
Thread-5 : 33
Thread-6 : 32
Thread-4 : 34
Thread-4 : 31
Thread-6 : 31
Thread-5 : 32
Thread-4 : 30
Thread-5 : 30
Thread-6 : 30
Thread-4 : 33
Thread-5 : 33
Thread-6 : 32
Thread-6 : 24
Thread-5 : 24
Thread-4 : 24
Thread-5 : 11
Thread-6 : 11
Thread-4 : 11
Thread-5 : 500
Thread-4 : 333
Exception in thread "Thread-2" java.lang.IllegalStateException: java.util.concurrent.TimeoutException
	at com.me.HelloWorld$ByteBufferExchanger.ready(HelloWorld.java:77)
	at com.me.HelloWorld$AsyncWebcamBuffer.run(HelloWorld.java:127)
Caused by: java.util.concurrent.TimeoutException
	at java.util.concurrent.Exchanger.exchange(Exchanger.java:626)
	at com.me.HelloWorld$ByteBufferExchanger.ready(HelloWorld.java:75)
	... 1 more

@sarxos
Owner

sarxos commented Dec 14, 2017

@neilyoung, my first guess is that it's caused by GC, but I will have to test this later to be sure.

@sarxos
Owner

sarxos commented Jan 15, 2018

Hi @neilyoung,

After checking the code I found that this happens when there is no consumer waiting for the image on the other side of the exchanger. That is, the webcam produced an image and sent it to ready(..), but no consumer was waiting on the other side of the exchanger to take it (awaitAndGet() was not invoked before the image was ready).

The solution differs depending on what you would like to achieve, because you can either:

  1. Drop the image and immediately wait for the next one (fine when you accept some frames being dropped), or
  2. Wait some time in the hope that someone consumes it (fine when you accept small delays in the video feed), or
  3. Put it on some exchange queue and leave it there (for when you require no drops and no delays; however, to make this work you have to take special care not to overflow the heap, or your application will be OOM-ed).

To implement 1, simply change the code to:

public void ready(ByteBuffer bb) {
	try {
		exchange(bb, 0, TimeUnit.SECONDS);
	} catch (InterruptedException | TimeoutException e) {
		// do nothing, simply drop frame
	}
}

To implement 2, simply change the code to:

public void ready(ByteBuffer bb) {
	try {
		// wait max 10 seconds for frame to be consumed
		exchange(bb, 10, TimeUnit.SECONDS);
	} catch (InterruptedException | TimeoutException e) {
		throw new IllegalStateException(e);
	}
}

Or a mix of 1 and 2 (this is what I did in the example in the source code, but I wait 500 ms instead of 10 seconds):

public void ready(ByteBuffer bb) {
	try {
		exchange(bb, 500, TimeUnit.MILLISECONDS);
	} catch (InterruptedException | TimeoutException e) {
		// do nothing, frame is dropped
	}
}
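The drop-on-timeout behaviour behind option 1 can be observed with a bare java.util.concurrent.Exchanger; the String payloads and method names below are placeholders for the real ByteBuffer plumbing:

```java
import java.util.concurrent.Exchanger;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class ExchangerDropDemo {

	/** Offer a frame; return true if a consumer took it within timeoutMillis, false if it was dropped. */
	public static boolean offer(Exchanger<String> exchanger, String frame, long timeoutMillis) {
		try {
			exchanger.exchange(frame, timeoutMillis, TimeUnit.MILLISECONDS);
			return true;
		} catch (TimeoutException e) {
			return false; // nobody was waiting on the other side: drop the frame
		} catch (InterruptedException e) {
			Thread.currentThread().interrupt();
			return false;
		}
	}

	public static void main(String[] args) throws InterruptedException {
		final Exchanger<String> exchanger = new Exchanger<>();

		// no consumer waiting and a zero timeout: the frame is dropped immediately
		System.out.println("delivered: " + offer(exchanger, "frame-1", 0));

		// a consumer blocked in exchange(null) picks the next frame up
		final String[] received = new String[1];
		final Thread consumer = new Thread(() -> {
			try {
				received[0] = exchanger.exchange(null);
			} catch (InterruptedException e) {
				Thread.currentThread().interrupt();
			}
		});
		consumer.start();
		System.out.println("delivered: " + offer(exchanger, "frame-2", 2000));
		consumer.join();
		System.out.println("consumer got: " + received[0]);
	}
}
```

This is exactly why the original ready(..) with a zero timeout blew up: exchange() throws TimeoutException as soon as the deadline passes with no partner present, so either the producer drops the frame or it must wait long enough for a consumer to arrive.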

There is no easy solution for 3, but it's doable. If you are interested in concurrent patterns in Java, take a look at this repository. It contains very good examples of how Java concurrent structures can be used in practice:

https://github.com/LeonardoZ/java-concurrency-patterns

@neilyoung
Author

@sarxos Thanks for your patience. I will try to check that out later.
