Originally published on melvinkoh.me
Would love to see more of such stories
Thanks! Your article comes up #3 on Google for "python concurrency" for me. It must be getting a lot of reads.
I think it makes the message even clearer if you use two functions, as below. I initially misinterpreted the example: I carelessly mistook x for a shared variable and assumed Python only imposes some kind of extra overhead on shared memory (which would not be a big deal). Embarrassing, I know, especially for a CS PhD; I haven't done concurrent programming in a while.
import threading

def countdown():
    x = 100000000
    while x > 0:
        x -= 1

def countup():
    y = 0
    while y < 100000000:
        y += 1

# Implementation 1: Multi-threading
def implementation_1():
    thread_1 = threading.Thread(target=countdown)
    thread_2 = threading.Thread(target=countup)
    thread_1.start()
    thread_2.start()
    thread_1.join()
    thread_2.join()

# Implementation 2: Run in serial
def implementation_2():
    countdown()
    countup()
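To actually see the GIL's effect, one can time both implementations side by side. This is a minimal sketch of my own (the helper names timed, threaded, and serial are mine, and the counts are reduced so the run stays short); on CPython the threaded version is typically no faster, and often slower, because the GIL lets only one thread execute bytecode at a time.

```python
import threading
import time

N = 10_000_000  # reduced count so the comparison runs quickly

def countdown():
    x = N
    while x > 0:
        x -= 1

def countup():
    y = 0
    while y < N:
        y += 1

def timed(fn):
    # Return the wall-clock duration of fn() in seconds.
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

def threaded():
    # Two threads, but the GIL serializes their bytecode execution.
    t1 = threading.Thread(target=countdown)
    t2 = threading.Thread(target=countup)
    t1.start()
    t2.start()
    t1.join()
    t2.join()

def serial():
    countdown()
    countup()

if __name__ == "__main__":
    print(f"threads: {timed(threaded):.2f}s  serial: {timed(serial):.2f}s")
```

Exact numbers vary by machine and interpreter version; the point is simply that the threaded run does not beat the serial one for CPU-bound work under the GIL.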
Apparently PyPy is not a fix: http://doc.pypy.org/en/latest/faq.html#does-pypy-have-a-gil-why
They suggest the STM version, but I haven’t tried it: http://doc.pypy.org/en/latest/stm.html#python-3-cpython-and-others