PyTorch Error: The "freeze_support()" line can be omitted if the program is not going to be frozen to produce an executable.
The error output is as follows:

RuntimeError: 
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.
 
        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:
 
            if __name__ == '__main__':
                freeze_support()
                ...
 
        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.

Solution

Put the code block you want to run under a main guard:

if __name__ == '__main__':
    # your code (runs only in the main process, not in re-imported children)
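
For example, a minimal runnable sketch (the worker function and its argument are placeholders for illustration):

import multiprocessing as mp

def worker(n):
    print(f"worker got {n}")

if __name__ == '__main__':
    # Only the script that is run directly enters this block; a spawned
    # child re-imports the module but skips this block, so no process is
    # created recursively.
    p = mp.Process(target=worker, args=(42,))
    p.start()
    p.join()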

Root cause:

The original question:

import Queue  # Python 2; this module is named "queue" in Python 3
from multiprocessing.managers import BaseManager

BaseManager.register('get_queue', callable=lambda: Queue.Queue())

manager = BaseManager(address=('', 5000), authkey='abc')
manager.start()
manager.shutdown()

This code will throw an exception:

RuntimeError: 
        Attempt to start a new process before the current process
        has finished its bootstrapping phase.

        This probably means that you are on Windows and you have
        forgotten to use the proper idiom in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce a Windows executable.

The answer:
This error message is displayed when using multiprocessing with the 'spawn' start method (the default on platforms that lack fork, such as Windows) without protecting your code with an if __name__ == '__main__' guard.

The reason is that with the 'spawn' start method a new Python process is spawned, which in turn has to import the main module before it can proceed to do its work. If your program does not have the mentioned guard, that subprocess would try to execute the same code as the parent process again, spawning another process, and so on, until your program (or computer) crashes.

The message is not meant to tell you to add the freeze_support() line, but to guard your program:

import Queue
from multiprocessing.managers import BaseManager

def main():
    BaseManager.register('get_queue', callable=lambda: Queue.Queue())

    manager = BaseManager(address=('', 5000), authkey='abc')
    manager.start()
    manager.shutdown()

if __name__ == '__main__':
    # freeze_support() here if program needs to be frozen
    main()  # execute this only when run directly, not when imported!

Source: https://stackoverflow.com/questions/29690091/python2-7-exception-the-freeze-support-line-can-be-omitted-if-the-program
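
The quoted question and answer are Python 2.7 code; a Python 3 equivalent would look like this (the Queue module was renamed to queue, and authkey must be a bytes object):

import queue
from multiprocessing.managers import BaseManager

def main():
    BaseManager.register('get_queue', callable=lambda: queue.Queue())

    # authkey must be bytes in Python 3
    manager = BaseManager(address=('', 5000), authkey=b'abc')
    manager.start()
    manager.shutdown()

if __name__ == '__main__':
    main()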

In other words, the problem is that PyTorch multiprocessing was used, as below, without an if __name__ == '__main__': guard to ensure that only the main process executes the statements under it. Each spawned child process re-imports the main module, hits the same spawn call, and starts creating child processes of its own just like the parent, which causes the error.

import torch.multiprocessing as mp

mp.spawn(demo_fn,
         args=(model, world_size, num_epochs),
         nprocs=world_size,
         join=True)

To run PyTorch multiprocessing, the program should be structured like the example below:

import torch.multiprocessing as mp
# ... other imports ...

# ... definitions of the model and training function (train_model, model_ft, world_size, ...) ...

def run(demo_fn, model, world_size, num_epochs):
    mp.spawn(demo_fn,
             args=(model, world_size, num_epochs),
             nprocs=world_size,
             join=True)

if __name__ == '__main__':
    run(train_model, model_ft, world_size, 300)
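
A minimal, self-contained sketch of this structure (demo_fn here is a hypothetical stand-in; note that mp.spawn calls the target as demo_fn(i, *args), passing the process index as the first argument):

import torch.multiprocessing as mp

def demo_fn(rank, world_size, num_epochs):
    # mp.spawn invokes this with rank = 0 .. nprocs-1
    print(f"process {rank} of {world_size}, training for {num_epochs} epochs")

def run(world_size, num_epochs):
    mp.spawn(demo_fn,
             args=(world_size, num_epochs),
             nprocs=world_size,
             join=True)

if __name__ == '__main__':
    run(2, 300)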
