(deblur) [zhuge@node006 CPCR]$ python setup.py install
running install
running bdist_egg
running egg_info
creating fastconv_nu.egg-info
writing fastconv_nu.egg-info/PKG-INFO
writing dependency_links to fastconv_nu.egg-info/dependency_links.txt
writing top-level names to fastconv_nu.egg-info/top_level.txt
writing manifest file 'fastconv_nu.egg-info/SOURCES.txt'
/mnt/xfs1/home/zhuge/anaconda3/envs/deblur/lib/python3.6/site-packages/torch/utils/cpp_extension.py:352: UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
warnings.warn(msg.format('we could not find ninja.'))
reading manifest file 'fastconv_nu.egg-info/SOURCES.txt'
writing manifest file 'fastconv_nu.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_ext
/mnt/xfs1/home/zhuge/anaconda3/envs/deblur/lib/python3.6/site-packages/torch/utils/cpp_extension.py:294: UserWarning:
!! WARNING !!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Your compiler (g++ 4.8.5) may be ABI-incompatible with PyTorch!
Please use a compiler that is ABI-compatible with GCC 5.0 and above.
See https://gcc.gnu.org/onlinedocs/libstdc++/manual/abi.html.
See https://gist.github.com/goldsborough/d466f43e8ffc948ff92de7486c5216d6
for instructions on how to install GCC 5 or higher.
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
warnings.warn(ABI_INCOMPATIBILITY_WARNING.format(compiler))
building 'custom_conv' extension
creating build
creating build/temp.linux-x86_64-3.6
creating build/temp.linux-x86_64-3.6/src
gcc -pthread -B /mnt/xfs1/home/zhuge/anaconda3/envs/deblur/compiler_compat -Wl,--sysroot=/ -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/mnt/xfs1/home/zhuge/anaconda3/envs/deblur/lib/python3.6/site-packages/torch/include -I/mnt/xfs1/home/zhuge/anaconda3/envs/deblur/lib/python3.6/site-packages/torch/include/torch/csrc/api/include -I/mnt/xfs1/home/zhuge/anaconda3/envs/deblur/lib/python3.6/site-packages/torch/include/TH -I/mnt/xfs1/home/zhuge/anaconda3/envs/deblur/lib/python3.6/site-packages/torch/include/THC -I/cm/shared/apps/cuda10.1/toolkit/10.1.243/include -I/mnt/xfs1/home/zhuge/anaconda3/envs/deblur/include/python3.6m -c src/nu_conv_cuda.cpp -o build/temp.linux-x86_64-3.6/src/nu_conv_cuda.o -std=c++14 -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=custom_conv -D_GLIBCXX_USE_CXX11_ABI=0
gcc: error: unrecognized command line option ‘-std=c++14’
error: command 'gcc' failed with exit status 1
(deblur) [zhuge@node006 CPCR]$ gcc --version
gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-11)
Copyright (C) 2015 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
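
The root cause is visible at the end of the log: the system compiler is GCC 4.8.5, which does not understand -std=c++14 (support arrived in GCC 5) and is also flagged by PyTorch as potentially ABI-incompatible. A possible workaround is sketched below. It assumes either that the conda-forge channel is reachable from the cluster or that Red Hat's devtoolset software collections can be installed; neither is confirmed by the log, and the compiler binary names shown are illustrative.

# Option A (assumes conda-forge is reachable): install a GCC >= 5 toolchain into the env
conda install -n deblur -c conda-forge gxx_linux-64 ninja

# Option B (assumes a CentOS/RHEL host with Software Collections): enable devtoolset instead
# yum install centos-release-scl devtoolset-7-gcc devtoolset-7-gcc-c++
# scl enable devtoolset-7 bash

# Point the build at the newer compilers; the binary names below are the ones shipped by
# the conda gcc_linux-64/gxx_linux-64 packages and may differ on your installation
export CC=$CONDA_PREFIX/bin/x86_64-conda_cos6-linux-gnu-gcc
export CXX=$CONDA_PREFIX/bin/x86_64-conda_cos6-linux-gnu-g++
$CXX --version        # should now report 5.0 or newer

# Clean and rebuild the extension
python setup.py clean --all
python setup.py install

With a GCC 5+ compiler in use, the -D_GLIBCXX_USE_CXX11_ABI=0 define that already appears in the compile line keeps the extension on the same C++ ABI as the prebuilt PyTorch wheel, so the ABI warning should also disappear; installing ninja is optional and only removes the "slow distutils backend" fallback warning.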