python - Scrapy: Importing a package from the project that's not in the same directory
I'm trying to import a package from my project that is not in the same directory the Scrapy project is in. The directory structure of the project is as follows:

    main
        __init__.py
        /xpaths
            __init__.py
            xpaths.py
        /scrapper
            scrapy.cfg
            /scrapper
                __init__.py
                settings.py
                items.py
                pipelines.py
                /spiders
                    myspider.py
I'm trying to access xpaths.py from within myspider.py. Here are my attempts:
1) from main.xpaths.xpaths import XpathsHandler
2) from xpaths.xpaths import XpathsHandler
3) from ..xpaths.xpaths import XpathsHandler
All of these failed with the error:

    ImportError: No module named .......
My last attempt was:

4) from ...xpaths.xpaths import XpathsHandler
which failed with the error:

    ValueError: Attempted relative import beyond toplevel package
What am I doing wrong? xpaths is independent of Scrapy, therefore the file structure has to stay this way.
//edit
After further debugging following @alecxe's comment, I tried adding the path of main to sys.path and printing it before importing xpaths. The weird thing is, the scrapper directory gets appended to the path when I run Scrapy. Here's what I added:

    'c:\\users\\laptomer\\code\\python\\pythonbackend\\main'

And here's what shows up when I print sys.path:

    'c:\\users\\laptomer\\code\\python\\pythonbackend\\main\\scrapper'

Why does Scrapy append that path?
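For reference, a minimal sketch of that debugging step at the top of myspider.py (the exact code isn't shown in the question; the hard-coded path is the one quoted above):

    import sys

    # Manually add the directory of the 'main' package to the import path
    sys.path.append('c:\\users\\laptomer\\code\\python\\pythonbackend\\main')

    # Print sys.path to see what Scrapy itself has put on the path
    print(sys.path)

    # With ...\main on sys.path, xpaths should be importable as a top-level package
    from xpaths.xpaths import XpathsHandler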
I know it's a little bit messy, but it's the solution I found when I had the same problem as you. Before importing the files into the project, you need to manually append the top package level to the system path, i.e.:
    import os
    import sys

    sys.path.append(os.path.join(os.path.dirname(__file__), '../..'))
    from xpaths.xpaths import XpathsHandler
    ...
From what I understand, Scrapy creates its own package - that's why you cannot import files from other directories. That explains the error:

    ValueError: Attempted relative import beyond toplevel package
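Putting it together in myspider.py, a rough sketch (assuming the path manipulation is done in the spider itself, which sits three levels below main, so three '..' segments are needed; the spider class and its name are placeholders, not from the question):

    import os
    import sys

    import scrapy

    # spiders/ -> scrapper/ -> scrapper/ -> main/, the folder that contains the xpaths package
    sys.path.append(os.path.join(os.path.dirname(__file__), '..', '..', '..'))

    from xpaths.xpaths import XpathsHandler


    class MySpider(scrapy.Spider):
        name = "myspider"  # placeholder name

        def parse(self, response):
            # Actual usage depends on XpathsHandler's API
            handler = XpathsHandler()
            ...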