When running the exact vignette script (https://cran.r-project.org/web/packages/CopernicusDEM/vignettes/Copernicus_Digital_Elevation_Models.html) on Windows 11 Home, with R version 4.4.1 and the newest version of tmap (3.99.9003), I found that the leaflet map was produced without points and that the tmap maps differed from the examples in the vignette: the lft command yielded the CopernicusDEM AOI map but without the points, and the tmap maps were not colored by 'individual local identifier'.
Here is the readout when running the script:
Processing of the 'Alberta_Wolves.csv' file ...
Convert the data.table to an 'sf' object ...
Transform the projection of the 'sf' object from 4326 to 7801 ...
Create a buffer of 250 meters using as input the initial sf object ...
Back-tranformation of the projection and computation of the bounding box ...
Use the bounding box to extract the raster extent ...
Compute the centroid of the sf-buffer object ...
Elapsed time: 0 hours and 0 minutes and 0 seconds.
Parallel download of the 6 .tif files using 4 threads starts ...
Elapsed time: 0 hours and 1 minutes and 29 seconds.
The VRT Mosaic will be built from 6 '.tif' files and will be saved in 'C:\Users\APD\AppData\Local\Temp\RtmpABfRIS/VRT_mosaic_FILE.vrt' ...
0...10...20...30...40...50...60...70...80...90...100 - done.
Elapsed time: 0 hours and 0 minutes and 0 seconds.
The raster will be uploaded ...
The 'sf' object will be converted to a spatial vector ...
The raster will be cropped ...
ℹ tmap mode set to "view".
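For context, the messages above correspond to a standard data.table/sf/terra chain. A rough sketch of those steps follows; the file, object, and column names are illustrative assumptions, not the vignette's exact code:

```r
## Sketch of the logged steps; assumes Movebank-style coordinate columns
## ("location.long" / "location.lat") and standard sf/terra calls.
library(data.table)
library(sf)
library(terra)

dat <- data.table::fread("Alberta_Wolves.csv")

# convert the data.table to an 'sf' object (EPSG:4326 lon/lat)
pts <- sf::st_as_sf(dat, coords = c("location.long", "location.lat"),
                    crs = 4326)

# transform the projection from 4326 to 7801 so the buffer distance is in meters
pts_7801 <- sf::st_transform(pts, crs = 7801)

# create a 250-meter buffer around the points
buf <- sf::st_buffer(pts_7801, dist = 250)

# back-transform the projection and compute the bounding box (the raster extent)
buf_4326 <- sf::st_transform(buf, crs = 4326)
bbx <- sf::st_bbox(buf_4326)

# compute the centroid of the sf-buffer object
cntr <- sf::st_centroid(sf::st_union(buf_4326))

# build a VRT mosaic from the downloaded '.tif' tiles (GDAL buildvrt via sf)
tif_files <- list.files(tempdir(), pattern = "\\.tif$", full.names = TRUE)
vrt <- file.path(tempdir(), "VRT_mosaic_FILE.vrt")
sf::gdal_utils(util = "buildvrt", source = tif_files, destination = vrt)

# load the mosaic and crop it to the buffered area
rst <- terra::rast(vrt)
rst_crop <- terra::crop(rst, terra::vect(buf_4326))
```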
Processing of the 'Mountain_caribou.csv' file ...
Convert the data.table to an 'sf' object ...
Transform the projection of the 'sf' object from 4326 to 7801 ...
Create a buffer of 250 meters using as input the initial sf object ...
Back-tranformation of the projection and computation of the bounding box ...
Use the bounding box to extract the raster extent ...
Compute the centroid of the sf-buffer object ...
Elapsed time: 0 hours and 0 minutes and 0 seconds.
Parallel download of the 16 .tif files using 4 threads starts ...
Elapsed time: 0 hours and 2 minutes and 23 seconds.
The VRT Mosaic will be built from 22 '.tif' files and will be saved in 'C:\Users\APD\AppData\Local\Temp\RtmpABfRIS/VRT_mosaic_FILE.vrt' ...
0...10...20...30...40...50...60...70...80...90...100 - done.
Elapsed time: 0 hours and 0 minutes and 0 seconds.
The raster will be uploaded ...
The 'sf' object will be converted to a spatial vector ...
The raster will be cropped ...
ℹ tmap mode set to "view".
Registered S3 method overwritten by 'jsonify':
  method     from
  print.json jsonlite
lft
tmap_data$`Rangifer tarandus`  # caribou
SpatRaster object downsampled to 4206 by 2379 cells.
Linking to GEOS 3.12.1, GDAL 3.8.4, PROJ 9.3.1; sf_use_s2() is FALSE
Warning message:
Number of levels of the variable assigned to the aesthetic "col" of the layer "dotssymbols" is 138, which is larger than n.max (which is 30), so levels are combined.
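For reference, this warning is tmap v4 capping categorical legends at n.max = 30 levels, which is why the 138 animal identifiers lose their individual colors. A minimal sketch of lifting the cap, assuming tmap >= 3.99, an sf points object `pts`, and the Movebank-style column name "individual.local.identifier" (none of these are necessarily the vignette's exact names):

```r
## Sketch only: 'pts' and the column name are assumptions, not the
## vignette's objects. tmap v4 configures scales per aesthetic.
library(tmap)
tmap_mode("view")

tm_shape(pts) +
  tm_dots(col = "individual.local.identifier",
          col.scale = tm_scale_categorical(n.max = 140))  # default n.max is 30
```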
Thank you!
I just updated the code on GitHub and submitted the new version (1.0.5) to CRAN. As mentioned in the NEWS.md file, I replaced tmap, leaflet, and leafgl with the mapview R package.
You can first install the latest version of mapview using:
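```r
## Assumption: "latest version" refers to the development version on GitHub,
## and r-spatial/mapview is the repository meant.
# install.packages("remotes")
remotes::install_github("r-spatial/mapview")

## then update CopernicusDEM itself once 1.0.5 is on CRAN
install.packages("CopernicusDEM")
```

With mapview, coloring the points by animal would presumably look like the following (hypothetical usage, not the updated vignette's exact code; the column name is assumed from the Movebank CSV headers):

```r
library(mapview)
mapview(pts, zcol = "individual.local.identifier")
```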