{"diffoscope-json-version": 1, "source1": "/srv/reproducible-results/rbuild-debian/r-b-build.7Ww0J1yU/b1/python-xarray_2025.01.2-1_amd64.changes", "source2": "/srv/reproducible-results/rbuild-debian/r-b-build.7Ww0J1yU/b2/python-xarray_2025.01.2-1_amd64.changes", "unified_diff": null, "details": [{"source1": "Files", "source2": "Files", "unified_diff": "@@ -1,3 +1,3 @@\n \n- 0498ec7f50853bbeffc8d052b694ab7f 2751776 doc optional python-xarray-doc_2025.01.2-1_all.deb\n+ c3c727b28c2d2ac9cfbdf090b7f86954 2751900 doc optional python-xarray-doc_2025.01.2-1_all.deb\n 4644c3352e568f782718f7d018211ae7 799852 python optional python3-xarray_2025.01.2-1_all.deb\n"}, {"source1": "python-xarray-doc_2025.01.2-1_all.deb", "source2": "python-xarray-doc_2025.01.2-1_all.deb", "unified_diff": null, "details": [{"source1": "file list", "source2": "file list", "unified_diff": "@@ -1,3 +1,3 @@\n -rw-r--r-- 0 0 0 4 2025-02-02 11:36:57.000000 debian-binary\n--rw-r--r-- 0 0 0 6364 2025-02-02 11:36:57.000000 control.tar.xz\n--rw-r--r-- 0 0 0 2745220 2025-02-02 11:36:57.000000 data.tar.xz\n+-rw-r--r-- 0 0 0 6368 2025-02-02 11:36:57.000000 control.tar.xz\n+-rw-r--r-- 0 0 0 2745340 2025-02-02 11:36:57.000000 data.tar.xz\n"}, {"source1": "control.tar.xz", "source2": "control.tar.xz", "unified_diff": null, "details": [{"source1": "control.tar", "source2": "control.tar", "unified_diff": null, "details": [{"source1": "./control", "source2": "./control", "unified_diff": "@@ -1,13 +1,13 @@\n Package: python-xarray-doc\n Source: python-xarray\n Version: 2025.01.2-1\n Architecture: all\n Maintainer: Debian Science Maintainers Let\u2019s create a simple plot of 2-m air temperature in degrees Celsius: Write equations to calculate the vertical coordinate. These will be only evaluated when data is requested. 
Information about the ROMS vertical coordinate can be found [here](https://www.myroms.org/wiki/Vertical_S-coordinate) In short, for The function we will apply is Plot the first timestep: We first have to come up with the weights, - calculate the month length for each monthly data record - calculate weights using Finally, we just need to multiply our weights by the In this example, the logical coordinates are Control the map projection parameters on multiple axes This example illustrates how to plot multiple maps and control their extent and aspect ratio. For more details see this discussion on GitHub. Visualizing your datasets is quick and convenient: Note the automatic labeling with names and units. Our effort in adding metadata attributes has paid off! Many aspects of these figures are customizable: see Plotting. We can also fit multi-dimensional functions, and even use a wrapper function to\n simultaneously fit a summation of several functions, such as this field containing\n two Gaussian peaks: Note This method replicates the behavior of [3]:\n
\n-Error in callback <function _draw_all_if_interactive at 0x7f0d1cd74fe0> (for post_execute), with arguments args (),kwargs {}:\n+Error in callback <function _draw_all_if_interactive at 0x7f3fcd2ecfe0> (for post_execute), with arguments args (),kwargs {}:\n
\n ---------------------------------------------------------------------------\n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy/shapefiles/natural_earth/physical'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy/shapefiles/natural_earth/physical'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy/shapefiles/natural_earth'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy/shapefiles/natural_earth'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy/shapefiles'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy/shapefiles'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n 
\n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build'\n \n During handling of the above exception, another exception occurred:\n \n PermissionError Traceback (most recent call last)\n File /usr/lib/python3/dist-packages/matplotlib/pyplot.py:197, in _draw_all_if_interactive()\n 195 def _draw_all_if_interactive() -> None:\n 196 if matplotlib.is_interactive():\n@@ -530,75 +530,75 
@@\n ---------------------------------------------------------------------------\n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy/shapefiles/natural_earth/physical'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy/shapefiles/natural_earth/physical'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy/shapefiles/natural_earth'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy/shapefiles/natural_earth'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy/shapefiles'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy/shapefiles'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n 
\n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share/cartopy'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share/cartopy'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local/share'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local/share'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build/.local'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build/.local'\n \n During handling of the above exception, another exception occurred:\n \n FileNotFoundError Traceback (most recent call last)\n File /usr/lib/python3.13/pathlib/_local.py:724, in Path.mkdir(self, mode, parents, exist_ok)\n 723 try:\n --> 724 os.mkdir(self, mode)\n 725 except FileNotFoundError:\n \n-FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/first-build'\n+FileNotFoundError: [Errno 2] No such file or directory: '/nonexistent/second-build'\n \n During handling of the above exception, another exception occurred:\n \n PermissionError Traceback (most recent call last)\n File /usr/lib/python3/dist-packages/IPython/core/formatters.py:402, in BaseFormatter.__call__(self, obj)\n 400 pass\n 401 else:\n", "details": [{"source1": "html2text {}", "source2": "html2text 
{}", "unified_diff": "@@ -99,15 +99,15 @@\n 273 message.append(\n 274 f\"Use environment variable '{env}' to specify a different\n location.\"\n 275 )\n --> 276 raise PermissionError(\" \".join(message)) from error\n \n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. Will not be able to download data files.\n Let\u2019s create a simple plot of 2-m air temperature in degrees Celsius:\n [3]:\n ds = ds - 273.15\n ds.t2m[0].plot(cmap=plt.cm.coolwarm)\n ---------------------------------------------------------------------------\n NameError Traceback (most recent call last)\n@@ -138,97 +138,97 @@\n ----> 7 plot = ds.t2m[0].plot(\n 8 cmap=plt.cm.coolwarm, transform=ccrs.PlateCarree(), cbar_kwargs=\n {\"shrink\": 0.6}\n 9 )\n 10 plt.title(\"ERA5 - 2m temperature British Isles March 2019\")\n \n NameError: name 'ds' is not defined\n-Error in callback
Add a lazily calculated vertical coordinate\u00b6
\n Vtransform==2
as used in this example,np.interp
which expects 1D NumPy arrays. This functionality is already implemented in xarray, so we use that capability to make sure we are not making mistakes.[2]:\n
[3]:\n
[ ]:\n
\n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -99,15 +99,15 @@\n 273 message.append(\n 274 f\"Use environment variable '{env}' to specify a different\n location.\"\n 275 )\n --> 276 raise PermissionError(\" \".join(message)) from error\n \n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. Will not be able to download data files.\n [ ]:\n _\b[_\bL_\bo_\bg_\bo_\b _\bo_\bf_\b _\bx_\ba_\br_\br_\ba_\by_\b]\n *\b**\b**\b**\b**\b**\b* _\bx\bx_\ba\ba_\br\br_\br\br_\ba\ba_\by\by *\b**\b**\b**\b**\b**\b*\n *\b**\b**\b**\b* N\bNa\bav\bvi\big\bga\bat\bti\bio\bon\bn *\b**\b**\b**\b*\n For users\n * _\bG_\be_\bt_\bt_\bi_\bn_\bg_\b _\bS_\bt_\ba_\br_\bt_\be_\bd\n"}]}, {"source1": "./usr/share/doc/python-xarray-doc/html/examples/blank_template.ipynb.gz", "source2": "./usr/share/doc/python-xarray-doc/html/examples/blank_template.ipynb.gz", "unified_diff": null, "details": [{"source1": "blank_template.ipynb", "source2": "blank_template.ipynb", "unified_diff": null, "details": [{"source1": "Pretty-printed", "source2": "Pretty-printed", "comments": ["Similarity: 0.9986468545751634%", "Differences: {\"'cells'\": \"{1: {'metadata': {'execution': {'iopub.execute_input': '2025-03-05T03:36:38.709491Z', \"", " \"'iopub.status.busy': '2025-03-05T03:36:38.709018Z', 'iopub.status.idle': \"", " \"'2025-03-05T03:36:41.861727Z', 'shell.execute_reply': \"", " '\\'2025-03-05T03:36:41.859998Z\\'}}, \\'outputs\\': {0: {\\'evalue\\': \"[Errno 13] '", " \"Permission denied: '/nonexistent' | Pooch could not create data cache folder \"", " \"'/nonexistent/second-build/.cache/xarray_tutorial_data'. 
[\u2026]"], "unified_diff": "@@ -12,24 +12,24 @@\n },\n {\n \"cell_type\": \"code\",\n \"execution_count\": 1,\n \"id\": \"41b90ede\",\n \"metadata\": {\n \"execution\": {\n- \"iopub.execute_input\": \"2026-04-07T09:27:07.732799Z\",\n- \"iopub.status.busy\": \"2026-04-07T09:27:07.732555Z\",\n- \"iopub.status.idle\": \"2026-04-07T09:27:08.437060Z\",\n- \"shell.execute_reply\": \"2026-04-07T09:27:08.436143Z\"\n+ \"iopub.execute_input\": \"2025-03-05T03:36:38.709491Z\",\n+ \"iopub.status.busy\": \"2025-03-05T03:36:38.709018Z\",\n+ \"iopub.status.idle\": \"2025-03-05T03:36:41.861727Z\",\n+ \"shell.execute_reply\": \"2025-03-05T03:36:41.859998Z\"\n }\n },\n \"outputs\": [\n {\n \"ename\": \"PermissionError\",\n- \"evalue\": \"[Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\",\n+ \"evalue\": \"[Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\",\n \"output_type\": \"error\",\n \"traceback\": [\n \"\\u001b[0;31m---------------------------------------------------------------------------\\u001b[0m\",\n \"\\u001b[0;31mPermissionError\\u001b[0m Traceback (most recent call last)\",\n \"File \\u001b[0;32m/usr/lib/python3/dist-packages/pooch/utils.py:262\\u001b[0m, in \\u001b[0;36mmake_local_storage\\u001b[0;34m(path, env)\\u001b[0m\\n\\u001b[1;32m 258\\u001b[0m \\u001b[38;5;28;01mif\\u001b[39;00m action \\u001b[38;5;241m==\\u001b[39m \\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124mcreate\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m:\\n\\u001b[1;32m 259\\u001b[0m \\u001b[38;5;66;03m# When running in parallel, it's possible that multiple jobs will\\u001b[39;00m\\n\\u001b[1;32m 260\\u001b[0m \\u001b[38;5;66;03m# try to create the path at the same time. 
Use exist_ok to avoid\\u001b[39;00m\\n\\u001b[1;32m 261\\u001b[0m \\u001b[38;5;66;03m# raising an error.\\u001b[39;00m\\n\\u001b[0;32m--> 262\\u001b[0m \\u001b[43mos\\u001b[49m\\u001b[38;5;241;43m.\\u001b[39;49m\\u001b[43mmakedirs\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[43mpath\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mexist_ok\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[38;5;28;43;01mTrue\\u001b[39;49;00m\\u001b[43m)\\u001b[49m\\n\\u001b[1;32m 263\\u001b[0m \\u001b[38;5;28;01melse\\u001b[39;00m:\\n\",\n \"File \\u001b[0;32m/usr/lib/python3.13/os.py:217\\u001b[0m, in \\u001b[0;36mmakedirs\\u001b[0;34m(name, mode, exist_ok)\\u001b[0m\\n\\u001b[1;32m 216\\u001b[0m \\u001b[38;5;28;01mtry\\u001b[39;00m:\\n\\u001b[0;32m--> 217\\u001b[0m \\u001b[43mmakedirs\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[43mhead\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mexist_ok\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mexist_ok\\u001b[49m\\u001b[43m)\\u001b[49m\\n\\u001b[1;32m 218\\u001b[0m \\u001b[38;5;28;01mexcept\\u001b[39;00m \\u001b[38;5;167;01mFileExistsError\\u001b[39;00m:\\n\\u001b[1;32m 219\\u001b[0m \\u001b[38;5;66;03m# Defeats race condition when another thread created the path\\u001b[39;00m\\n\",\n \"File \\u001b[0;32m/usr/lib/python3.13/os.py:217\\u001b[0m, in \\u001b[0;36mmakedirs\\u001b[0;34m(name, mode, exist_ok)\\u001b[0m\\n\\u001b[1;32m 216\\u001b[0m \\u001b[38;5;28;01mtry\\u001b[39;00m:\\n\\u001b[0;32m--> 217\\u001b[0m \\u001b[43mmakedirs\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[43mhead\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mexist_ok\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mexist_ok\\u001b[49m\\u001b[43m)\\u001b[49m\\n\\u001b[1;32m 218\\u001b[0m \\u001b[38;5;28;01mexcept\\u001b[39;00m \\u001b[38;5;167;01mFileExistsError\\u001b[39;00m:\\n\\u001b[1;32m 219\\u001b[0m \\u001b[38;5;66;03m# Defeats race condition when another thread created the 
path\\u001b[39;00m\\n\",\n@@ -39,15 +39,15 @@\n \"\\nThe above exception was the direct cause of the following exception:\\n\",\n \"\\u001b[0;31mPermissionError\\u001b[0m Traceback (most recent call last)\",\n \"Cell \\u001b[0;32mIn[1], line 5\\u001b[0m\\n\\u001b[1;32m 2\\u001b[0m \\u001b[38;5;28;01mimport\\u001b[39;00m \\u001b[38;5;21;01mnumpy\\u001b[39;00m \\u001b[38;5;28;01mas\\u001b[39;00m \\u001b[38;5;21;01mnp\\u001b[39;00m\\n\\u001b[1;32m 3\\u001b[0m \\u001b[38;5;28;01mimport\\u001b[39;00m \\u001b[38;5;21;01mpandas\\u001b[39;00m \\u001b[38;5;28;01mas\\u001b[39;00m \\u001b[38;5;21;01mpd\\u001b[39;00m\\n\\u001b[0;32m----> 5\\u001b[0m ds \\u001b[38;5;241m=\\u001b[39m \\u001b[43mxr\\u001b[49m\\u001b[38;5;241;43m.\\u001b[39;49m\\u001b[43mtutorial\\u001b[49m\\u001b[38;5;241;43m.\\u001b[39;49m\\u001b[43mload_dataset\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[38;5;124;43m\\\"\\u001b[39;49m\\u001b[38;5;124;43mair_temperature\\u001b[39;49m\\u001b[38;5;124;43m\\\"\\u001b[39;49m\\u001b[43m)\\u001b[49m\\n\\u001b[1;32m 6\\u001b[0m da \\u001b[38;5;241m=\\u001b[39m ds[\\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124mair\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m]\\n\",\n \"File \\u001b[0;32m/usr/lib/python3/dist-packages/xarray/tutorial.py:213\\u001b[0m, in \\u001b[0;36mload_dataset\\u001b[0;34m(*args, **kwargs)\\u001b[0m\\n\\u001b[1;32m 176\\u001b[0m \\u001b[38;5;28;01mdef\\u001b[39;00m \\u001b[38;5;21mload_dataset\\u001b[39m(\\u001b[38;5;241m*\\u001b[39margs, \\u001b[38;5;241m*\\u001b[39m\\u001b[38;5;241m*\\u001b[39mkwargs) \\u001b[38;5;241m-\\u001b[39m\\u001b[38;5;241m>\\u001b[39m Dataset:\\n\\u001b[1;32m 177\\u001b[0m \\u001b[38;5;250m \\u001b[39m\\u001b[38;5;124;03m\\\"\\\"\\\"\\u001b[39;00m\\n\\u001b[1;32m 178\\u001b[0m \\u001b[38;5;124;03m Open, load into memory, and close a dataset from the online repository\\u001b[39;00m\\n\\u001b[1;32m 179\\u001b[0m \\u001b[38;5;124;03m (requires internet).\\u001b[39;00m\\n\\u001b[0;32m (...)\\u001b[0m\\n\\u001b[1;32m 211\\u001b[0m 
\\u001b[38;5;124;03m load_dataset\\u001b[39;00m\\n\\u001b[1;32m 212\\u001b[0m \\u001b[38;5;124;03m \\\"\\\"\\\"\\u001b[39;00m\\n\\u001b[0;32m--> 213\\u001b[0m \\u001b[38;5;28;01mwith\\u001b[39;00m \\u001b[43mopen_dataset\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[38;5;241;43m*\\u001b[39;49m\\u001b[43margs\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[38;5;241;43m*\\u001b[39;49m\\u001b[38;5;241;43m*\\u001b[39;49m\\u001b[43mkwargs\\u001b[49m\\u001b[43m)\\u001b[49m \\u001b[38;5;28;01mas\\u001b[39;00m ds:\\n\\u001b[1;32m 214\\u001b[0m \\u001b[38;5;28;01mreturn\\u001b[39;00m ds\\u001b[38;5;241m.\\u001b[39mload()\\n\",\n \"File \\u001b[0;32m/usr/lib/python3/dist-packages/xarray/tutorial.py:165\\u001b[0m, in \\u001b[0;36mopen_dataset\\u001b[0;34m(name, cache, cache_dir, engine, **kws)\\u001b[0m\\n\\u001b[1;32m 162\\u001b[0m downloader \\u001b[38;5;241m=\\u001b[39m pooch\\u001b[38;5;241m.\\u001b[39mHTTPDownloader(headers\\u001b[38;5;241m=\\u001b[39mheaders)\\n\\u001b[1;32m 164\\u001b[0m \\u001b[38;5;66;03m# retrieve the file\\u001b[39;00m\\n\\u001b[0;32m--> 165\\u001b[0m filepath \\u001b[38;5;241m=\\u001b[39m \\u001b[43mpooch\\u001b[49m\\u001b[38;5;241;43m.\\u001b[39;49m\\u001b[43mretrieve\\u001b[49m\\u001b[43m(\\u001b[49m\\n\\u001b[1;32m 166\\u001b[0m \\u001b[43m \\u001b[49m\\u001b[43murl\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43murl\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mknown_hash\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[38;5;28;43;01mNone\\u001b[39;49;00m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mpath\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mcache_dir\\u001b[49m\\u001b[43m,\\u001b[49m\\u001b[43m \\u001b[49m\\u001b[43mdownloader\\u001b[49m\\u001b[38;5;241;43m=\\u001b[39;49m\\u001b[43mdownloader\\u001b[49m\\n\\u001b[1;32m 167\\u001b[0m \\u001b[43m\\u001b[49m\\u001b[43m)\\u001b[49m\\n\\u001b[1;32m 168\\u001b[0m ds \\u001b[38;5;241m=\\u001b[39m _open_dataset(filepath, 
engine\\u001b[38;5;241m=\\u001b[39mengine, \\u001b[38;5;241m*\\u001b[39m\\u001b[38;5;241m*\\u001b[39mkws)\\n\\u001b[1;32m 169\\u001b[0m \\u001b[38;5;28;01mif\\u001b[39;00m \\u001b[38;5;129;01mnot\\u001b[39;00m cache:\\n\",\n \"File \\u001b[0;32m/usr/lib/python3/dist-packages/pooch/core.py:227\\u001b[0m, in \\u001b[0;36mretrieve\\u001b[0;34m(url, known_hash, fname, path, processor, downloader, progressbar)\\u001b[0m\\n\\u001b[1;32m 222\\u001b[0m action, verb \\u001b[38;5;241m=\\u001b[39m download_action(full_path, known_hash)\\n\\u001b[1;32m 224\\u001b[0m \\u001b[38;5;28;01mif\\u001b[39;00m action \\u001b[38;5;129;01min\\u001b[39;00m (\\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124mdownload\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m, \\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124mupdate\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m):\\n\\u001b[1;32m 225\\u001b[0m \\u001b[38;5;66;03m# We need to write data, so create the local data directory if it\\u001b[39;00m\\n\\u001b[1;32m 226\\u001b[0m \\u001b[38;5;66;03m# doesn't already exist.\\u001b[39;00m\\n\\u001b[0;32m--> 227\\u001b[0m \\u001b[43mmake_local_storage\\u001b[49m\\u001b[43m(\\u001b[49m\\u001b[43mpath\\u001b[49m\\u001b[43m)\\u001b[49m\\n\\u001b[1;32m 229\\u001b[0m get_logger()\\u001b[38;5;241m.\\u001b[39minfo(\\n\\u001b[1;32m 230\\u001b[0m \\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;132;01m%s\\u001b[39;00m\\u001b[38;5;124m data from \\u001b[39m\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;132;01m%s\\u001b[39;00m\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;124m to file \\u001b[39m\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;132;01m%s\\u001b[39;00m\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;124m.\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m,\\n\\u001b[1;32m 231\\u001b[0m verb,\\n\\u001b[1;32m 232\\u001b[0m url,\\n\\u001b[1;32m 233\\u001b[0m \\u001b[38;5;28mstr\\u001b[39m(full_path),\\n\\u001b[1;32m 234\\u001b[0m )\\n\\u001b[1;32m 236\\u001b[0m \\u001b[38;5;28;01mif\\u001b[39;00m downloader \\u001b[38;5;129;01mis\\u001b[39;00m 
\\u001b[38;5;28;01mNone\\u001b[39;00m:\\n\",\n \"File \\u001b[0;32m/usr/lib/python3/dist-packages/pooch/utils.py:276\\u001b[0m, in \\u001b[0;36mmake_local_storage\\u001b[0;34m(path, env)\\u001b[0m\\n\\u001b[1;32m 272\\u001b[0m \\u001b[38;5;28;01mif\\u001b[39;00m env \\u001b[38;5;129;01mis\\u001b[39;00m \\u001b[38;5;129;01mnot\\u001b[39;00m \\u001b[38;5;28;01mNone\\u001b[39;00m:\\n\\u001b[1;32m 273\\u001b[0m message\\u001b[38;5;241m.\\u001b[39mappend(\\n\\u001b[1;32m 274\\u001b[0m \\u001b[38;5;124mf\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124mUse environment variable \\u001b[39m\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;132;01m{\\u001b[39;00menv\\u001b[38;5;132;01m}\\u001b[39;00m\\u001b[38;5;124m'\\u001b[39m\\u001b[38;5;124m to specify a different location.\\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m\\n\\u001b[1;32m 275\\u001b[0m )\\n\\u001b[0;32m--> 276\\u001b[0m \\u001b[38;5;28;01mraise\\u001b[39;00m \\u001b[38;5;167;01mPermissionError\\u001b[39;00m(\\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;124m \\u001b[39m\\u001b[38;5;124m\\\"\\u001b[39m\\u001b[38;5;241m.\\u001b[39mjoin(message)) \\u001b[38;5;28;01mfrom\\u001b[39;00m \\u001b[38;5;21;01merror\\u001b[39;00m\\n\",\n- \"\\u001b[0;31mPermissionError\\u001b[0m: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\"\n+ \"\\u001b[0;31mPermissionError\\u001b[0m: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. 
Will not be able to download data files.\"\n ]\n }\n ],\n \"source\": [\n \"import xarray as xr\\n\",\n \"import numpy as np\\n\",\n \"import pandas as pd\\n\",\n"}]}]}, {"source1": "./usr/share/doc/python-xarray-doc/html/examples/monthly-means.html", "source2": "./usr/share/doc/python-xarray-doc/html/examples/monthly-means.html", "unified_diff": "@@ -159,15 +159,15 @@\n File /usr/lib/python3/dist-packages/pooch/utils.py:276, in make_local_storage(path, env)\n 272 if env is not None:\n 273 message.append(\n 274 f"Use environment variable '{env}' to specify a different location."\n 275 )\n --> 276 raise PermissionError(" ".join(message)) from error\n \n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n
Now for the heavy lifting:\u00b6
\n groupby('time.season')
Dataset
and sum along the time dimension. Creating a DataArray
for the month length is as easy as using the days_in_month
accessor on the time coordinate. The calendar type, in this case 'noleap'
, is automatically considered in this operation.x
and y
, while the physical coordinates are xc
and yc
, which represent the longitudes and latitudes of the data.[3]:\n
Multiple plots and map projections\u00b6
\n <xarray.Dataset> Size: 41kB\n Dimensions: (time: 731, location: 3)\n Coordinates:\n * time (time) datetime64[ns] 6kB 2000-01-01 2000-01-02 ... 2001-12-31\n * location (location) <U2 24B 'IA' 'IN' 'IL'\n Data variables:\n tmin (time, location) float64 18kB -8.037 -1.788 ... -1.346 -4.544\n- tmax (time, location) float64 18kB 12.98 3.31 6.779 ... 3.343 3.805
PandasIndex(Index(['IA', 'IN', 'IL'], dtype='object', name='location'))
Examine a dataset with pandas and seaborn\u00b6
\n Convert to a pandas DataFrame\u00b6
\n [2]:\n@@ -697,15 +697,15 @@\n
[5]:\n
\n-<seaborn.axisgrid.PairGrid at 0x7f998cabf770>\n+<seaborn.axisgrid.PairGrid at 0x7f0b76afb770>\n
\n@@ -1110,26 +1110,26 @@\n [0. , 0. , 0. ],\n [0. , 0. , 0. ],\n [0. , 0.01612903, 0. ],\n [0.33333333, 0.35 , 0.23333333],\n [0.93548387, 0.85483871, 0.82258065]])\n Coordinates:\n * location (location) <U2 24B 'IA' 'IN' 'IL'\n- * month (month) int64 96B 1 2 3 4 5 6 7 8 9 10 11 12
array(['IA', 'IN', 'IL'], dtype='<U2')
array([ 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12])
PandasIndex(Index(['IA', 'IN', 'IL'], dtype='object', name='location'))
PandasIndex(Index([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], dtype='int64', name='month'))
[7]:\n
freeze.to_pandas().plot()\n
PandasIndex(Index(['IA', 'IN', 'IL'], dtype='object', name='location'))
[12]:\n
df = both.sel(time="2000").mean("location").reset_coords(drop=True).to_dataframe()\n df.head()\n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -142,15 +142,15 @@\n [4]:\n
array(['IA', 'IN', 'IL'], dtype='<U2')
array([ 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12])
PandasIndex(Index(['IA', 'IN', 'IL'], dtype='object', name='location'))
PandasIndex(Index([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12], dtype='int64', name='month'))
Plotting\u00b6
\n In [37]: data.plot()\n-Out[37]: <matplotlib.collections.QuadMesh at 0x7f93c3208440>\n+Out[37]: <matplotlib.collections.QuadMesh at 0x7f39571d3e00>\n
\n
pandas\u00b6
\n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -253,15 +253,15 @@\n [0.37342613, 1.49497537, 1.33584385]])\n Coordinates:\n * x (x) int64 16B 10 20\n Dimensions without coordinates: y\n *\b**\b**\b**\b**\b* P\bPl\blo\bot\btt\bti\bin\bng\bg_\b?\b\u00b6 *\b**\b**\b**\b**\b*\n Visualizing your datasets is quick and convenient:\n In [37]: data.plot()\n-Out[37]: apply_ufunc
\", \"Compare weighted and unweighted mean temperature\", \"Blank template\", \"Calculating Seasonal Averages from Time Series of Monthly Means\", \"Working with Multidimensional Coordinates\", \"Visualization Gallery\", \"Toy weather data\", \"Gallery\", \"Frequently Asked Questions\", \"Getting Started\", \"Installation\", \"Quick overview\", \"Overview: Why xarray?\", \"Getting Help\", \"How do I \\u2026\", \"Xarray documentation\", \"Alternative chunked array types\", \"Integrating with duck arrays\", \"Extending xarray using accessors\", \"How to add a new backend\", \"How to create a custom index\", \"Xarray Internals\", \"Internal Design\", \"Interoperability of Xarray\", \"Zarr Encoding Specification\", \"Development roadmap\", \"Tutorials and Videos\", \"Combining data\", \"Computation\", \"Parallel Computing with Dask\", \"Data Structures\", \"Working with numpy-like arrays\", \"GroupBy: Group and Bin Data\", \"Hierarchical data\", \"User Guide\", \"Indexing and selecting data\", \"Interpolating data\", \"Reading and writing files\", \"Configuration\", \"Working with pandas\", \"Plotting\", \"Reshaping and reorganizing data\", \"Terminology\", \"Testing your code\", \"Time series data\", \"Weather and climate data\", \"What\\u2019s New\"],\n \"titleterms\": {\n \"\": [13, 54],\n \"0\": 54,\n \"01\": 54,\n \"02\": 54,\n"}]}, {"source1": "./usr/share/doc/python-xarray-doc/html/user-guide/computation.html", "source2": "./usr/share/doc/python-xarray-doc/html/user-guide/computation.html", "unified_diff": "@@ -879,15 +879,15 @@\n * param (param) <U2 16B 'a' 'xc'\n * cov_i (cov_i) <U2 16B 'a' 'xc'\n * cov_j (cov_j) <U2 16B 'a' 'xc'\n Data variables:\n var2_curvefit_coefficients (x, param) float64 2kB 3.0 -5.0 3.0 ... 3.0 4.9\n var2_curvefit_covariance (x, cov_i, cov_j) float64 3kB 9.286e-14 ... 1...\n var3_curvefit_coefficients (x, param) float64 2kB 0.9999 5.0 ... 1.0 -4.9\n- var3_curvefit_covariance (x, cov_i, cov_j) float64 3kB 5.825e-11 ... 
1...\n+ var3_curvefit_covariance (x, cov_i, cov_j) float64 3kB 5.825e-11 ... 8...\n \n \n In [102]: def gaussian_2d(coords, a, xc, yc, xalpha, yalpha):\n .....: x, y = coords\n@@ -935,15 +935,15 @@\n Dimensions: (param: 10, cov_i: 10, cov_j: 10)\n Coordinates:\n * param (param) <U7 280B 'a0' 'xc0' ... 'xalpha1' 'yalpha1'\n * cov_i (cov_i) <U7 280B 'a0' 'xc0' ... 'xalpha1' 'yalpha1'\n * cov_j (cov_j) <U7 280B 'a0' 'xc0' ... 'xalpha1' 'yalpha1'\n Data variables:\n curvefit_coefficients (param) float64 80B 1.994 -0.9986 ... 1.999 0.9986\n- curvefit_covariance (cov_i, cov_j) float64 800B 6.556e-05 ... 4.467e-06\n+ curvefit_covariance (cov_i, cov_j) float64 800B 6.557e-05 ... 4.466e-06\n
scipy.optimize.curve_fit()
.func(ds)
). This allows you to write pipelines for\n transforming your data (using \u201cmethod chaining\u201d) instead of writing\n hard-to-follow nested function calls:
# these lines are equivalent, but with pipe we can make the logic flow\n # entirely from left to right\n In [64]: plt.plot((2 * ds.temperature.sel(loc=0)).mean("instrument"))\n-Out[64]: [<matplotlib.lines.Line2D at 0x7f939a5a2c10>]\n+Out[64]: [<matplotlib.lines.Line2D at 0x7f39369a6d50>]\n \n In [65]: (ds.temperature.sel(loc=0).pipe(lambda x: 2 * x).mean("instrument").pipe(plt.plot))\n-Out[65]: [<matplotlib.lines.Line2D at 0x7f939a5a2ad0>]\n+Out[65]: [<matplotlib.lines.Line2D at 0x7f39369a6c10>]\n
Both pipe
and assign
replicate the pandas methods of the same names\n (DataFrame.pipe
and\n DataFrame.assign
).
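The chaining described above can be sketched with a tiny, self-contained dataset (the variable names and values here are illustrative, not the docs' actual `ds`):

```python
import numpy as np
import xarray as xr

# A small stand-in dataset; the docs' ds holds temperature measurements
ds = xr.Dataset({"temperature": ("x", np.array([10.0, 12.0, 14.0]))})

# pipe lets an external function join a left-to-right method chain
doubled = ds.pipe(lambda d: 2 * d)

# assign adds a derived variable, mirroring pandas DataFrame.assign;
# callables receive the dataset being assigned onto
with_kelvin = ds.assign(temperature_k=lambda d: d.temperature + 273.15)
```

Both results are new objects; `ds` itself is unchanged, which is what makes the chain safe to compose.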
With xarray, there is no performance penalty for creating new datasets, even if\n variables are lazily loaded from a file on disk. Creating new objects instead\n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -585,19 +585,19 @@\n There is also the pipe() method that allows you to use a method call with an\n external function (e.g., ds.pipe(func)) instead of simply calling it (e.g.,\n func(ds)). This allows you to write pipelines for transforming your data (using\n \u201cmethod chaining\u201d) instead of writing hard to follow nested function calls:\n # these lines are equivalent, but with pipe we can make the logic flow\n # entirely from left to right\n In [64]: plt.plot((2 * ds.temperature.sel(loc=0)).mean(\"instrument\"))\n-Out[64]: [ If you were a previous user of the prototype xarray-contrib/datatree package, this is different from what you\u2019re used to!\n In that package the data model was that the data stored in each node actually was completely unrelated. The data model is now slightly stricter.\n This allows us to provide features like Coordinate Inheritance. To demonstrate, let\u2019s first generate some example datasets which are not aligned with one another: Now we have a valid This is a useful way to organise our data because we can still operate on all the groups at once.\n For example we can extract all three timeseries at a specific lat-lon location: or compute the standard deviation of each timeseries to find out how it varies with sampling frequency: This helps to differentiate which variables are defined on the datatree node that you are currently looking at, and which were defined somewhere above it. 
We can also still perform all the same operations on the whole tree:# (drop the attributes just to make the printed representation shorter)\n In [89]: ds = xr.tutorial.open_dataset("air_temperature").drop_attrs()\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [90]: ds_daily = ds.resample(time="D").mean("time")\n KeyError: "No variable named 'time'. Variables on the dataset include ['foo', 'x', 'letters']"\n \n \n In [91]: ds_weekly = ds.resample(time="W").mean("time")\n@@ -1054,15 +1054,15 @@\n \u2514\u2500\u2500 Group: /b/B\n
DataTree
structure which contains all the data at each different time frequency, stored in a separate group.In [100]: dt.sel(lat=75, lon=300)\n-ValueError: Dimensions {'lon', 'lat'} do not exist. Expected one or more of set()\n+ValueError: Dimensions {'lat', 'lon'} do not exist. Expected one or more of set()\n
In [101]: dt.std(dim="time")\n ValueError: Dimension(s) 'time' do not exist. Expected one or more of set()\n
In [107]: print(dt["/daily"])\n KeyError: 'Could not find node at /daily'\n
In [108]: dt.sel(lat=[75], lon=[300])\n-ValueError: Dimensions {'lon', 'lat'} do not exist. Expected one or more of set()\n+ValueError: Dimensions {'lat', 'lon'} do not exist. Expected one or more of set()\n \n \n In [109]: dt.std(dim="time")\n ValueError: Dimension(s) 'time' do not exist. Expected one or more of set()\n
In [52]: ds = xr.tutorial.open_dataset("air_temperature")\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n # Define target latitude and longitude (where weather stations might be)\n In [53]: target_lon = xr.DataArray([200, 201, 202, 205], dims="points")\n \n In [54]: target_lat = xr.DataArray([31, 41, 42, 42], dims="points")\n \n@@ -697,15 +697,15 @@\n
To select and assign values to a portion of a DataArray()
you\n can use indexing with .loc
:
In [57]: ds = xr.tutorial.open_dataset("air_temperature")\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n # add an empty 2D dataarray\n In [58]: ds["empty"] = xr.full_like(ds.air.mean("time"), fill_value=0)\n AttributeError: 'Dataset' object has no attribute 'air'\n \n \n@@ -869,15 +869,15 @@\n
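Because the tutorial dataset could not be fetched in this build, here is the same `.loc` assignment pattern on a small hand-made array (coordinates are illustrative):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    np.zeros((2, 3)),
    dims=("x", "y"),
    coords={"x": [10, 20], "y": [1, 2, 3]},
)

# Label-based assignment to a portion of the array, in place
da.loc[{"x": 10, "y": 2}] = 5.0
```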
You can also assign values to all variables of a Dataset
at once:
In [83]: ds_org = xr.tutorial.open_dataset("eraint_uvz").isel(\n ....: latitude=slice(56, 59), longitude=slice(255, 258), level=0\n ....: )\n ....: \n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n # set all values to 0\n In [84]: ds = xr.zeros_like(ds_org)\n NameError: name 'ds_org' is not defined\n \n \n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -474,15 +474,15 @@\n collection specified weather station latitudes and longitudes. To trigger\n vectorized indexing behavior you will need to provide the selection dimensions\n with a new shared output dimension name. In the example below, the selections\n of the closest latitude and longitude are renamed to an output dimension named\n \u201cpoints\u201d:\n In [52]: ds = xr.tutorial.open_dataset(\"air_temperature\")\n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. Will not be able to download data files.\n \n \n # Define target latitude and longitude (where weather stations might be)\n In [53]: target_lon = xr.DataArray([200, 201, 202, 205], dims=\"points\")\n \n In [54]: target_lat = xr.DataArray([31, 41, 42, 42], dims=\"points\")\n@@ -513,15 +513,15 @@\n selected subpart of the target array (except for the explicitly indexed\n dimensions with .loc/.sel). 
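The tutorial `eraint_uvz` dataset was unavailable during the build, so as a sketch: `xr.zeros_like` (and its siblings `ones_like`/`full_like`) assigns a value to every data variable at once while preserving dims, coords, and dtypes:

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({"u": ("x", np.arange(3.0)), "v": ("x", np.ones(3))})

# Set all values of all variables to 0, keeping the structure intact
ds_zero = xr.zeros_like(ds)
```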
Otherwise, IndexError will be raised.\n *\b**\b**\b**\b**\b* A\bAs\bss\bsi\big\bgn\bni\bin\bng\bg v\bva\bal\blu\bue\bes\bs w\bwi\bit\bth\bh i\bin\bnd\bde\bex\bxi\bin\bng\bg_\b?\b\u00b6 *\b**\b**\b**\b**\b*\n To select and assign values to a portion of a DataArray() you can use indexing\n with .loc :\n In [57]: ds = xr.tutorial.open_dataset(\"air_temperature\")\n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. Will not be able to download data files.\n \n \n # add an empty 2D dataarray\n In [58]: ds[\"empty\"] = xr.full_like(ds.air.mean(\"time\"), fill_value=0)\n AttributeError: 'Dataset' object has no attribute 'air'\n \n@@ -673,15 +673,15 @@\n Dimensions without coordinates: x\n You can also assign values to all variables of a Dataset at once:\n In [83]: ds_org = xr.tutorial.open_dataset(\"eraint_uvz\").isel(\n ....: latitude=slice(56, 59), longitude=slice(255, 258), level=0\n ....: )\n ....:\n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. 
Will not be able to download data files.\n \n \n # set all values to 0\n In [84]: ds = xr.zeros_like(ds_org)\n NameError: name 'ds_org' is not defined\n \n"}]}, {"source1": "./usr/share/doc/python-xarray-doc/html/user-guide/interpolation.html", "source2": "./usr/share/doc/python-xarray-doc/html/user-guide/interpolation.html", "unified_diff": "@@ -237,24 +237,24 @@\n ....: np.sin(np.linspace(0, 2 * np.pi, 10)),\n ....: dims="x",\n ....: coords={"x": np.linspace(0, 1, 10)},\n ....: )\n ....: \n \n In [17]: da.plot.line("o", label="original")\n-Out[17]: [<matplotlib.lines.Line2D at 0x7f93c339b250>]\n+Out[17]: [<matplotlib.lines.Line2D at 0x7f39570fb9d0>]\n \n In [18]: da.interp(x=np.linspace(0, 1, 100)).plot.line(label="linear (default)")\n-Out[18]: [<matplotlib.lines.Line2D at 0x7f93c323e5d0>]\n+Out[18]: [<matplotlib.lines.Line2D at 0x7f39570fb750>]\n \n In [19]: da.interp(x=np.linspace(0, 1, 100), method="cubic").plot.line(label="cubic")\n-Out[19]: [<matplotlib.lines.Line2D at 0x7f93c323e490>]\n+Out[19]: [<matplotlib.lines.Line2D at 0x7f39570fbd90>]\n \n In [20]: plt.legend()\n-Out[20]: <matplotlib.legend.Legend at 0x7f93c3208c20>\n+Out[20]: <matplotlib.legend.Legend at 0x7f39570846e0>\n
Additional keyword arguments can be passed to scipy\u2019s functions.
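As a self-contained version of the `kwargs` forwarding (this assumes SciPy is installed, since `interp` delegates to `scipy.interpolate`):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(
    np.sin(np.linspace(0, 2 * np.pi, 10)),
    dims="x",
    coords={"x": np.linspace(0, 1, 10)},
)

# kwargs are passed through to scipy.interpolate; points outside the
# original coordinate range become 0.0 instead of NaN
out = da.interp(x=np.linspace(-0.5, 1.5, 10), kwargs={"fill_value": 0.0})
```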
\n# fill 0 for the outside of the original coordinates.\n In [21]: da.interp(x=np.linspace(-0.5, 1.5, 10), kwargs={"fill_value": 0.0})\n@@ -439,15 +439,15 @@\n see Missing values.\n \n \n Example\u00b6
\n Let\u2019s see how interp()
works on real data.
\n # Raw data\n In [44]: ds = xr.tutorial.open_dataset("air_temperature").isel(time=0)\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [45]: fig, axes = plt.subplots(ncols=2, figsize=(10, 4))\n \n In [46]: ds.air.plot(ax=axes[0])\n AttributeError: 'Dataset' object has no attribute 'air'\n \n@@ -511,15 +511,15 @@\n ....: axes[0].plot(*xr.broadcast(lon.isel(z=idx), lat.isel(z=idx)), "--k")\n ....: \n \n In [61]: axes[0].set_title("Raw data")\n Out[61]: Text(0.5, 1.0, 'Raw data')\n \n In [62]: dsi = ds.interp(lon=lon, lat=lat)\n-ValueError: Dimensions {'lon', 'lat'} do not exist. Expected one or more of FrozenMappingWarningOnValuesAccess({'x': 3, 'y': 4})\n+ValueError: Dimensions {'lat', 'lon'} do not exist. 
Expected one or more of FrozenMappingWarningOnValuesAccess({'x': 3, 'y': 4})\n \n \n In [63]: dsi.air.plot(ax=axes[1])\n NameError: name 'dsi' is not defined\n \n \n In [64]: axes[1].set_title("Remapped data")\n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -154,26 +154,26 @@\n ....: np.sin(np.linspace(0, 2 * np.pi, 10)),\n ....: dims=\"x\",\n ....: coords={\"x\": np.linspace(0, 1, 10)},\n ....: )\n ....:\n \n In [17]: da.plot.line(\"o\", label=\"original\")\n-Out[17]: []\n+Out[17]: []\n \n In [18]: da.interp(x=np.linspace(0, 1, 100)).plot.line(label=\"linear\n (default)\")\n-Out[18]: []\n+Out[18]: []\n \n In [19]: da.interp(x=np.linspace(0, 1, 100), method=\"cubic\").plot.line\n (label=\"cubic\")\n-Out[19]: []\n+Out[19]: []\n \n In [20]: plt.legend()\n-Out[20]: \n+Out[20]: \n _\b[_\b__\bb_\bu_\bi_\bl_\bd_\b/_\bh_\bt_\bm_\bl_\b/_\b__\bs_\bt_\ba_\bt_\bi_\bc_\b/_\bi_\bn_\bt_\be_\br_\bp_\bo_\bl_\ba_\bt_\bi_\bo_\bn_\b__\bs_\ba_\bm_\bp_\bl_\be_\b1_\b._\bp_\bn_\bg_\b]\n Additional keyword arguments can be passed to scipy\u2019s functions.\n # fill 0 for the outside of the original coordinates.\n In [21]: da.interp(x=np.linspace(-0.5, 1.5, 10), kwargs={\"fill_value\": 0.0})\n Out[21]:\n Size: 80B\n array([ 0. , 0. , 0. , 0.814, 0.604, -0.604, -0.814, 0. , 0. ,\n@@ -337,15 +337,15 @@\n * x (x) float64 24B 0.5 1.5 2.5\n For the details of interpolate_na(), see _\bM_\bi_\bs_\bs_\bi_\bn_\bg_\b _\bv_\ba_\bl_\bu_\be_\bs.\n *\b**\b**\b**\b**\b* E\bEx\bxa\bam\bmp\bpl\ble\be_\b?\b\u00b6 *\b**\b**\b**\b**\b*\n Let\u2019s see how interp() works on real data.\n # Raw data\n In [44]: ds = xr.tutorial.open_dataset(\"air_temperature\").isel(time=0)\n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. 
Will not be able to download data files.\n \n \n In [45]: fig, axes = plt.subplots(ncols=2, figsize=(10, 4))\n \n In [46]: ds.air.plot(ax=axes[0])\n AttributeError: 'Dataset' object has no attribute 'air'\n@@ -410,15 +410,15 @@\n k\")\n ....:\n \n In [61]: axes[0].set_title(\"Raw data\")\n Out[61]: Text(0.5, 1.0, 'Raw data')\n \n In [62]: dsi = ds.interp(lon=lon, lat=lat)\n-ValueError: Dimensions {'lon', 'lat'} do not exist. Expected one or more of\n+ValueError: Dimensions {'lat', 'lon'} do not exist. Expected one or more of\n FrozenMappingWarningOnValuesAccess({'x': 3, 'y': 4})\n \n \n In [63]: dsi.air.plot(ax=axes[1])\n NameError: name 'dsi' is not defined\n \n \n"}]}, {"source1": "./usr/share/doc/python-xarray-doc/html/user-guide/io.html", "source2": "./usr/share/doc/python-xarray-doc/html/user-guide/io.html", "unified_diff": "@@ -630,15 +630,15 @@\n ....: "y": pd.date_range("2000-01-01", periods=5),\n ....: "z": ("x", list("abcd")),\n ....: },\n ....: )\n ....: \n \n In [13]: ds.to_zarr("path/to/directory.zarr")\n-Out[13]: <xarray.backends.zarr.ZarrStore at 0x7f93961ee4d0>\n+Out[13]: <xarray.backends.zarr.ZarrStore at 0x7f39324264d0>\n
\n \n (The suffix .zarr
is optional\u2013just a reminder that a zarr store lives\n there.) If the directory does not exist, it will be created. If a zarr\n store is already present at that path, an error will be raised, preventing it\n from being overwritten. To override this behavior and overwrite an existing\n store, add mode='w'
when invoking to_zarr()
.
\n@@ -724,36 +724,36 @@\n \n In [18]: ds = xr.Dataset({"foo": ("x", dummies)}, coords={"x": np.arange(30)})\n \n In [19]: path = "path/to/directory.zarr"\n \n # Now we write the metadata without computing any array values\n In [20]: ds.to_zarr(path, compute=False)\n-Out[20]: Delayed('_finalize_store-bfb46614-7c78-48e0-b4aa-72b0e4c9db73')\n+Out[20]: Delayed('_finalize_store-f76aeb1b-76ee-4d49-9dca-80f0ce670b67')\n
Now, a Zarr store with the correct variable shapes and attributes exists that\n can be filled out by subsequent calls to to_zarr
.\n Setting region="auto"
will open the existing store and determine the\n correct alignment of the new data with the existing dimensions, or as an\n explicit mapping from dimension names to Python slice
objects indicating\n where the data should be written (in index space, not label space), e.g.,
# For convenience, we'll slice a single dataset, but in the real use-case\n # we would create them separately possibly even from separate processes.\n In [21]: ds = xr.Dataset({"foo": ("x", np.arange(30))}, coords={"x": np.arange(30)})\n \n # Any of the following region specifications are valid\n In [22]: ds.isel(x=slice(0, 10)).to_zarr(path, region="auto")\n-Out[22]: <xarray.backends.zarr.ZarrStore at 0x7f93961efb50>\n+Out[22]: <xarray.backends.zarr.ZarrStore at 0x7f3932427b50>\n \n In [23]: ds.isel(x=slice(10, 20)).to_zarr(path, region={"x": "auto"})\n-Out[23]: <xarray.backends.zarr.ZarrStore at 0x7f93961ef520>\n+Out[23]: <xarray.backends.zarr.ZarrStore at 0x7f3932427520>\n \n In [24]: ds.isel(x=slice(20, 30)).to_zarr(path, region={"x": slice(20, 30)})\n-Out[24]: <xarray.backends.zarr.ZarrStore at 0x7f9396579e10>\n+Out[24]: <xarray.backends.zarr.ZarrStore at 0x7f39325b5e10>\n
Concurrent writes with region
are safe as long as they modify distinct\n chunks in the underlying Zarr arrays (or use an appropriate lock
).
As a safety check to make it harder to inadvertently override existing values,\n if you set region
then all variables included in a Dataset must have\n dimensions included in region
. Other variables (typically coordinates)\n@@ -769,15 +769,15 @@\n
In [25]: import zarr\n \n In [26]: from numcodecs.blosc import Blosc\n \n In [27]: compressor = Blosc(cname="zstd", clevel=3, shuffle=2)\n \n In [28]: ds.to_zarr("foo.zarr", encoding={"foo": {"compressor": compressor}})\n-Out[28]: <xarray.backends.zarr.ZarrStore at 0x7f93e390c4c0>\n+Out[28]: <xarray.backends.zarr.ZarrStore at 0x7f39322c84c0>\n
Note
\nNot all native zarr compression and filtering options have been tested with\n xarray.
\nChunk sizes may be specified in one of three ways when writing to a zarr store:
\nFor example, let\u2019s say we\u2019re working with a dataset with dimensions\n ('time', 'x', 'y')
, a variable Tair
which is chunked in x
and y
,\n and two multi-dimensional coordinates xc
and yc
:
In [33]: ds = xr.tutorial.open_dataset("rasm")\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [34]: ds["Tair"] = ds["Tair"].chunk({"x": 100, "y": 100})\n KeyError: "No variable named 'Tair'. Variables on the dataset include ['foo', 'x']"\n \n \n In [35]: ds\n@@ -882,15 +882,15 @@\n foo (x) int64 240B 0 1 2 3 4 5 6 7 8 9 ... 21 22 23 24 25 26 27 28 29\n
These multi-dimensional coordinates are only two-dimensional and take up very little\n space on disk or in memory, yet when writing to disk the default zarr behavior is to\n split them into chunks:
\nIn [36]: ds.to_zarr("path/to/directory.zarr", mode="w")\n-Out[36]: <xarray.backends.zarr.ZarrStore at 0x7f93e390d900>\n+Out[36]: <xarray.backends.zarr.ZarrStore at 0x7f39322c9900>\n \n In [37]: ! ls -R path/to/directory.zarr\n path/to/directory.zarr:\n foo x\n \n path/to/directory.zarr/foo:\n 0\n@@ -1062,15 +1062,15 @@\n Ncdata\u00b6
\n Ncdata provides more sophisticated means of transferring data, including entire\n datasets. It uses the file saving and loading functions in both projects to provide a\n more \u201ccorrect\u201d translation between them, but still with very low overhead and not\n using actual disk files.
\n For example:
\n In [48]: ds = xr.tutorial.open_dataset("air_temperature_gradient")\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [49]: cubes = ncdata.iris_xarray.cubes_from_xarray(ds)\n NameError: name 'ncdata' is not defined\n \n \n In [50]: print(cubes)\n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -481,15 +481,15 @@\n ....: \"y\": pd.date_range(\"2000-01-01\", periods=5),\n ....: \"z\": (\"x\", list(\"abcd\")),\n ....: },\n ....: )\n ....:\n \n In [13]: ds.to_zarr(\"path/to/directory.zarr\")\n-Out[13]: \n+Out[13]: \n (The suffix .zarr is optional\u2013just a reminder that a zarr store lives there.)\n If the directory does not exist, it will be created. If a zarr store is already\n present at that path, an error will be raised, preventing it from being\n overwritten. To override this behavior and overwrite an existing store, add\n mode='w' when invoking to_zarr().\n DataArrays can also be saved to disk using the DataArray.to_zarr() method, and\n loaded from disk using the open_dataarray() function with engine='zarr'.\n@@ -562,35 +562,35 @@\n \n In [18]: ds = xr.Dataset({\"foo\": (\"x\", dummies)}, coords={\"x\": np.arange(30)})\n \n In [19]: path = \"path/to/directory.zarr\"\n \n # Now we write the metadata without computing any array values\n In [20]: ds.to_zarr(path, compute=False)\n-Out[20]: Delayed('_finalize_store-bfb46614-7c78-48e0-b4aa-72b0e4c9db73')\n+Out[20]: Delayed('_finalize_store-f76aeb1b-76ee-4d49-9dca-80f0ce670b67')\n Now, a Zarr store with the correct variable shapes and attributes exists that\n can be filled out by subsequent calls to to_zarr. 
Setting region=\"auto\" will\n open the existing store and determine the correct alignment of the new data\n with the existing dimensions, or as an explicit mapping from dimension names to\n Python slice objects indicating where the data should be written (in index\n space, not label space), e.g.,\n # For convenience, we'll slice a single dataset, but in the real use-case\n # we would create them separately possibly even from separate processes.\n In [21]: ds = xr.Dataset({\"foo\": (\"x\", np.arange(30))}, coords={\"x\": np.arange\n (30)})\n \n # Any of the following region specifications are valid\n In [22]: ds.isel(x=slice(0, 10)).to_zarr(path, region=\"auto\")\n-Out[22]: \n+Out[22]: \n \n In [23]: ds.isel(x=slice(10, 20)).to_zarr(path, region={\"x\": \"auto\"})\n-Out[23]: \n+Out[23]: \n \n In [24]: ds.isel(x=slice(20, 30)).to_zarr(path, region={\"x\": slice(20, 30)})\n-Out[24]: \n+Out[24]: \n Concurrent writes with region are safe as long as they modify distinct chunks\n in the underlying Zarr arrays (or use an appropriate lock).\n As a safety check to make it harder to inadvertently override existing values,\n if you set region then a\bal\bll\bl variables included in a Dataset must have dimensions\n included in region. 
Other variables (typically coordinates) need to be\n explicitly dropped and/or written in a separate calls to to_zarr with mode='a'.\n *\b**\b**\b**\b* Z\bZa\bar\brr\br C\bCo\bom\bmp\bpr\bre\bes\bss\bso\bor\brs\bs a\ban\bnd\bd F\bFi\bil\blt\bte\ber\brs\bs_\b?\b\u00b6 *\b**\b**\b**\b*\n@@ -601,15 +601,15 @@\n In [25]: import zarr\n \n In [26]: from numcodecs.blosc import Blosc\n \n In [27]: compressor = Blosc(cname=\"zstd\", clevel=3, shuffle=2)\n \n In [28]: ds.to_zarr(\"foo.zarr\", encoding={\"foo\": {\"compressor\": compressor}})\n-Out[28]: \n+Out[28]: \n Note\n Not all native zarr compression and filtering options have been tested with\n xarray.\n *\b**\b**\b**\b* M\bMo\bod\bdi\bif\bfy\byi\bin\bng\bg e\bex\bxi\bis\bst\bti\bin\bng\bg Z\bZa\bar\brr\br s\bst\bto\bor\bre\bes\bs_\b?\b\u00b6 *\b**\b**\b**\b*\n Xarray supports several ways of incrementally writing variables to a Zarr\n store. These options are useful for scenarios when it is infeasible or\n undesirable to write your entire dataset at once.\n@@ -635,28 +635,28 @@\n ....: \"y\": [1, 2, 3, 4, 5],\n ....: \"t\": pd.date_range(\"2001-01-01\", periods=2),\n ....: },\n ....: )\n ....:\n \n In [30]: ds1.to_zarr(\"path/to/directory.zarr\")\n-Out[30]: \n+Out[30]: \n \n In [31]: ds2 = xr.Dataset(\n ....: {\"foo\": ((\"x\", \"y\", \"t\"), np.random.rand(4, 5, 2))},\n ....: coords={\n ....: \"x\": [10, 20, 30, 40],\n ....: \"y\": [1, 2, 3, 4, 5],\n ....: \"t\": pd.date_range(\"2001-01-03\", periods=2),\n ....: },\n ....: )\n ....:\n \n In [32]: ds2.to_zarr(\"path/to/directory.zarr\", append_dim=\"t\")\n-Out[32]: \n+Out[32]: \n *\b**\b**\b**\b* S\bSp\bpe\bec\bci\bif\bfy\byi\bin\bng\bg c\bch\bhu\bun\bnk\bks\bs i\bin\bn a\ba z\bza\bar\brr\br s\bst\bto\bor\bre\be_\b?\b\u00b6 *\b**\b**\b**\b*\n Chunk sizes may be specified in one of three ways when writing to a zarr store:\n 1. Manual chunk sizing through the use of the encoding argument in\n Dataset.to_zarr():\n 2. Automatic chunking based on chunks in dask arrays\n 3. 
Default chunk behavior determined by the zarr library\n The resulting chunks will be determined based on the order of the above list;\n@@ -675,15 +675,15 @@\n positional ordering of the dimensions in each array. Watch out for arrays with\n differently-ordered dimensions within a single Dataset.\n For example, let\u2019s say we\u2019re working with a dataset with dimensions ('time',\n 'x', 'y'), a variable Tair which is chunked in x and y, and two multi-\n dimensional coordinates xc and yc:\n In [33]: ds = xr.tutorial.open_dataset(\"rasm\")\n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [34]: ds[\"Tair\"] = ds[\"Tair\"].chunk({\"x\": 100, \"y\": 100})\n KeyError: \"No variable named 'Tair'. Variables on the dataset include ['foo',\n 'x']\"\n \n@@ -696,15 +696,15 @@\n * x (x) int64 240B 0 1 2 3 4 5 6 7 8 9 ... 21 22 23 24 25 26 27 28 29\n Data variables:\n foo (x) int64 240B 0 1 2 3 4 5 6 7 8 9 ... 21 22 23 24 25 26 27 28 29\n These multi-dimensional coordinates are only two-dimensional and take up very\n little space on disk or in memory, yet when writing to disk the default zarr\n behavior is to split them into chunks:\n In [36]: ds.to_zarr(\"path/to/directory.zarr\", mode=\"w\")\n-Out[36]: \n+Out[36]: \n \n In [37]: ! ls -R path/to/directory.zarr\n path/to/directory.zarr:\n foo x\n \n path/to/directory.zarr/foo:\n 0\n@@ -850,15 +850,15 @@\n _\bN_\bc_\bd_\ba_\bt_\ba provides more sophisticated means of transferring data, including entire\n datasets. 
It uses the file saving and loading functions in both projects to\n provide a more \u201ccorrect\u201d translation between them, but still with very low\n overhead and not using actual disk files.\n For example:\n In [48]: ds = xr.tutorial.open_dataset(\"air_temperature_gradient\")\n PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not\n-create data cache folder '/nonexistent/first-build/.cache/\n+create data cache folder '/nonexistent/second-build/.cache/\n xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [49]: cubes = ncdata.iris_xarray.cubes_from_xarray(ds)\n NameError: name 'ncdata' is not defined\n \n \n"}]}, {"source1": "./usr/share/doc/python-xarray-doc/html/user-guide/plotting.html", "source2": "./usr/share/doc/python-xarray-doc/html/user-guide/plotting.html", "unified_diff": "@@ -100,15 +100,15 @@\n In [3]: import matplotlib.pyplot as plt\n \n In [4]: import xarray as xr\n
\n \n For these examples we\u2019ll use the North American air temperature dataset.
\n In [5]: airtemps = xr.tutorial.open_dataset("air_temperature")\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [6]: airtemps\n NameError: name 'airtemps' is not defined\n \n \n # Convert to celsius\n@@ -445,15 +445,15 @@\n \n # Apply a nonlinear transformation to one of the coords\n In [50]: b.coords["lat"] = np.log(b.coords["lat"])\n KeyError: 'lat'\n \n \n In [51]: b.plot()\n-Out[51]: [<matplotlib.lines.Line2D at 0x7f93e369b250>]\n+Out[51]: [<matplotlib.lines.Line2D at 0x7f3970777c50>]\n
\n \n
\n \n \n \n Other types of plot\u00b6
\n@@ -857,117 +857,117 @@\n * y (y) float64 88B 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0\n * z (z) int64 32B 0 1 2 3\n * w (w) <U5 80B 'one' 'two' 'three' 'five'\n Attributes:\n units: Aunits\n \n In [99]: ds.A.plot.scatter(x="y")\n-Out[99]: <matplotlib.collections.PathCollection at 0x7f93e333c440>\n+Out[99]: <matplotlib.collections.PathCollection at 0x7f3970854440>\n
Same plot can be displayed using the dataset:
\nIn [100]: ds.plot.scatter(x="y", y="A")\n-Out[100]: <matplotlib.collections.PathCollection at 0x7f93e31307d0>\n+Out[100]: <matplotlib.collections.PathCollection at 0x7f39706e2ad0>\n
Now suppose we want to scatter the A
DataArray against the B
DataArray
In [101]: ds.plot.scatter(x="A", y="B")\n-Out[101]: <matplotlib.collections.PathCollection at 0x7f93e2fcf390>\n+Out[101]: <matplotlib.collections.PathCollection at 0x7f397071e0d0>\n
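A minimal, self-contained version of scattering one data variable against another (the dataset here is invented; `Agg` is selected so the sketch runs headless):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted use

import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"A": ("x", np.arange(5.0)), "B": ("x", np.arange(5.0) ** 2)}
)

# One point per shared index of A and B
pc = ds.plot.scatter(x="A", y="B")
```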
The hue
kwarg lets you vary the color by variable value
In [102]: ds.plot.scatter(x="A", y="B", hue="w")\n-Out[102]: <matplotlib.collections.PathCollection at 0x7f93e3016c10>\n+Out[102]: <matplotlib.collections.PathCollection at 0x7f3970564f50>\n
You can force a legend instead of a colorbar by setting add_legend=True, add_colorbar=False
.
In [103]: ds.plot.scatter(x="A", y="B", hue="w", add_legend=True, add_colorbar=False)\n-Out[103]: <matplotlib.collections.PathCollection at 0x7f93e2fcc2d0>\n+Out[103]: <matplotlib.collections.PathCollection at 0x7f39706e0050>\n
In [104]: ds.plot.scatter(x="A", y="B", hue="w", add_legend=False, add_colorbar=True)\n-Out[104]: <matplotlib.collections.PathCollection at 0x7f93e2fcce10>\n+Out[104]: <matplotlib.collections.PathCollection at 0x7f393252fed0>\n
The markersize
kwarg lets you vary the point\u2019s size by variable value.\n You can additionally pass size_norm
to control how the variable\u2019s values are mapped to point sizes.
In [105]: ds.plot.scatter(x="A", y="B", hue="y", markersize="z")\n-Out[105]: <matplotlib.collections.PathCollection at 0x7f93c32b0a50>\n+Out[105]: <matplotlib.collections.PathCollection at 0x7f395709bb10>\n
The z
kwarg lets you plot the data along the z-axis as well.
In [106]: ds.plot.scatter(x="A", y="B", z="z", hue="y", markersize="x")\n-Out[106]: <mpl_toolkits.mplot3d.art3d.Path3DCollection at 0x7f93e3699310>\n+Out[106]: <mpl_toolkits.mplot3d.art3d.Path3DCollection at 0x7f395662cf50>\n
Faceting is also possible
\nIn [107]: ds.plot.scatter(x="A", y="B", hue="y", markersize="x", row="x", col="w")\n-Out[107]: <xarray.plot.facetgrid.FacetGrid at 0x7f93c3209400>\n+Out[107]: <xarray.plot.facetgrid.FacetGrid at 0x7f39570842f0>\n
And adding the z-axis
\nIn [108]: ds.plot.scatter(x="A", y="B", z="z", hue="y", markersize="x", row="x", col="w")\n-Out[108]: <xarray.plot.facetgrid.FacetGrid at 0x7f93e2a4a710>\n+Out[108]: <xarray.plot.facetgrid.FacetGrid at 0x7f396ff62710>\n
For more advanced scatter plots, we recommend converting the relevant data variables\n to a pandas DataFrame and using the extensive plotting capabilities of seaborn
.
Visualizing vector fields is supported with quiver plots:
\nIn [109]: ds.isel(w=1, z=1).plot.quiver(x="x", y="y", u="A", v="B")\n-Out[109]: <matplotlib.quiver.Quiver at 0x7f9399bbc1a0>\n+Out[109]: <matplotlib.quiver.Quiver at 0x7f395757fa10>\n
where u
and v
denote the x and y direction components of the arrow vectors. Again, faceting is also possible:
In [110]: ds.plot.quiver(x="x", y="y", u="A", v="B", col="w", row="z", scale=4)\n-Out[110]: <xarray.plot.facetgrid.FacetGrid at 0x7f93e280f4d0>\n+Out[110]: <xarray.plot.facetgrid.FacetGrid at 0x7f396ff1b4d0>\n
scale
is required for faceted quiver plots.\n The scale determines the number of data units per arrow length unit, i.e. a smaller scale parameter makes the arrow longer.
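A self-contained quiver sketch on an invented 2D grid (here `A` holds the x-components and `B` the y-components, matching the convention above):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend

import numpy as np
import xarray as xr

ds = xr.Dataset(
    {
        "A": (("y", "x"), np.ones((3, 4))),   # x-components of the arrows
        "B": (("y", "x"), np.zeros((3, 4))),  # y-components of the arrows
    },
    coords={"x": np.arange(4), "y": np.arange(3)},
)

# One arrow per grid point; scale sets data units per arrow-length unit,
# so a smaller scale draws longer arrows
q = ds.plot.quiver(x="x", y="y", u="A", v="B", scale=4)
```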
Visualizing vector fields is also supported with streamline plots:
\nIn [111]: ds.isel(w=1, z=1).plot.streamplot(x="x", y="y", u="A", v="B")\n-Out[111]: <matplotlib.collections.LineCollection at 0x7f93e2306fd0>\n+Out[111]: <matplotlib.collections.LineCollection at 0x7f3970223d90>\n
where u
and v
denote the x and y direction components of the vectors tangent to the streamlines.\n Again, faceting is also possible:
In [112]: ds.plot.streamplot(x="x", y="y", u="A", v="B", col="w", row="z")\n-Out[112]: <xarray.plot.facetgrid.FacetGrid at 0x7f9395de3950>\n+Out[112]: <xarray.plot.facetgrid.FacetGrid at 0x7f393289b6f0>\n
To follow this section you\u2019ll need to have Cartopy installed and working.
\nThis script will plot the air temperature on a map.
\nIn [113]: import cartopy.crs as ccrs\n \n In [114]: air = xr.tutorial.open_dataset("air_temperature").air\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [115]: p = air.isel(time=0).plot(\n .....: subplot_kws=dict(projection=ccrs.Orthographic(-80, 35), facecolor="gray"),\n .....: transform=ccrs.PlateCarree(),\n .....: )\n .....: \n@@ -1024,24 +1024,24 @@\n In [121]: import xarray.plot as xplt\n \n In [122]: da = xr.DataArray(range(5))\n \n In [123]: fig, axs = plt.subplots(ncols=2, nrows=2)\n \n In [124]: da.plot(ax=axs[0, 0])\n-Out[124]: [<matplotlib.lines.Line2D at 0x7f93e23ade50>]\n+Out[124]: [<matplotlib.lines.Line2D at 0x7f396f123c50>]\n \n In [125]: da.plot.line(ax=axs[0, 1])\n-Out[125]: [<matplotlib.lines.Line2D at 0x7f93e23adf90>]\n+Out[125]: [<matplotlib.lines.Line2D at 0x7f396f123b10>]\n \n In [126]: xplt.plot(da, ax=axs[1, 0])\n-Out[126]: [<matplotlib.lines.Line2D at 0x7f93e23af250>]\n+Out[126]: [<matplotlib.lines.Line2D at 0x7f396f1239d0>]\n \n In [127]: xplt.line(da, ax=axs[1, 1])\n-Out[127]: [<matplotlib.lines.Line2D at 0x7f93e23c9590>]\n+Out[127]: [<matplotlib.lines.Line2D at 0x7f396f120b90>]\n \n In [128]: plt.tight_layout()\n \n In [129]: plt.draw()\n
\n \n
\n@@ -1091,15 +1091,15 @@\n
The plot will produce an image corresponding to the values of the array.\n Hence the top left pixel will be a different color than the others.\n Before reading on, you may want to look at the coordinates and\n think carefully about what the limits, labels, and orientation for\n each of the axes should be.
\nIn [134]: a.plot()\n-Out[134]: <matplotlib.collections.QuadMesh at 0x7f93e1b15090>\n+Out[134]: <matplotlib.collections.QuadMesh at 0x7f396f065090>\n
It may seem strange that\n the values on the y axis are decreasing with -0.5 on the top. This is because\n the pixels are centered over their coordinates, and the\n@@ -1122,57 +1122,57 @@\n .....: np.arange(20).reshape(4, 5),\n .....: dims=["y", "x"],\n .....: coords={"lat": (("y", "x"), lat), "lon": (("y", "x"), lon)},\n .....: )\n .....: \n \n In [139]: da.plot.pcolormesh(x="lon", y="lat")\n-Out[139]: <matplotlib.collections.QuadMesh at 0x7f93e2967ed0>\n+Out[139]: <matplotlib.collections.QuadMesh at 0x7f396f094a50>\n
Note that in this case, xarray still follows the pixel centered convention.\n This might be undesirable in some cases, for example when your data is defined\n on a polar projection (GH781). This is why the default is to not follow\n this convention when plotting on a map:
\nIn [140]: import cartopy.crs as ccrs\n \n In [141]: ax = plt.subplot(projection=ccrs.PlateCarree())\n \n In [142]: da.plot.pcolormesh(x="lon", y="lat", ax=ax)\n-Out[142]: <cartopy.mpl.geocollection.GeoQuadMesh at 0x7f93e1a5e210>\n+Out[142]: <cartopy.mpl.geocollection.GeoQuadMesh at 0x7f3934ab3890>\n \n In [143]: ax.scatter(lon, lat, transform=ccrs.PlateCarree())\n-Out[143]: <matplotlib.collections.PathCollection at 0x7f93e2323ed0>\n+Out[143]: <matplotlib.collections.PathCollection at 0x7f396fa3c190>\n \n In [144]: ax.coastlines()\n-Out[144]: <cartopy.mpl.feature_artist.FeatureArtist at 0x7f9397503cb0>\n+Out[144]: <cartopy.mpl.feature_artist.FeatureArtist at 0x7f3935fa3770>\n \n In [145]: ax.gridlines(draw_labels=True)\n-Out[145]: <cartopy.mpl.gridliner.Gridliner at 0x7f93975becf0>\n+Out[145]: <cartopy.mpl.gridliner.Gridliner at 0x7f3933703b60>\n
You can however decide to infer the cell boundaries and use the\n infer_intervals
keyword:
In [146]: ax = plt.subplot(projection=ccrs.PlateCarree())\n \n In [147]: da.plot.pcolormesh(x="lon", y="lat", ax=ax, infer_intervals=True)\n-Out[147]: <cartopy.mpl.geocollection.GeoQuadMesh at 0x7f93e2af2ad0>\n+Out[147]: <cartopy.mpl.geocollection.GeoQuadMesh at 0x7f396f9c1590>\n \n In [148]: ax.scatter(lon, lat, transform=ccrs.PlateCarree())\n-Out[148]: <matplotlib.collections.PathCollection at 0x7f93e2304f50>\n+Out[148]: <matplotlib.collections.PathCollection at 0x7f396f1b2850>\n \n In [149]: ax.coastlines()\n-Out[149]: <cartopy.mpl.feature_artist.FeatureArtist at 0x7f93e2304e10>\n+Out[149]: <cartopy.mpl.feature_artist.FeatureArtist at 0x7f396f1b2710>\n \n In [150]: ax.gridlines(draw_labels=True)\n-Out[150]: <cartopy.mpl.gridliner.Gridliner at 0x7f93e2307750>\n+Out[150]: <cartopy.mpl.gridliner.Gridliner at 0x7f396f1b25d0>\n
Note
\nThe data model of xarray does not support datasets with cell boundaries\n@@ -1180,26 +1180,26 @@\n outside the xarray framework.
\nOne can also make line plots with multidimensional coordinates. In this case, hue
must be a dimension name, not a coordinate name.
In [151]: f, ax = plt.subplots(2, 1)\n \n In [152]: da.plot.line(x="lon", hue="y", ax=ax[0])\n Out[152]: \n-[<matplotlib.lines.Line2D at 0x7f93e18fe490>,\n- <matplotlib.lines.Line2D at 0x7f93e18fe5d0>,\n- <matplotlib.lines.Line2D at 0x7f93e18fe710>,\n- <matplotlib.lines.Line2D at 0x7f93e18fe850>]\n+[<matplotlib.lines.Line2D at 0x7f396ee9a490>,\n+ <matplotlib.lines.Line2D at 0x7f396ee9a5d0>,\n+ <matplotlib.lines.Line2D at 0x7f396ee9a710>,\n+ <matplotlib.lines.Line2D at 0x7f396ee9a850>]\n \n In [153]: da.plot.line(x="lon", hue="x", ax=ax[1])\n Out[153]: \n-[<matplotlib.lines.Line2D at 0x7f93e17a4b90>,\n- <matplotlib.lines.Line2D at 0x7f93e17a4cd0>,\n- <matplotlib.lines.Line2D at 0x7f93e17a4e10>,\n- <matplotlib.lines.Line2D at 0x7f93e17a4f50>,\n- <matplotlib.lines.Line2D at 0x7f93e17a5090>]\n+[<matplotlib.lines.Line2D at 0x7f396eebcb90>,\n+ <matplotlib.lines.Line2D at 0x7f396eebccd0>,\n+ <matplotlib.lines.Line2D at 0x7f396eebce10>,\n+ <matplotlib.lines.Line2D at 0x7f396eebcf50>,\n+ <matplotlib.lines.Line2D at 0x7f396eebd090>]\n
Whilst coarsen
is normally used for reducing your data\u2019s resolution by applying a reduction function\n (see the page on computation),\n it can also be used to reorganise your data without applying a computation via construct()
.
Taking our example tutorial air temperature dataset over the Northern US:
\nIn [56]: air = xr.tutorial.open_dataset("air_temperature")["air"]\n-PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/first-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n+PermissionError: [Errno 13] Permission denied: '/nonexistent' | Pooch could not create data cache folder '/nonexistent/second-build/.cache/xarray_tutorial_data'. Will not be able to download data files.\n \n \n In [57]: air.isel(time=0).plot(x="lon", y="lat")\n NameError: name 'air' is not defined\n
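Because the tutorial download can fail (as in the captured session above), the same reshaping idea can be sketched on self-contained synthetic data; the dimension names "year" and "month" here are illustrative only:

```python
# Sketch of coarsen(...).construct(...): reorganise a 1-D series into a
# (year, month) layout without applying any reduction. Sizes and names
# are illustrative.
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(24), dims="time")

# split "time" into an outer "year" dim and an inner "month" dim of length 12
reshaped = da.coarsen(time=12).construct(time=("year", "month"))
```

Note that no values are aggregated: every element of the original array is still present, just relabelled onto the new dimensions.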
To see an example of what each of these strategies might produce, you can call one followed by the .example()
method,\n which is a general hypothesis method valid for all strategies.
In [2]: import xarray.testing.strategies as xrst\n \n In [3]: xrst.variables().example()\n Out[3]: \n-<xarray.Variable (\u017b\u01518\u010c: 3)> Size: 24B\n-array([ nan+nanj, -1.175e-38 +2.j, -3.333e-01 +0.j], dtype=complex64)\n+<xarray.Variable (\u00e2\u00f2: 4)> Size: 16B\n+array([nan, nan, -0., nan], dtype=float32)\n+Attributes:\n+ \u00c1\u0148: {'\u0166\u010b\u00b20': '\u0154\u017c', '\u016bQ\u017f\u00e7\u015b': '\u017c9\u012fc\u00f4'}\n+ \u0115\u0108\u00dc\u00f3\u017f: {'\u0101\u00d0l\u0140\u0145': '\u00c8H\u0157\u017c', 'A\u017b\u0129': '', '\u017f\u0130\u0168\u00bdA': True, '': False, '\u017f\u017c': Tr...\n \n In [4]: xrst.variables().example()\n Out[4]: \n-<xarray.Variable (\u00fdB\u017b: 2)> Size: 16B\n-array([-1.9 , -0.333])\n+<xarray.Variable (\u0168B: 4, \u017f\u00e7\u0106\u0107\u0174: 4)> Size: 32B\n+array([[17709, 27830, 17709, 17709],\n+ [17709, 17709, 17709, 17709],\n+ [17709, 17709, 17709, 17709],\n+ [17709, 17709, 17709, 17709]], shape=(4, 4), dtype=int16)\n Attributes:\n- \u017c: None\n- \u0136\u017f\u0153\u00d6\u017f: None\n+ \u0104\u0113\u017d\u00eb\u017f: ['' '\\x9e\u00a6=S']\n+ : False\n+ \u0148\u014a\u017dI\u00f0: 5\u017d\u013e\u00ee\u00f6\n+ \u00da\u013e\u00c48\u017f: True\n+ \u011f\u013e\u01031: None\n+ \u015e\u0116: \u0147\u017e\u0105\n+ \u00fc\u00be: [b'\\x03\\x10Q(/+26\\xa2\\xe9\\xa1Ep\\x85@']\n \n In [5]: xrst.variables().example()\n Out[5]: \n-<xarray.Variable (\u00cf\u0143\u00ec\u0111\u011c: 1, \u017e\u0148\u00d0\u0105\u017b: 1, \u0172\u0109\u011b\u00d2\u00e9: 3)> Size: 24B\n-array([[[1.798e+308, 1.798e+308, 1.798e+308]]])\n+<xarray.Variable (o: 5)> Size: 80B\n+array([-1.500e+00+1.000e+000j, 9.007e+15-6.429e+016j, nan +nanj,\n+ 4.802e+16-1.766e+084j, -inf-2.225e-308j])\n Attributes:\n- 1\u017b\u013d\u00c6\u00ca: \u00ca\n- \u012d\u017d\u017e\u00e1\u017c: False\n- \u0169E: [['\u00d6' '\u00d6']\\n ['\u00d6' '\u00d6']]\n- \u0142\u0135\u017dOE: None\n+ \u00ceb: [b'\\x1f']\n
You can see that calling .example()
multiple times will generate different examples, giving you an idea of the wide\n range of data that the xarray strategies can generate.
In your tests, however, you should not use .example()
; instead, parameterize your tests with the\n hypothesis.given()
decorator:
In [6]: from hypothesis import given\n@@ -129,66 +138,65 @@\n Xarray\u2019s strategies can accept other strategies as arguments, allowing you to customise the contents of the generated\n examples.
\n # generate a Variable containing an array with a complex number dtype, but all other details still arbitrary\n In [8]: from hypothesis.extra.numpy import complex_number_dtypes\n \n In [9]: xrst.variables(dtype=complex_number_dtypes()).example()\n Out[9]: \n-<xarray.Variable (\u0124: 6)> Size: 48B\n-array([-6.948e+15 +nanj, -1.000e+07-6.104e-05j, -0.000e+00-0.000e+00j, -0.000e+00+1.401e-45j,\n- 0.000e+00-1.401e-45j, -3.403e+16+1.175e-38j], dtype=complex64)\n+<xarray.Variable (\u0173\u0168\u017d\u00ce: 2, J: 4)> Size: 128B\n+array([[-1.175e-038-6.860e+016j, 1.100e+000-3.624e+016j, -2.335e+153-1.113e-308j,\n+ 5.351e+016+2.225e-311j],\n+ [ 4.032e+016+3.356e+116j, -2.640e-174 +nanj, -2.225e-311+5.357e+016j,\n+ -1.648e-093-5.000e-001j]], dtype='>c16')\n Attributes:\n- : [b'']\n- \u017bT\u0136a\u017c: [['\\U000fb3d5' '`']\\n ['{' '{']]\n- \u0114: [b'\\xa3h']\n- \u00bc\u0120\u00c3\u017e\u017c: \u0156\u0120\u00e7\u00cbU\n- \u016e: False\n- \u00c7: \n+ \u00c3: [[b'B\\xe1?' b'\\xfe\\x08\\xd4f']\\n [b'=dO\\xd23\\xe0\\xdd\\xf2\\xe9\\xed...\n+ \u0172\u0108\u0113\u017e\u017c: \u013e\u00ff\u017f\n
\n \n This also works with custom strategies, or strategies defined in other packages.\n For example you could imagine creating a chunks
strategy to specify particular chunking patterns for a dask-backed array.
\n \n \n Fixing Arguments\u00b6
\n If you want to fix one aspect of the data structure, whilst allowing variation in the generated examples\n over all other aspects, then use hypothesis.strategies.just()
.
\n In [10]: import hypothesis.strategies as st\n \n # Generates only variable objects with dimensions ["x", "y"]\n In [11]: xrst.variables(dims=st.just(["x", "y"])).example()\n Out[11]: \n-<xarray.Variable (x: 3, y: 6)> Size: 36B\n-array([[ inf, -0.0e+00, inf, inf, inf, inf],\n- [ inf, inf, inf, inf, inf, inf],\n- [ inf, 1.2e-07, 1.5e+00, inf, inf, inf]], shape=(3, 6), dtype=float16)\n-Attributes:\n- x\u00f2\u016f\u016f: {'\u010f': ''}\n- \u00ff\u017f\u00be\u017e\u0148: {}\n- \u00c7\u0114\u012c\u00dc\u0131: {'\u00b2': array([[2547821043, 51546]], dtype='>u4'), 'u\u017e\u00c2': Fa...\n+<xarray.Variable (x: 5, y: 6)> Size: 30B\n+array([[ 43, -108, -108, -108, -108, -108],\n+ [-108, -108, -108, -108, 122, -108],\n+ [-108, -45, -122, -108, -108, -108],\n+ [-108, -108, -108, -108, -108, -108],\n+ [-108, -108, -108, -108, -108, -108]], shape=(5, 6), dtype=int8)\n
\n \n (This is technically another example of chaining strategies - hypothesis.strategies.just()
is simply a\n special strategy that just contains a single example.)
\n To fix the length of dimensions you can instead pass dims
as a mapping of dimension names to lengths\n (i.e. following xarray objects\u2019 .sizes()
property), e.g.
\n # Generates only variables with dimensions ["x", "y"], of lengths 2 & 3 respectively\n In [12]: xrst.variables(dims=st.just({"x": 2, "y": 3})).example()\n Out[12]: \n-<xarray.Variable (x: 2, y: 3)> Size: 6B\n-array([[ 18, 221, 159],\n- [ 91, 239, 30]], dtype=uint8)\n+<xarray.Variable (x: 2, y: 3)> Size: 48B\n+array([[-3083827208917602196, -3083827208917602196, -3083827208917602196],\n+ [-3083827208917602196, -3083827208917602196, -3083827208917602196]])\n+Attributes:\n+ : [False False]\n+ J\u00c3\u017f\u017e\u0102: ['NaT']\n
\n \n You can also use this to specify that you want examples which are missing some part of the data structure, for instance
\n # Generates a Variable with no attributes\n In [13]: xrst.variables(attrs=st.just({})).example()\n Out[13]: \n-<xarray.Variable (\u00d5: 5)> Size: 40B\n-array([ inf, inf, 5.169e+115, 2.225e-308, 1.401e-045])\n+<xarray.Variable (\u0154\u0117a\u0129n: 2)> Size: 16B\n+array([-9223372036854727831, -9223372036854746411])\n
\n \n Through a combination of chaining strategies and fixing arguments, you can specify quite complicated requirements on the\n objects your chained strategy will generate.
\n In [14]: fixed_x_variable_y_maybe_z = st.fixed_dictionaries(\n ....: {"x": st.just(2), "y": st.integers(3, 4)}, optional={"z": st.just(2)}\n ....: )\n@@ -197,32 +205,39 @@\n In [15]: fixed_x_variable_y_maybe_z.example()\n Out[15]: {'x': 2, 'y': 3, 'z': 2}\n \n In [16]: special_variables = xrst.variables(dims=fixed_x_variable_y_maybe_z)\n \n In [17]: special_variables.example()\n Out[17]: \n-<xarray.Variable (x: 2, y: 4)> Size: 64B\n-array([[-9223372036854775598, -9223372036854727701, -765380011776258377, -9223372035077253998],\n- [ -756101856880953354, -9223372036854737559, -9223372036854723696, -9223372036854759933]])\n+<xarray.Variable (x: 2, y: 4, z: 2)> Size: 128B\n+array([[[-9223372036854734621, -9223372036854734621],\n+ [-9223372036854734621, -9223372036854734621],\n+ [-9223372036854775654, -9223372036854734621],\n+ [-9223372036854734621, -9223372036854734621]],\n+\n+ [[-9223372036854734621, -9223372036854734621],\n+ [-9223372036854754014, -9223372036854734621],\n+ [-9223372036854734621, -9223372036854734621],\n+ [-9223372036854775792, -9223372036854734621]]], shape=(2, 4, 2))\n Attributes:\n- \u00cf\u0134\u017feC: {'\u016a\u00e7\u0135\u017b\u017f': array([['NaT', 'NaT']], dtype='timedelta64[Y]'), '': ...\n+ L\u00e6: \u0155\n+ \u0173\u00cf\u011e\u016e\u010f: True\n+ \u013a\u0175\u013c: True\n+ \u017fXa\u00e7\u017b: ['\\x8e\\U000b0bbd\\U0009ff28']\n+ \u0150: \u016f\n+ \u017e: True\n+ \u0175\u010c\u00f0\u0149\u017c: True\n+ : \u00cem\n \n In [18]: special_variables.example()\n Out[18]: \n-<xarray.Variable (x: 2, y: 3)> Size: 12B\n-array([[ 11855, -14645, -2561],\n- [-18322, -22316, -20968]], dtype=int16)\n-Attributes:\n- : False\n- \u00dc: True\n- \u0160\u017d: False\n- \u0166: \u0176\n- \u0136: True\n- 5\u017f\u00ff\u017d\u014f: \u00e9\n+<xarray.Variable (x: 2, y: 3)> Size: 6B\n+array([[0, 0, 0],\n+ [0, 0, 0]], dtype=int8)\n
\n \n Here we have used one of hypothesis\u2019 built-in strategies hypothesis.strategies.fixed_dictionaries()
to create a\n strategy which generates mappings of dimension names to lengths (i.e. the size
of the xarray object we want).\n This particular strategy will always generate an x
dimension of length 2, and a y
dimension of\n length either 3 or 4, and will sometimes also generate a z
dimension of length 2.\n By feeding this strategy for dictionaries into the dims
argument of xarray\u2019s variables()
strategy,\n@@ -323,53 +338,47 @@\n ....: array_strategy_fn=xps.arrays,\n ....: dtype=xps.scalar_dtypes(),\n ....: )\n ....: \n \n In [32]: xp_variables.example()\n Out[32]: \n-<xarray.Variable (\u017b\u0126: 2, \u00d91: 1)> Size: 4B\n-array([[23794],\n- [-2592]], dtype=int16)\n+<xarray.Variable (\u00d3: 5)> Size: 40B\n+array([-9223372033794027569, -9223372033794027569, -9223372033794027569, -9223372033794027569,\n+ -9223372033794027569])\n Attributes:\n- f8\u017c\u010d: \u017b\u017c\u00b9\n- \u017e\u00d2\u017f\u017cv: None\n- \u0123\u010e\u00eb: None\n- \u017ev: [[2.]]\n- \u0142\u0100\u017e\u0115\u0165: [[-2147469683 -2147483559]\\n [ 2147483646 -2147440432]]\n- \u0165\u00c3u: \u0169\n- \u0106\u00d8\u00d8w\u00e3: False\n- : \n+ \u00e0\u00b2: None\n+ \u00fd: None\n
Another array API-compliant duck array library would replace the import, e.g. import cupy as cp
instead.
A common task when testing xarray user code is checking that your function works for all valid input dimensions.\n We can chain strategies to achieve this, for which the helper strategy unique_subset_of()
\n is useful.
It works for lists of dimension names
\nIn [33]: dims = ["x", "y", "z"]\n \n In [34]: xrst.unique_subset_of(dims).example()\n-Out[34]: ['x', 'z']\n+Out[34]: ['x', 'y', 'z']\n \n In [35]: xrst.unique_subset_of(dims).example()\n-Out[35]: ['x', 'z']\n+Out[35]: ['y', 'x', 'z']\n
as well as for mappings of dimension names to sizes
\nIn [36]: dim_sizes = {"x": 2, "y": 3, "z": 4}\n \n In [37]: xrst.unique_subset_of(dim_sizes).example()\n-Out[37]: {'y': 3}\n+Out[37]: {'z': 4, 'x': 2, 'y': 3}\n \n In [38]: xrst.unique_subset_of(dim_sizes).example()\n-Out[38]: {'y': 3, 'x': 2, 'z': 4}\n+Out[38]: {'z': 4, 'x': 2}\n
This is useful because operations like reductions can be performed over any subset of the xarray object\u2019s dimensions.\n For example, we can write a pytest test checking that a reduction gives the expected result when applied along any possible valid subset of the Variable\u2019s dimensions.
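The same idea can be sketched without hypothesis by exhaustively enumerating the subsets with itertools on a plain numpy array (shapes and values are illustrative):

```python
# Exhaustive sketch: a sum over any subset of axes drops exactly the
# axes it was applied over. Pure numpy; data is synthetic.
import itertools
import numpy as np

data = np.arange(24.0).reshape(2, 3, 4)

for r in range(1, data.ndim + 1):
    for axes in itertools.combinations(range(data.ndim), r):
        reduced = data.sum(axis=axes)
        # the reduction removes exactly the reduced dimensions
        assert reduced.ndim == data.ndim - len(axes)

total = data.sum(axis=(0, 1, 2))  # reducing over all axes gives a scalar
```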
\nimport numpy.testing as npt\n \n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -28,34 +28,44 @@\n To see an example of what each of these strategies might produce, you can call\n one followed by the .example() method, which is a general hypothesis method\n valid for all strategies.\n In [2]: import xarray.testing.strategies as xrst\n \n In [3]: xrst.variables().example()\n Out[3]:\n- Size: 24B\n-array([ nan+nanj, -1.175e-38 +2.j, -3.333e-01 +0.j], dtype=complex64)\n+ Size: 16B\n+array([nan, nan, -0., nan], dtype=float32)\n+Attributes:\n+ \u00c1\u0148: {'\u0166\u010b\u00b20': '\u0154\u017c', '\u016bQ\u017f\u00e7\u015b': '\u017c9\u012fc\u00f4'}\n+ \u0115\u0108\u00dc\u00f3\u017f: {'\u0101\u00d0l\u0140\u0145': '\u00c8H\u0157\u017c', 'A\u017b\u0129': '', '\u017f\u0130\u0168\u00bdA': True, '': False, '\u017f\u017c':\n+Tr...\n \n In [4]: xrst.variables().example()\n Out[4]:\n- Size: 16B\n-array([-1.9 , -0.333])\n+ Size: 32B\n+array([[17709, 27830, 17709, 17709],\n+ [17709, 17709, 17709, 17709],\n+ [17709, 17709, 17709, 17709],\n+ [17709, 17709, 17709, 17709]], shape=(4, 4), dtype=int16)\n Attributes:\n- \u017c: None\n- \u0136\u017f\u0153\u00d6\u017f: None\n+ \u0104\u0113\u017d\u00eb\u017f: ['' '\\x9e\u00a6=S']\n+ : False\n+ \u0148\u014a\u017dI\u00f0: 5\u017d\u013e\u00ee\u00f6\n+ \u00da\u013e\u00c48\u017f: True\n+ \u011f\u013e\u01031: None\n+ \u015e\u0116: \u0147\u017e\u0105\n+ \u00fc\u00be: [b'\\x03\\x10Q(/+26\\xa2\\xe9\\xa1Ep\\x85@']\n \n In [5]: xrst.variables().example()\n Out[5]:\n- Size: 24B\n-array([[[1.798e+308, 1.798e+308, 1.798e+308]]])\n+ Size: 80B\n+array([-1.500e+00+1.000e+000j, 9.007e+15-6.429e+016j, nan +nanj,\n+ 4.802e+16-1.766e+084j, -inf-2.225e-308j])\n Attributes:\n- 1\u017b\u013d\u00c6\u00ca: \u00ca\n- \u012d\u017d\u017e\u00e1\u017c: False\n- \u0169E: [['\u00d6' '\u00d6']\\n ['\u00d6' '\u00d6']]\n- \u0142\u0135\u017dOE: None\n+ \u00ceb: [b'\\x1f']\n You can see that calling .example() multiple times will 
generate different\n examples, giving you an idea of the wide range of data that the xarray\n strategies can generate.\n In your tests however you should not use .example() - instead you should\n parameterize your tests with the hypothesis.given() decorator:\n In [6]: from hypothesis import given\n In [7]: @given(xrst.variables())\n@@ -67,67 +77,66 @@\n customise the contents of the generated examples.\n # generate a Variable containing an array with a complex number dtype, but all\n other details still arbitrary\n In [8]: from hypothesis.extra.numpy import complex_number_dtypes\n \n In [9]: xrst.variables(dtype=complex_number_dtypes()).example()\n Out[9]:\n- Size: 48B\n-array([-6.948e+15 +nanj, -1.000e+07-6.104e-05j, -0.000e+00-0.000e+00j, -\n-0.000e+00+1.401e-45j,\n- 0.000e+00-1.401e-45j, -3.403e+16+1.175e-38j], dtype=complex64)\n+ Size: 128B\n+array([[-1.175e-038-6.860e+016j, 1.100e+000-3.624e+016j, -2.335e+153-1.113e-\n+308j,\n+ 5.351e+016+2.225e-311j],\n+ [ 4.032e+016+3.356e+116j, -2.640e-174 +nanj, -2.225e-\n+311+5.357e+016j,\n+ -1.648e-093-5.000e-001j]], dtype='>c16')\n Attributes:\n- : [b'']\n- \u017bT\u0136a\u017c: [['\\U000fb3d5' '`']\\n ['{' '{']]\n- \u0114: [b'\\xa3h']\n- \u00bc\u0120\u00c3\u017e\u017c: \u0156\u0120\u00e7\u00cbU\n- \u016e: False\n- \u00c7:\n+ \u00c3: [[b'B\\xe1?' b'\\xfe\\x08\\xd4f']\\n\n+[b'=dO\\xd23\\xe0\\xdd\\xf2\\xe9\\xed...\n+ \u0172\u0108\u0113\u017e\u017c: \u013e\u00ff\u017f\n This also works with custom strategies, or strategies defined in other\n packages. 
For example you could imagine creating a chunks strategy to specify\n particular chunking patterns for a dask-backed array.\n *\b**\b**\b**\b* F\bFi\bix\bxi\bin\bng\bg A\bAr\brg\bgu\bum\bme\ben\bnt\bts\bs_\b?\b\u00b6 *\b**\b**\b**\b*\n If you want to fix one aspect of the data structure, whilst allowing variation\n in the generated examples over all other aspects, then use\n hypothesis.strategies.just().\n In [10]: import hypothesis.strategies as st\n \n # Generates only variable objects with dimensions [\"x\", \"y\"]\n In [11]: xrst.variables(dims=st.just([\"x\", \"y\"])).example()\n Out[11]:\n- Size: 36B\n-array([[ inf, -0.0e+00, inf, inf, inf, inf],\n- [ inf, inf, inf, inf, inf, inf],\n- [ inf, 1.2e-07, 1.5e+00, inf, inf, inf]], shape=(3,\n-6), dtype=float16)\n-Attributes:\n- x\u00f2\u016f\u016f: {'\u010f': ''}\n- \u00ff\u017f\u00be\u017e\u0148: {}\n- \u00c7\u0114\u012c\u00dc\u0131: {'\u00b2': array([[2547821043, 51546]], dtype='>u4'), 'u\u017e\u00c2':\n-Fa...\n+ Size: 30B\n+array([[ 43, -108, -108, -108, -108, -108],\n+ [-108, -108, -108, -108, 122, -108],\n+ [-108, -45, -122, -108, -108, -108],\n+ [-108, -108, -108, -108, -108, -108],\n+ [-108, -108, -108, -108, -108, -108]], shape=(5, 6), dtype=int8)\n (This is technically another example of chaining strategies -\n hypothesis.strategies.just() is simply a special strategy that just contains a\n single example.)\n To fix the length of dimensions you can instead pass dims as a mapping of\n dimension names to lengths (i.e. 
following xarray objects\u2019 .sizes() property),\n e.g.\n # Generates only variables with dimensions [\"x\", \"y\"], of lengths 2 & 3\n respectively\n In [12]: xrst.variables(dims=st.just({\"x\": 2, \"y\": 3})).example()\n Out[12]:\n- Size: 6B\n-array([[ 18, 221, 159],\n- [ 91, 239, 30]], dtype=uint8)\n+ Size: 48B\n+array([[-3083827208917602196, -3083827208917602196, -3083827208917602196],\n+ [-3083827208917602196, -3083827208917602196, -3083827208917602196]])\n+Attributes:\n+ : [False False]\n+ J\u00c3\u017f\u017e\u0102: ['NaT']\n You can also use this to specify that you want examples which are missing some\n part of the data structure, for instance\n # Generates a Variable with no attributes\n In [13]: xrst.variables(attrs=st.just({})).example()\n Out[13]:\n- Size: 40B\n-array([ inf, inf, 5.169e+115, 2.225e-308, 1.401e-045])\n+ Size: 16B\n+array([-9223372036854727831, -9223372036854746411])\n Through a combination of chaining strategies and fixing arguments, you can\n specify quite complicated requirements on the objects your chained strategy\n will generate.\n In [14]: fixed_x_variable_y_maybe_z = st.fixed_dictionaries(\n ....: {\"x\": st.just(2), \"y\": st.integers(3, 4)}, optional={\"z\": st.just\n (2)}\n ....: )\n@@ -136,35 +145,39 @@\n In [15]: fixed_x_variable_y_maybe_z.example()\n Out[15]: {'x': 2, 'y': 3, 'z': 2}\n \n In [16]: special_variables = xrst.variables(dims=fixed_x_variable_y_maybe_z)\n \n In [17]: special_variables.example()\n Out[17]:\n- Size: 64B\n-array([[-9223372036854775598, -9223372036854727701, -765380011776258377, -\n-9223372035077253998],\n- [ -756101856880953354, -9223372036854737559, -9223372036854723696, -\n-9223372036854759933]])\n+ Size: 128B\n+array([[[-9223372036854734621, -9223372036854734621],\n+ [-9223372036854734621, -9223372036854734621],\n+ [-9223372036854775654, -9223372036854734621],\n+ [-9223372036854734621, -9223372036854734621]],\n+\n+ [[-9223372036854734621, -9223372036854734621],\n+ [-9223372036854754014, 
-9223372036854734621],\n+ [-9223372036854734621, -9223372036854734621],\n+ [-9223372036854775792, -9223372036854734621]]], shape=(2, 4, 2))\n Attributes:\n- \u00cf\u0134\u017feC: {'\u016a\u00e7\u0135\u017b\u017f': array([['NaT', 'NaT']], dtype='timedelta64[Y]'), '':\n-...\n+ L\u00e6: \u0155\n+ \u0173\u00cf\u011e\u016e\u010f: True\n+ \u013a\u0175\u013c: True\n+ \u017fXa\u00e7\u017b: ['\\x8e\\U000b0bbd\\U0009ff28']\n+ \u0150: \u016f\n+ \u017e: True\n+ \u0175\u010c\u00f0\u0149\u017c: True\n+ : \u00cem\n \n In [18]: special_variables.example()\n Out[18]:\n- Size: 12B\n-array([[ 11855, -14645, -2561],\n- [-18322, -22316, -20968]], dtype=int16)\n-Attributes:\n- : False\n- \u00dc: True\n- \u0160\u017d: False\n- \u0166: \u0176\n- \u0136: True\n- 5\u017f\u00ff\u017d\u014f: \u00e9\n+ Size: 6B\n+array([[0, 0, 0],\n+ [0, 0, 0]], dtype=int8)\n Here we have used one of hypothesis\u2019 built-in strategies\n hypothesis.strategies.fixed_dictionaries() to create a strategy which generates\n mappings of dimension names to lengths (i.e. the size of the xarray object we\n want). This particular strategy will always generate an x dimension of length\n 2, and a y dimension of length either 3 or 4, and will sometimes also generate\n a z dimension of length 2. 
By feeding this strategy for dictionaries into the\n dims argument of xarray\u2019s variables() strategy, we can generate arbitrary\n@@ -258,48 +271,43 @@\n ....: array_strategy_fn=xps.arrays,\n ....: dtype=xps.scalar_dtypes(),\n ....: )\n ....:\n \n In [32]: xp_variables.example()\n Out[32]:\n- Size: 4B\n-array([[23794],\n- [-2592]], dtype=int16)\n+ Size: 40B\n+array([-9223372033794027569, -9223372033794027569, -9223372033794027569, -\n+9223372033794027569,\n+ -9223372033794027569])\n Attributes:\n- f8\u017c\u010d: \u017b\u017c\u00b9\n- \u017e\u00d2\u017f\u017cv: None\n- \u0123\u010e\u00eb: None\n- \u017ev: [[2.]]\n- \u0142\u0100\u017e\u0115\u0165: [[-2147469683 -2147483559]\\n [ 2147483646 -2147440432]]\n- \u0165\u00c3u: \u0169\n- \u0106\u00d8\u00d8w\u00e3: False\n- :\n+ \u00e0\u00b2: None\n+ \u00fd: None\n Another array API-compliant duck array library would replace the import, e.g.\n import cupy as cp instead.\n *\b**\b**\b**\b* T\bTe\bes\bst\bti\bin\bng\bg o\bov\bve\ber\br S\bSu\bub\bbs\bse\bet\bts\bs o\bof\bf D\bDi\bim\bme\ben\bns\bsi\bio\bon\bns\bs_\b?\b\u00b6 *\b**\b**\b**\b*\n A common task when testing xarray user code is checking that your function\n works for all valid input dimensions. 
We can chain strategies to achieve this,\n for which the helper strategy unique_subset_of() is useful.\n It works for lists of dimension names\n In [33]: dims = [\"x\", \"y\", \"z\"]\n \n In [34]: xrst.unique_subset_of(dims).example()\n-Out[34]: ['x', 'z']\n+Out[34]: ['x', 'y', 'z']\n \n In [35]: xrst.unique_subset_of(dims).example()\n-Out[35]: ['x', 'z']\n+Out[35]: ['y', 'x', 'z']\n as well as for mappings of dimension names to sizes\n In [36]: dim_sizes = {\"x\": 2, \"y\": 3, \"z\": 4}\n \n In [37]: xrst.unique_subset_of(dim_sizes).example()\n-Out[37]: {'y': 3}\n+Out[37]: {'z': 4, 'x': 2, 'y': 3}\n \n In [38]: xrst.unique_subset_of(dim_sizes).example()\n-Out[38]: {'y': 3, 'x': 2, 'z': 4}\n+Out[38]: {'z': 4, 'x': 2}\n This is useful because operations like reductions can be performed over any\n subset of the xarray object\u2019s dimensions. For example we can write a pytest\n test that tests that a reduction gives the expected result when applying that\n reduction along any possible valid subset of the Variable\u2019s dimensions.\n import numpy.testing as npt\n \n \n"}]}, {"source1": "./usr/share/doc/python-xarray-doc/html/whats-new.html", "source2": "./usr/share/doc/python-xarray-doc/html/whats-new.html", "unified_diff": "@@ -7936,15 +7936,15 @@\n New xray.Dataset.where
method for masking xray objects according\n to some criteria. This works particularly well with multi-dimensional data:
\n In [44]: ds = xray.Dataset(coords={"x": range(100), "y": range(100)})\n \n In [45]: ds["distance"] = np.sqrt(ds.x**2 + ds.y**2)\n \n In [46]: ds.distance.where(ds.distance < 100).plot()\n-Out[46]: <matplotlib.collections.QuadMesh at 0x7f93962802d0>\n+Out[46]: <matplotlib.collections.QuadMesh at 0x7f39570ce710>\n
\n \n
\n \n \n Added new methods xray.DataArray.diff
and xray.Dataset.diff
\n for finite difference calculations along a given axis.
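A small sketch of the finite-difference behaviour, using the modern xarray spelling of the same method (the values are illustrative):

```python
# Sketch of DataArray.diff(): first-order forward differences along a
# named dimension, with n controlling the order. Data is illustrative.
import xarray as xr

da = xr.DataArray([1, 4, 9, 16], dims="x")
d1 = da.diff("x")        # differences of consecutive elements
d2 = da.diff("x", n=2)   # second-order differences
```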
\n", "details": [{"source1": "html2text {}", "source2": "html2text {}", "unified_diff": "@@ -5100,15 +5100,15 @@\n * New xray.Dataset.where method for masking xray objects according to some\n criteria. This works particularly well with multi-dimensional data:\n In [44]: ds = xray.Dataset(coords={\"x\": range(100), \"y\": range(100)})\n \n In [45]: ds[\"distance\"] = np.sqrt(ds.x**2 + ds.y**2)\n \n In [46]: ds.distance.where(ds.distance < 100).plot()\n- Out[46]: \n+ Out[46]: \n _\b[_\b__\bb_\bu_\bi_\bl_\bd_\b/_\bh_\bt_\bm_\bl_\b/_\b__\bs_\bt_\ba_\bt_\bi_\bc_\b/_\bw_\bh_\be_\br_\be_\b__\be_\bx_\ba_\bm_\bp_\bl_\be_\b._\bp_\bn_\bg_\b]\n * Added new methods xray.DataArray.diff and xray.Dataset.diff for finite\n difference calculations along a given axis.\n * New xray.DataArray.to_masked_array convenience method for returning a\n numpy.ma.MaskedArray.\n In [47]: da = xray.DataArray(np.random.random_sample(size=(5, 4)))\n \n"}]}]}]}]}]}