iNatting in Real Time - An Experiment (for me)

Mary Kay and I took a couple of hikes in Forest Park in suburban Portland, OR, today. And today's effort was purposefully a little different for me in the field. Because (a) we were hiking in a densely shaded forest, which was giving my small point-and-hope Canon camera fits in the low-light conditions, and (b) we were in an urban park with pretty good cell phone reception and I had my trusty new iPhone 14 at hand, I decided to go full “iPhone-iNaturalist” mode for the day. This was a way not only to accumulate some more observations but, more importantly, to better familiarize myself with the operation of the iOS version of the iNaturalist app. Previously I had occasionally “Explored” an area with the app, but I rarely uploaded observations in real time in the field. This was a chance to try my hand at the latter type of work flow.

So in about 4 hours and 3 miles of hiking (a pretty typical iNat pace for me), I made and uploaded about 75 observations. In my recent uploads (for July 3 from about 10 a.m. to 2 p.m. PDT), this includes all the plants and a few critters.

I came away with the following take-home lessons:

— It’s so much fun and so easy to accumulate observations in real time; this is a drastic reduction in the work required compared to the usual tedious work flow that I’m infamous for. (Am I right? @sambiology, @kimberlietx)

— In addition to the fun of quicker uploads, I think the real-time and repeated access to field identifications is a hugely positive learning tool. I found myself more readily cementing into my neurons the names of new-to-me plants without the need to continually leaf through a field guide or wait until I got back to other references at home. This, of course, is a direct result of being in the field in an area with relatively strong cell phone coverage. Elsewhere in the “boondocks”, this capability wouldn’t exist. Using the app in the field as a handy learning tool is also constrained by the following issue:

— Adding identifications to observations for real-time uploads is still a somewhat frustrating and iffy process. iNat’s Computer Vision is amazing and is improving with every release. However, suggestions for plants are still most commonly placed at genus level in a botanically diverse area like the Pacific Northwest. That means a lot of genus-level uploads and the need to wait until I get back to references (i.e. hard-copy books and full-fledged internet) or wait on others to refine those IDs. Maybe this shows my own impatience or my pathological self-reliance on such matters; I am uncomfortable relying on external “validation” of IDs.

— Because of the limited editing tools available to use on images in the iOS version of the app, I quickly found it necessary (to meet my own demanding standards for cropping, etc.) to take images with the Camera app first, then edit them a little bit (especially cropping) in Photos before grabbing them from the Photos stream to upload them as observations from within the iNat app. If I simply accessed the camera from within the app, I was left with whatever image size, shape, and quality was obtained in real time. The newer iPhones have great cameras, but most often they just don’t meet my standards for iNat submissions. I understand that for the vast majority of cell phone contributors of iNat observations, this is less of an issue. But as the cliché goes, “I have my standards.”

All in all, today’s hiking with the iNat app was a very positive experience for me. I’m sure you’ll be seeing more such uploads from me in the future, but…I’m still old school, so I’m not giving up my beloved field guides, books, MPG, BG, Biodiversity Heritage Library, and other tools. My work flow for the thousands of observations accumulated thus far (and in the future) on the present road trip will still take the pathway characterized by my long-ingrained snail pace!

Posted by gcwarbler on July 4, 2023 at 02:27


Chuck!!! A new app master! :)

One thing that I always try to do with the app is add multiple photos for each observation. With most plants, I take a close-up of the flower/fruit as the first photo, and then a step back for the second photo. If I don't know the plant, I will take a few more additional photos (leaves/stem, etc).

The first photo is really useful for that visual algorithm/computer vision ID (and it's getting crazy good).

Now, this is JUST for plants (or super slow things) -- for the quicker stuff, I tend to use my point and hope camera as well. :)

Posted by sambiology about 1 year ago

Yay! I love that you tried the real-time method and liked it. Now speed that up 100x for a fast-paced bioblitz and you'll be giving Sam a real run for his money! I love having both my camera and cell phone available, but you are right about needing good cell service (and being in a good iNat locality, too). With a little more practice you'll get better about taking the photos in a way that requires less cropping, or at least be more mindful of it. My biggest issue is the direct sun being so bright on the screen that I can't tell if I'm in focus or not. (And then finding out I wasn't. UGH!)

And btw, I feel like 4 hours for 3 miles is still a pretty decent clip for a botanist!

Posted by kimberlietx about 1 year ago

I tried out the app a few years ago when I bought a new Android phone. While I liked the all-in-one experience, cell phone cameras are just way too cumbersome and unpredictable for it to be a pleasant experience. Trying to hold part of a plant while praying that I could get a decent photo with my cell phone in the other hand was immensely frustrating. The lack of macro ability was also a deal breaker. Maybe in the future those features will get better, but for now I will stay with my painfully slow process, probably 10x slower than anyone else's. :)

Posted by rymcdaniel about 1 year ago
