Answering my own question:
1) If you have to massage the data (e.g., adding an intensity column so it displays, or converting feet to meters), Excel can only handle about a million points.
2) I've had no trouble importing 4+ million points (six files) and filtering so only 20% appear. I'm confident it would still work with less filtering (more points displayed).
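For the "massage" step, a short script sidesteps Excel's ~1M-row limit entirely. This is a minimal sketch, assuming a whitespace-delimited XYZ text file with coordinates in feet; the file names, the fixed placeholder intensity, and the `massage_*` function names are my own, not anything BTB requires.

```python
FEET_TO_METERS = 0.3048
DEFAULT_INTENSITY = 100  # hypothetical placeholder so viewers that expect intensity will display the point

def massage_line(line, intensity=DEFAULT_INTENSITY):
    """Convert one 'x y z' line from feet to meters and append an intensity value."""
    x, y, z = (float(v) * FEET_TO_METERS for v in line.split())
    return f"{x:.3f} {y:.3f} {z:.3f} {intensity}"

def massage_file(src, dst):
    """Stream the file line by line so memory use stays flat even for millions of points."""
    with open(src) as fin, open(dst, "w") as fout:
        for line in fin:
            if line.strip():
                fout.write(massage_line(line) + "\n")

# Quick check: 10 feet converts to 3.048 m
print(massage_line("10 0 0"))  # 3.048 0.000 0.000 100
```

Because it streams one line at a time, the point count is limited only by disk space, not by a spreadsheet's row cap.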
On my computer, there's a significant delay when selecting many points for classification with the lasso or polygon tool; it's best to use the rectangle tool for speed and save the other tools for refining.
4+ million points is working well in BTB. Detail level is high along the aerial scan lines and there's less than a meter between scan lines, so it's probably in the intermediate density range. I classified the points in BTB and am gradually making progress... it's a historic track that was turned into a golf course, so sussing out the original circuit's route is, uh, fiddly because there's no obvious pavement. I'm doing it in point-to-point segments as a guide for the final circuit. The terrain will look fabulous!
Another historic track has a sparse-density aerial scan. Because most of the track still exists, the points have been easier to classify; I just wish the scan were a little higher density.
Went to look for LIDAR data for my home track, Portland International Raceway, and discovered our government's most recent scan contains 34 million points... hmm. Don't think I'm up to working with that much data just yet, but it is available if someone is looking for a project!
I'm having the pleasure of working with a dense lidar cloud sample and finally got an "out of system memory" error somewhere above 50 million points (unfiltered)... now trying to pin down that upper limit.
Right around 60 million points is where the error pops up. You really need to leave memory available for making the track & terrain & adding objects, so importing no more than about 30 million points probably makes sense.
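One simple way to get a ~60M-point file under that ~30M budget before import is to thin it by keeping every Nth point. A minimal sketch, assuming one point per line in a plain-text file; the `decimate` name and the sample data are mine:

```python
def decimate(lines, keep_every=2):
    """Yield every `keep_every`-th line; keep_every=2 roughly halves the cloud."""
    for i, line in enumerate(lines):
        if i % keep_every == 0:
            yield line

# Demo on a stand-in list; for a real file you'd pass the open file object,
# which streams line by line and never loads the whole cloud into memory.
points = [f"point {i}" for i in range(10)]
thinned = list(decimate(points, keep_every=2))
print(len(thinned))  # 5
```

Uniform thinning loses detail evenly everywhere; if the scan-line density varies a lot, cropping to the area around the track first would keep more of the detail you actually need.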