I've been developing with Access since 1994. I fell in love with Access because of its ability to "Access" various ODBC data sources. Very few of my applications use Jet/ACE databases. The majority use SQL Server because most shops have that database installed these days. But over the years, I've worked with Oracle, DB2, Sybase, and names you wouldn't even recognize. The one common thread is that in ALL cases, I used linked tables. In only one situation did I find that I needed a pass-through query, and that was to delete rows in a temp table. I ended up using TRUNCATE because that was the most efficient method of clearing a table.
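For what it's worth, that TRUNCATE is about the simplest possible pass-through. A minimal DAO sketch of the idea (the DSN, schema, and table name below are placeholders, not anything from the actual app):

```vba
' Clear a server-side temp table with a temporary pass-through querydef.
' "MyServerDSN" and "dbo.tmpStaging" are assumed names -- substitute your own.
Dim qd As DAO.QueryDef
Set qd = CurrentDb.CreateQueryDef("")     ' "" = temporary, unsaved querydef
qd.Connect = "ODBC;DSN=MyServerDSN;"      ' your ODBC connect string
qd.SQL = "TRUNCATE TABLE dbo.tmpStaging"  ' runs verbatim on the server
qd.ReturnsRecords = False                 ' required for action queries
qd.Execute dbFailOnError
```

TRUNCATE can't be expressed through a linked table, which is why this one case justified a pass-through.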
It is a little-known fact that in ALL cases, Access attempts to "pass through" every single query to the server for processing, so creating specialized pass-through queries saves little processing time and creates significant work on your part. You can certainly defeat the query processor and make it bring back bazillions of unnecessary records, so it is important that you know what causes the problems. When you use Access querydefs against linked tables, there is a little overhead involved because the query processor needs to figure out what it needs to send to the server. With a pass-through query, the query processor simply sends the query without further ado.

If you use saved querydefs rather than dynamic SQL, Access "compiles" the querydef the first time you run it and saves the query plan, which saves time on subsequent executions. When you use dynamic SQL (SQL strings in VBA code), Access has to "compile" the query each and every time it runs because it has no way of saving the execution plan. This slows down the application unnecessarily if a query is run repeatedly in a session, and it is also responsible for database bloat. It takes a lot of workspace to calculate an execution plan, and Access has no way to recover free space until the database is compacted. So if you use this technique (SQL strings in VBA), it is imperative that you compact frequently. In fact, I would suggest compact-on-close to reduce the bloat.
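To make the saved-querydef point concrete, here is a hedged sketch. The query name, parameter name, and form control are invented for illustration; the pattern is what matters:

```vba
' Prefer a saved, parameterized querydef: the plan is compiled once and reused.
' "qryOrdersByCustomer" with parameter [prmCustID] is an assumed example.
Dim qd As DAO.QueryDef
Dim rs As DAO.Recordset
Set qd = CurrentDb.QueryDefs("qryOrdersByCustomer")
qd.Parameters("[prmCustID]") = Me.txtCustID
Set rs = qd.OpenRecordset(dbOpenSnapshot)

' The dynamic-SQL alternative forces a fresh compile (and bloat) on every call:
' strSQL = "SELECT * FROM Orders WHERE CustID = " & Me.txtCustID
' Set rs = CurrentDb.OpenRecordset(strSQL)
```

The parameterized version also sidesteps the quoting and injection headaches that come with concatenating values into SQL strings.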
Every case is different, so you need to determine whether it is better to summarize data on the server and bring the result set into an Access table for further summarization or for use with multiple reports, or to simply bind the reports to queries on linked tables. One application I developed for a regional bank processed millions of transactions. What I found was that I could summarize the data to a certain point on the server and download the result, where it fed a dozen reports. Other parts of the application that worked only with current transactions simply used queries against linked tables.
One thing I would suggest, if you stick with the download method, is to use a separate database to hold the downloaded tables. That eliminates the bloat caused by creating/deleting tables or appending/deleting rows. At the start of the download, you either copy a pre-defined empty database or create one on the fly, then download all the data into that separate database. That way, you start every session with a clean slate.
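The copy-a-template approach can be as little as a few lines. A sketch, assuming made-up paths and names ("Template_Empty.accdb", "tmpSummary", "qryServerSummary"):

```vba
' Start each session with a fresh side database for downloaded data.
' All paths and object names below are placeholders.
Const TEMPLATE_DB As String = "C:\App\Template_Empty.accdb"
Const WORK_DB As String = "C:\App\WorkData.accdb"

FileCopy TEMPLATE_DB, WORK_DB          ' overwrites last session's copy

' Append the server summary into the side database using an IN clause,
' so the front end never bloats from the churn:
CurrentDb.Execute _
    "INSERT INTO tmpSummary IN '" & WORK_DB & "' " & _
    "SELECT * FROM qryServerSummary", dbFailOnError
```

Because the churn happens in the throwaway file, the front end itself stays a stable size and never needs a compact just to recover download workspace.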
Nesting the pass-through query inside a SELECT ... INTO obfuscates the messages sent back by the server. That's why you no longer see the affected-records count. Regarding your other question, Access is extremely sensitive to network blips. If Access loses touch with the network frequently, you need to get your network people motivated to find the problem, or live with it. One nasty problem that is difficult to track down is a faulty NIC (network card), which may not even be in your own computer.
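If you need that row count back, one option is to run the action pass-through on its own with Execute and read RecordsAffected, rather than burying it in a local SELECT ... INTO. A sketch (the querydef name is an assumption, and RecordsAffected reflects whatever count the ODBC driver reports):

```vba
' Run the action pass-through directly so the row count is visible in code.
' "qptDeleteStaged" is an assumed saved pass-through querydef.
Dim qd As DAO.QueryDef
Set qd = CurrentDb.QueryDefs("qptDeleteStaged")
qd.ReturnsRecords = False
qd.Execute dbFailOnError
Debug.Print qd.RecordsAffected & " rows affected on the server"
```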