doi: 10.5281/zenodo.8306375 , 10.5281/zenodo.15025383 , 10.5281/zenodo.7786918 , 10.5281/zenodo.11002059 , 10.5281/zenodo.17934776 , 10.5281/zenodo.8128150 , 10.5281/zenodo.17160350 , 10.5281/zenodo.15502395 , 10.5281/zenodo.14028422 , 10.5281/zenodo.11198070 , 10.5281/zenodo.7958348 , 10.5281/zenodo.17302661 , 10.5281/zenodo.11194440 , 10.5281/zenodo.8128004 , 10.5281/zenodo.7922510 , 10.5281/zenodo.11463315 , 10.5281/zenodo.13317000 , 10.5281/zenodo.7933212 , 10.5281/zenodo.7927971 , 10.5281/zenodo.8231168 , 10.5281/zenodo.7922509 , 10.5281/zenodo.16965228 , 10.5281/zenodo.8165270 , 10.5281/zenodo.15708647 , 10.5281/zenodo.8169913 , 10.5281/zenodo.15787828 , 10.5281/zenodo.10989701 , 10.5281/zenodo.7746815 , 10.5281/zenodo.15502299 , 10.5281/zenodo.7726284 , 10.5281/zenodo.11111824
Jarvis can now work completely offline!

This release adds two new model interfaces.

Google PaLM: If you have access to it (it's free), you can use it for chat and for related notes.

Custom OpenAI-like APIs: This allows Jarvis to use custom endpoints and models that expose an OpenAI-compatible interface.

- Example: [tested] OpenRouter (for ebc000) setup guide
- Example: [not tested] Azure OpenAI (previously requested)
- Example: [tested] Locally served GPT4All (for laurent, and everyone else who showed interest) setup guide. This is an open source, offline model (in fact, you may choose from several available models) that you can install and run on a laptop. It can be used for chat, and potentially also for related notes (embeddings didn't work for me, but related notes already supports the USE offline model).

This solution for an offline model is not ideal, since it may be technically challenging for a user to run their own server, but at the moment this workaround looks like the only viable option, and it doesn't involve many steps.
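To illustrate what "OpenAI-compatible interface" means in practice, here is a minimal sketch of how a client forms a request to such an endpoint. The base URL is an assumption for illustration (GPT4All's local server is commonly reported to listen on port 4891; check your own server's settings), and the model name is hypothetical:

```python
import json

# ASSUMPTION: example base URL for a locally served GPT4All instance.
# Substitute your own endpoint (e.g. an OpenRouter or Azure OpenAI URL).
BASE_URL = "http://localhost:4891/v1"

def build_chat_request(model, prompt, temperature=0.7):
    """Build the URL and JSON body for an OpenAI-compatible
    /chat/completions call. No network I/O here; this only shows
    the request shape that any compatible backend accepts."""
    url = f"{BASE_URL}/chat/completions"
    body = {
        "model": model,                                        # backend-specific model name
        "messages": [{"role": "user", "content": prompt}],    # chat-style message list
        "temperature": temperature,
    }
    return url, json.dumps(body)

# "gpt4all-falcon" is a placeholder model name, not a guaranteed identifier.
url, body = build_chat_request("gpt4all-falcon", "Summarize my note.")
print(url)
```

Because every backend in the list above accepts this same request shape, swapping providers is a matter of changing the base URL and model name rather than rewriting the client.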
