Commit Graph

20 Commits

Author SHA1 Message Date
094394afc2 Add robots.txt checking
Still needs periodic cache refresh
2024-10-23 14:24:10 +03:00
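The robots.txt checking added here can be sketched as a small parser for the common `User-agent` / `Disallow` format. This is a minimal illustration, not the repository's code; the function names are hypothetical, and the periodic cache refresh the commit mentions is left out.

```go
package main

import (
	"fmt"
	"strings"
)

// disallowedPrefixes extracts the Disallow rules that apply to all
// user agents ("*") from a robots.txt body.
func disallowedPrefixes(body string) []string {
	var rules []string
	applies := false
	for _, line := range strings.Split(body, "\n") {
		// Strip comments and surrounding whitespace.
		if i := strings.Index(line, "#"); i >= 0 {
			line = line[:i]
		}
		key, val, ok := strings.Cut(strings.TrimSpace(line), ":")
		if !ok {
			continue
		}
		key = strings.ToLower(strings.TrimSpace(key))
		val = strings.TrimSpace(val)
		switch key {
		case "user-agent":
			applies = val == "*"
		case "disallow":
			if applies && val != "" {
				rules = append(rules, val)
			}
		}
	}
	return rules
}

// allowed reports whether a path may be fetched under the given rules.
func allowed(path string, rules []string) bool {
	for _, r := range rules {
		if strings.HasPrefix(path, r) {
			return false
		}
	}
	return true
}

func main() {
	robots := "User-agent: *\nDisallow: /private/\n"
	rules := disallowedPrefixes(robots)
	fmt.Println(allowed("/private/x.gmi", rules)) // false
	fmt.Println(allowed("/public/x.gmi", rules))  // true
}
```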
1ac250ca6e Revert "refactor: Improve robots.txt parsing and caching"
This reverts commit 6a96fb26cc.
2024-10-23 14:07:14 +03:00
6a96fb26cc refactor: Improve robots.txt parsing and caching 2024-10-23 14:06:56 +03:00
3e01cb1819 refactor: Simplify robots.txt parsing logic 2024-10-23 14:06:55 +03:00
8d9ea6cdec fix: use hostname instead of host in gemini network connection 2024-10-23 12:46:48 +03:00
f9b5fd5e7f fix: Use parsedUrl.Hostname() for TLS SNI 2024-10-23 12:46:03 +03:00
62369d90ae fix: Refactor ConnectAndGetData function to return GeminiPageData struct 2024-10-23 12:46:02 +03:00
e51d84cad8 feat: Implement robots.txt parser 2024-10-23 09:26:39 +03:00
17ef03d621 feat: Batch insert links to improve database performance 2024-10-23 09:05:50 +03:00
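Batching link inserts into one multi-row `INSERT` cuts database round trips from one per link to one per page. A sketch of building such a statement with PostgreSQL `$n` placeholders; the table and column names are assumptions, not taken from the repository's schema:

```go
package main

import (
	"fmt"
	"strings"
)

// batchInsertQuery builds a single multi-row INSERT statement with
// PostgreSQL-style $n placeholders for n (source, target) link pairs.
func batchInsertQuery(n int) string {
	var b strings.Builder
	b.WriteString("INSERT INTO links (source, target) VALUES ")
	for i := 0; i < n; i++ {
		if i > 0 {
			b.WriteString(", ")
		}
		fmt.Fprintf(&b, "($%d, $%d)", 2*i+1, 2*i+2)
	}
	return b.String()
}

func main() {
	fmt.Println(batchInsertQuery(2))
	// INSERT INTO links (source, target) VALUES ($1, $2), ($3, $4)
}
```

The flattened argument slice (source1, target1, source2, target2, …) is then passed to a single `db.Exec` call.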
a2a6bd200a Optimize worker random snapshot selection 2024-10-22 23:37:31 +03:00
3c5206ae43 Change blacklist to comprise domains. 2024-10-22 16:43:55 +03:00
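A blacklist of domains (rather than URL prefixes) matches on the parsed hostname, usually including subdomains. A sketch of such a check; the function name and the decision to treat unparsable URLs as blocked are illustrative assumptions:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// blacklisted reports whether the URL's host matches a blocked domain
// exactly or as a subdomain of it.
func blacklisted(rawURL string, blocked []string) bool {
	u, err := url.Parse(rawURL)
	if err != nil {
		return true // refuse to crawl URLs we cannot parse
	}
	host := u.Hostname()
	for _, d := range blocked {
		if host == d || strings.HasSuffix(host, "."+d) {
			return true
		}
	}
	return false
}

func main() {
	blocked := []string{"spam.example"}
	fmt.Println(blacklisted("gemini://spam.example/x", blocked))     // true
	fmt.Println(blacklisted("gemini://sub.spam.example/x", blocked)) // true
	fmt.Println(blacklisted("gemini://example.org/x", blocked))      // false
}
```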
cd60c1363b Lots of features, first version that reliably crawls Geminispace.
- [x] Concurrent downloading with workers
- [x] Concurrent connection limit per host
- [x] URL Blacklist
- [x] Save image/* and text/* files
- [x] Configuration via environment variables
- [x] Storing snapshots in PostgreSQL
- [x] Proper response header & body UTF-8 and format validation
2024-10-21 20:04:09 +03:00
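The per-host concurrent connection limit in the feature list above is commonly implemented in Go with one buffered channel (a counting semaphore) per hostname. A sketch under that assumption; the type and method names are illustrative, not the repository's:

```go
package main

import (
	"fmt"
	"sync"
)

// hostLimiter caps concurrent connections per host using a counting
// semaphore (buffered channel) per hostname.
type hostLimiter struct {
	mu    sync.Mutex
	limit int
	sems  map[string]chan struct{}
}

func newHostLimiter(limit int) *hostLimiter {
	return &hostLimiter{limit: limit, sems: make(map[string]chan struct{})}
}

func (l *hostLimiter) acquire(host string) {
	l.mu.Lock()
	sem, ok := l.sems[host]
	if !ok {
		sem = make(chan struct{}, l.limit)
		l.sems[host] = sem
	}
	l.mu.Unlock()
	sem <- struct{}{} // blocks while the host is at its limit
}

func (l *hostLimiter) release(host string) {
	l.mu.Lock()
	sem := l.sems[host]
	l.mu.Unlock()
	<-sem
}

func main() {
	lim := newHostLimiter(2)
	var wg sync.WaitGroup
	for i := 0; i < 5; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			lim.acquire("example.org")
			defer lim.release("example.org")
			// At most 2 of these run concurrently per host.
			fmt.Println("fetching", i)
		}(i)
	}
	wg.Wait()
}
```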
212345764b Properly decode URLs 2024-10-10 18:39:27 +03:00
8278f2b204 Proper mimetype parsing, refactoring 2024-10-09 13:31:49 +03:00
91f8e69fdf Work on header parsing & saving other files 2024-10-08 18:16:47 +03:00
7e849feffe Add README.md 2024-10-08 17:28:10 +03:00
c3d6481de0 Add configuration via env vars 2024-10-08 12:42:08 +03:00
74e9327b0b Persist pages to file system 2024-10-07 13:36:20 +03:00
74be6b4d0d Basic functionality 2024-10-04 13:15:07 +03:00
eb963542b7 Initial commit. 2024-10-04 13:14:00 +03:00