Suck

Table of contents

  1. General Information
  2. Requirements
  3. How to fetch the remote articles
  4. How to send the local posts
1. General Information
Public news servers have one main disadvantage: they are slower than commercial ones. Those who need more speed should set up a local news server and configure an external program to download the articles from a real news server. This process (usually called a suck feed) is divided into three steps:
  1. First of all, a program (called a news agent) scans the local host looking for the groups that have to be updated
  2. Acting as a standard client, the agent connects to an external news server and downloads all unseen articles (the messages posted after the last time the agent was executed)
  3. Finally, it sends the downloaded articles to the local host using the NNTP commands designed for server-to-server communication (so in this step the agent acts as a normal news server), as sketched below
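As an illustration, the dialogue below sketches, in a very simplified form, the NNTP exchange behind steps 2 and 3: the agent first talks to the remote server as a reader, then offers the same article to the local server with IHAVE. The group name, article number, message-id and response texts are invented for the example.

# Step 2: the agent acts as a reader of the remote server
GROUP it.test
211 56 4178 4233 it.test
ARTICLE 4233
220 4233 <example-id@news.aioe.org> article follows
(headers and body)
.

# Step 3: the agent offers the same article to the local server
IHAVE <example-id@news.aioe.org>
335 Send it
(headers and body)
.
235 Article transferred OK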
After transferring the remote messages, the news agent looks for the articles posted by users on the local host and sends them to the remote server (acting, again, as a standard client). It is important to remember that the download of remote articles and the dispatch of local ones are separate processes which have to be configured separately.

Suck feeds can become dangerous for the external news server: a misconfigured news agent could try to download a huge number of articles or might post multiple copies of the same message. Even if the Aioe.org server has many security measures against these accidents, we do not tolerate this kind of mistake by our users. When the refresh interval (the pause between two checks made by the news agent) is too short, the server suffers a useless waste of system resources. Those who wish to establish a suck feed from our server have to respect a few rules. Trespassers will be banned forever.
  1. A single connection has to be used to download all articles. This implies that programs like suckmt are forbidden.
  2. Only about 50 connections per day are allowed. A connection every 30 minutes is more than enough for almost all users.
  3. Each IP address can download at most 30 MB of data per day.
2. Requirements
First of all, it is necessary to install a news server. The best-known NNTP server for UNIX is InterNetNews (INN). Debian users can type (as root):
$ apt-get install inn2
If the news server is installed on the same host used to read the news, the default configuration shipped with the Debian package does not need to be changed. Otherwise, read and post privileges must be assigned to the local IP addresses in order to allow remote clients to contact the server. Those who wish to use an intranet server should add the following lines at the end of /etc/news/readers.conf:
auth "Local network" {
    hosts: "!*, 10.0.0.0/8, 192.168.0.0/16"
    default: "local"
}

access "Local network" {
    users: "local"
    read: "*,!junk,!control*"
    post: "*,!junk,!control*"
}
Each time the INN configuration files are changed, INN must be reloaded with:
$ ctlinnd reload all "a reason"
When the server is running, new groups can be created with the following command:
$ ctlinnd newgroup [your_group] [group_type]
Once the server is running, a news agent has to be installed. This tutorial explains how to configure Suck, the most popular news agent for UNIX; we strongly suggest choosing this program because it is safe, stable and well written. Debian users can install it with apt-get:
$ apt-get install suck
3. How to fetch the remote articles
All groups which will be kept synchronized have to be created on the local host. When a new group is created, the news agent starts downloading from the external server the articles sent to that group, and when a group is removed suck automatically stops keeping it synchronized. For instance, in order to create it.test and it.test.moderato (which is a moderated group) the right commands are:
$ ctlinnd newgroup it.test y            # Free posting group (y)
$ ctlinnd newgroup it.test.moderato m   # Moderated group    (m)
The news agent (suck) is a program that has to be executed on a periodic basis. The minimum allowed refresh interval is about 30 minutes, but we suggest a somewhat longer one (a scan roughly every 45 minutes). Most of those who choose this method to download the news configure their agent to contact the server every hour at minutes 00 and 30 (14:00, 14:30, 15:00 and so on). This is a severe problem because they all try to establish a connection at the same time and the server can become overloaded.
Those who are using suck should add to the crontab an entry like this:
*/44 * * * *  suck news.aioe.org -A -bp -hl localhost -c -i 200 -MnQ
The suck manual page explains the exact meaning of each command-line option. We strongly suggest following the example above.
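For instance, a crontab entry that runs once an hour while staying away from the crowded minutes could look like the sketch below; minute 23 is an arbitrary choice and any uncommon value works just as well.

# once an hour, at an uneven minute, to avoid the peak at :00 and :30
23 * * * *  suck news.aioe.org -A -bp -hl localhost -c -i 200 -MnQ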
4. How to send the local posts
First of all, innd must be configured to save a reference to every local article in a predefined file. This is done by adding a structure like this to /etc/news/newsfeeds:
my_feed :*, !control*, !junk* :Tf,H2,Wn :
#  [1]          [2]          [3]
In this way, innd saves in /var/spool/news/outgoing/my_feed [1] a reference to every article posted to any group except control.* and junk [2]. The flags [3] state that the feed is written to a file (Tf), that for each article the file contains only the internal storage token (Wn) and that only posts with fewer than three entries in their Path header are listed (H2). This is a fast and easy way to separate local articles from remote ones.
Those who are using a suck feed must send us only their local articles, without bouncing back the ones that were already received from us. Trespassers will be quickly banned forever.
Every time /etc/news/newsfeeds is modified, innd must be reloaded using:
$ ctlinnd reload newsfeeds some_reason
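If in doubt, the syntax of the modified newsfeeds file can also be checked beforehand; ctlinnd reports any problem it finds with:
$ ctlinnd checkfile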
If everything works well, after a local post /var/spool/news/outgoing/my_feed will include a line like this:
$ cat /var/spool/news/outgoing/my_feed
@03004F4E4500000000000001E1BD000000A7@
An external script, usually invoked by cron, then reads the spool file, finds the article that corresponds to each token and finally posts it to the remote server through rpost, a program shipped with suck.
Before posting an article to the remote server, the local script which handles the outgoing messages must remove several headers (notably NNTP-Posting-Host and Xref) from every post. Articles that still include these forbidden headers are rejected by the server.
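Putting the last two paragraphs together, the script below is a minimal sketch of such a handler. It assumes that INN uses the storage API (so sm can turn every token back into the full article), that rpost is used in its basic mode, reading one article from standard input, and that the feed is the my_feed of the examples above; the exact rpost options should be checked on its manual page. A real script should also keep the articles that rpost fails to send for a later retry, which this sketch does not do.

#!/bin/sh
# Minimal outgoing-feed sketch, to be run from cron as the news user.
REMOTE=news.aioe.org
FEED=my_feed
SPOOL=/var/spool/news/outgoing/${FEED}
WORK=${SPOOL}.work

[ -s "${SPOOL}" ] || exit 0          # nothing queued, nothing to do
mv "${SPOOL}" "${WORK}"
ctlinnd flush ${FEED}                # make innd reopen a fresh spool file

while read -r token; do
    # sm prints the article that matches the token; awk drops the forbidden
    # NNTP-Posting-Host and Xref headers (header block only); rpost then
    # sends the cleaned article to the remote server.
    sm "${token}" | awk 'BEGIN { hdr = 1 }
        hdr && /^$/ { hdr = 0 }
        hdr && /^(NNTP-Posting-Host|Xref):/ { next }
        { print }' | rpost ${REMOTE}
done < "${WORK}"

rm -f "${WORK}"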