Synology

todo

  • backups for owncloud, ldap databases
  • backups for configuration files
  • pam auth for shell log in
  • samba set up
  • vpn set up
  • don't panic guide
  • new user guide
    • samba
    • owncloud
    • account management, change settings and password

User guide

Web access using SSL

The first time you connect you will get the scary security exception screen. Press on, it's okay really. We use a "self-signed" SSL certificate so that we don't have to pay for one. Really, it's okay.
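For the record, a self-signed certificate of the kind we use can be generated with openssl along these lines. This is only a sketch -- the file names match the commented-out paths in the nginx config further down, and the real certificate may have been made through DSM instead:

# Sketch: make a self-signed certificate valid for ~10 years
openssl req -x509 -nodes -days 3650 -newkey rsa:2048 \
  -subj "/CN=diskstation.trailpeople.net" \
  -keyout owncloud.key -out owncloud.crt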

Administration guide

Creating a new user

  • Do NOT use "Users" and "Groups" in the DSM control panel. We use "Directory Server" (LDAP). This is so we can have one username/password for all services, including Samba (file sharing) and Owncloud (cloud file storage).
  1. Create in LDAP Directory Server
  2. Enable in Shared Files

Manage access to folders in the Control Panel "Shared files".
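The account creation itself happens in the Directory Server UI. For reference, the equivalent with the stock OpenLDAP command line tools looks roughly like this -- a sketch only; the DN layout, admin DN, and attributes below are guesses, so check Directory Server for the real ones:

# Sketch: add a user from the shell instead of the Directory Server UI
cat > newuser.ldif <<'EOF'
dn: uid=jdoe,cn=users,dc=trailpeople,dc=net
objectClass: inetOrgPerson
uid: jdoe
cn: Jane Doe
sn: Doe
mail: jdoe@trailpeople.net
EOF
ldapadd -x -H ldap://localhost -D "uid=root,cn=users,dc=trailpeople,dc=net" -W -f newuser.ldif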

Avatar in LDAP, connect to owncloud

Access from Windows - \\diskstation\ - log in with username@trailpeople.net
Access from Mac - afp://diskstation.local - same credentials, username@trailpeople.net

Command line access via SSH - Use the "admin" account, then use "sudo -s" if you need root access. I put my keys in so I don't need the password to connect. The password is on the label on the server.
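"I put my keys in" is just the usual authorized_keys setup, roughly:

# Copy a public key over once, then log in without the password
ssh-copy-id admin@diskstation.trailpeople.net
ssh admin@diskstation.trailpeople.net
sudo -s   # only when root is actually needed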

RAM upgrade

It comes with 1GB of SO-DIMM RAM and I have read of people putting in up to 16GB. I got the idea from Charles Hooper's blog and I found a nice YouTube video on the same subject.

VPN

Goal: allow remote workers (like me) to connect to and use the Synology from a laptop, from anywhere.

The laptops run Windows and Mac OS X.

We don't need to bridge networks together. (Don't get distracted, Brian! We don't need this right now.)

Here is the Synology tutorial. Refer to the L2TP section.

  1. I enable L2TP for accounts in LDAP.
  2. I enable L2TP
  3. I leave most settings at defaults.
  4. I set the range for dynamic addresses to 172.16.123.0

DNS and DHCP

If I assign DHCP addresses from this box, then I can also handle DNS requests (I push the DNS server setting out with DHCP), and then I can handle DNS for the L2TP clients too. That might work... routing has to be set correctly too, though; we don't want ALL traffic from the remote client routed through the Synology. Just Synology traffic.
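One way to get "just Synology traffic" is to turn off the "use default gateway on remote network" option (or the Mac equivalent) on the client side of the L2TP connection and add a static route for the office subnet by hand. A sketch, assuming the office LAN is 192.168.1.0/24 and the L2TP gateway comes up as 172.16.123.1 (a guess based on the address pool above):

# Windows, elevated prompt: only the office LAN goes through the tunnel
route add 192.168.1.0 mask 255.255.255.0 172.16.123.1

# Mac OS X equivalent
sudo route -n add -net 192.168.1.0/24 172.16.123.1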

I could also just have the outside DNS always point at the inside addresses; then any outside DNS would work -- you'd have to have a tunnel open, but that's fine... this is far easier. We like "easier".

So, for example.com:

example.com and www.example.com are our hosted web site; they always resolve to the web host's public IP.
diskstation.example.com is our NAS; it always resolves to 192.168.1.5.
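In zone-file terms the public DNS ends up looking something like this (the web host address is a documentation placeholder, not our real one):

; public zone for example.com -- sketch only
example.com.              IN A   203.0.113.10
www.example.com.          IN A   203.0.113.10
diskstation.example.com.  IN A   192.168.1.5   ; only reachable over the tunnel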

You can't reach the server from outside without the tunnel but that's by design anyway. So, that's what I will do.

Database engine of choice

I wanted to use PostgreSQL but couldn't get it to work with owncloud. I tried and tried and gave up. Owncloud does not appear to be sending the username to postgres. I dropped back to MariaDB.

Configuration files for PostgreSQL are in /etc/postgresql/. Use a HUP to reconfigure it:

killall -1 /usr/bin/postgres

Misc notes, fix this up someday

Enabled the SSH server (built in, see Control Panel). Used the TrailPeople Gmail account to enable email.

NGINX

When I first got the Syno, I touched the nginx configuration and ended up breaking the DSM app. I backed out my changes.

The file I created for Trailpeople is in /usr/local/etc/nginx/sites-enabled/trailpeople.conf and it looks like this:

server {
  listen 443 ssl;
  listen [::]:443 ssl;

  server_name diskstation.trailpeople.net;
#  ssl_certificate /etc/ssl/nginx/owncloud.crt;
#  ssl_certificate_key /etc/ssl/private/owncloud.key;

  root /volume1/web/trailpeople;
  # set max upload size
  client_max_body_size 10G;
  fastcgi_buffers 64 4K;

  # Disable gzip to avoid the removal of the ETag header
  gzip off;

  # Uncomment if your server is built with the ngx_pagespeed module
  # This module is currently not supported.
  #pagespeed off;

  rewrite ^/caldav(.*)$ /remote.php/caldav$1 redirect;
  rewrite ^/carddav(.*)$ /remote.php/carddav$1 redirect;
  rewrite ^/webdav(.*)$ /remote.php/webdav$1 redirect;

  index index.php;
  location ~ \.php {
    fastcgi_index index.php;
    fastcgi_pass unix:/run/php-fpm/php56-fpm.sock;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_param PATH_INFO $fastcgi_script_name;
    include fastcgi_params;
  }

  location = /robots.txt {
    allow all;
    log_not_found off;
    access_log off;
  }

  location ~ ^/(?:\.htaccess|data|config|db_structure\.xml|README){
    deny all;
  }

  location / {
    # The following 2 rules are only needed with webfinger
    rewrite ^/.well-known/host-meta /public.php?service=host-meta last;
    rewrite ^/.well-known/host-meta.json /public.php?service=host-meta-json last;

    rewrite ^/.well-known/carddav /remote.php/carddav/ redirect;
    rewrite ^/.well-known/caldav /remote.php/caldav/ redirect;

    rewrite ^(/core/doc/[^\/]+/)$ $1/index.html;

    try_files $uri $uri/ =404;
  }

  # Adding the cache control header for js and css files
  # Make sure it is BELOW the location ~ \.php(?:$|/) { block
  location ~* \.(?:css|js)$ {
    add_header Cache-Control "public, max-age=7200";
    # Add headers to serve security related headers
    add_header Strict-Transport-Security "max-age=15768000; includeSubDomains; preload;";
    add_header X-Content-Type-Options nosniff;
    add_header X-Frame-Options "SAMEORIGIN";
    add_header X-XSS-Protection "1; mode=block";
    add_header X-Robots-Tag none;
    # Optional: Don't log access to assets
    access_log off;
  }

  # Optional: Don't log access to other assets
  location ~* \.(?:jpg|jpeg|gif|bmp|ico|png|swf)$ {
    access_log off;
  }

  # ownCloud security tip
  add_header Strict-Transport-Security "max-age=15768000; includeSubdomains; ";
}

Packages

Do not install WebStation! It pulls in Apache, and I don't want that hanging around. Likewise, skip phpMyAdmin because it pulls in WebStation.

  • Synology Directory Service
  • Synology VPN

Enable SynoCommunity: https://synocommunity.com/

For owncloud, I download from owncloud.org because the version in packages is outdated:

wget https://download.owncloud.org/community/owncloud-9.1.2.tar.bz2
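Then unpack it under the web share used elsewhere on this page; roughly:

# Unpack into the web share and hand it to the http user
# (the http:http owner is a guess; the cron entry below runs cron.php as user http)
tar xjf owncloud-9.1.2.tar.bz2 -C /volume1/web/trailpeople/
chown -R http:http /volume1/web/trailpeople/owncloud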

Debian packages

sudo -s
sudo /var/packages/chroot/scripts/start_stop_status chroot
apt-get update
apt-get install locales
dpkg-reconfigure locales
dpkg-reconfigure tzdata
apt-get install php5-dev
apt-get install php5-redis

More packages: ipkg

sudo -s
curl http://ipkg.nslu2-linux.org/feeds/optware/syno-i686/cross/stable/syno-i686-bootstrap_1.2-7_i686.xsh > bootstrap.xsh; bash bootstrap.xsh
ipkg update
ipkg install emacs22 less file git

YAY EMACS FINALLY YAY YAY. This version of "file" will tell you the file information and THEN segfault. You get the information you need followed by an error message. I can live with this.

Docker

There is an official Synology package, just not for the DS416play.

See https://forum.synology.com/enu/viewtopic.php?t=120653 -- especially the note that says to go to the Synology site, download the package for Docker from the DS415+ page, and then use the DSM Package Center manual install option.

Once it's installed it has a GUI under that "building blocks" icon in the upper left of the web page.

My immediate goal is to try to run owncloud 9.1 + redis + php7 + nginx.

The Docker GUI in Synology has a "registry" page and I tried selecting php + Download and got "Failed to query registry". Fine -- so the GUI does not work. Fine, fine; next I tried this instead, from the SSH command line:

docker pull skiychan/nginx-php7

and it appears to download flawlessly. See https://hub.docker.com/r/skiychan/nginx-php7/ for more information on this container. I could click "launch" in the GUI but then I could not apply any command line settings. I think the way to do this is to create a docker file, put it on the Synology, and use the "import" button in the GUI. The suggested command is:

docker run --name nginx -p 8080:80 -d skiychan/nginx-php7

Once it's launched from the command line, I can see how much CPU and RAM it is using in the GUI, which is pretty neat. I can stop it there too. I can bring up a page just for this container and do things like see what is running and connect to a terminal. All pretty cool.
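The same things are available from the command line if the GUI ever acts up; for example (assuming the image ships bash):

docker ps                    # what is running
docker stats nginx           # live CPU / RAM for the container started above
docker exec -it nginx bash   # shell inside the container
docker stop nginx            # stop it again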

PhpMyAdmin

Installed from tar ball to avoid dependencies on Apache.

Owncloud 9

I put it in a subdirectory to simplify DNS. Everything runs under SSL at https://diskstation.trailpeople.net/. So owncloud is at https://diskstation.trailpeople.net/owncloud/.

I loosely followed some instructions I found here; it was a starting point, anyway. He uses Apache and I use nginx. http://www.iholken.com/index.php/2016/03/15/guide-for-installing-owncloud-9-to-synology-nas-running-dsm-6/

Optimizations, fixed because owncloud told me to:

  • Add /dev/urandom to open_basedir in /usr/local/etc/php56/conf.d/user-settings.ini
  • Add "always_populate_raw_post_data = -1"
  • Send a HUP to php-fpm
cat fpm.d/env.conf 
; bwilson added this for owncloud

;env[HOSTNAME] = $HOSTNAME
env[PATH] = /usr/local/bin:/usr/bin:/bin
;env[TMP] = /tmp
;env[TMPDIR] = /tmp
;env[TEMP] = /tmp
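Spelled out, the two ini edits and the reload look roughly like this -- a sketch; keep whatever open_basedir already contains and just append /dev/urandom, and note the php-fpm process name may differ on DSM:

# /usr/local/etc/php56/conf.d/user-settings.ini (append, do not replace)
#   open_basedir = <existing paths>:/dev/urandom
#   always_populate_raw_post_data = -1

# then tell php-fpm to re-read its configuration
killall -HUP php-fpm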

Crontab

Change the shell of the http user from /bin/false to /bin/sh and add this to /etc/crontab:

0,15,30,45  *   *   *   *   root    su -c "/usr/local/bin/php56 -f /volume1/web/trailpeople/owncloud/cron.php" http

There are odd, specific rules for adding things to /etc/crontab; see http://jimmybonney.com/articles/manage_crontab_synology/

User authentication

LDAP Directory Service : Synology OpenLDAP

Clients:

  • Owncloud
  • Mediawiki
  • Synology DSM
  • Samba
  • AppleTalk

Synology has a pretty good UI in DSM for LDAP, so I enabled their Directory Service package, then set up owncloud to use it. When owncloud is using LDAP, you create the account in LDAP, and the first time the user logs in to owncloud the account is created there.

After you enable the LDAP app, configure the LDAP settings to use "localhost" as the server; it should detect port 389. Leave the DN and password BLANK. Set the Base DN manually to 'dc=trailpeople,dc=net'. If you put a user/password in there and then change the password later, owncloud will suddenly stop working.
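A quick sanity check of the Base DN before pointing owncloud at it (ldapsearch is from the OpenLDAP client tools; on the Synology it may have to be run from the Debian chroot mentioned above):

# Anonymous bind against the local Directory Server; should list some DNs
ldapsearch -x -H ldap://localhost -b 'dc=trailpeople,dc=net' '(objectClass=*)' dn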

Owncloud config.php

<?php
$CONFIG = array (
  'instanceid' => 'ocarb6oq5tsb',
  'passwordsalt' => 'WOO1qwVT6iOCp6ycWp4lZ8GlNVv9y4',
  'secret' => 'FtvmpxpedQGTqwrxy7u+b8Ye5HMgXUmXzBlSlxROfogExbs8',
  'trusted_domains' => 
  array (
    0 => 'diskstation',
    1 => 'diskstation.trailpeople.net',
    2 => '192.168.1.5',
  ),
  'datadirectory' => '/volume1/web/trailpeople/owncloud/data',
  'overwrite.cli.url' => 'https://diskstation.trailpeople.net/owncloud',
  'dbtype' => 'mysql',
  'version' => '9.1.2.5',
  'dbname' => 'owncloud',
  'dbhost' => 'localhost',
  'dbtableprefix' => 'oc_',
  'dbuser' => 'owncloud',
  'dbpassword' => 'XXXXXXXX',
  'logtimezone' => 'UTC',
  'installed' => true,
  'memcache.local' => '\\OC\\Memcache\\Redis',
  'redis' => 
  array (
    'host' => 'localhost',
    'port' => 6379,
  ),
  'ldapIgnoreNamingRules' => false,
  'mail_from_address' => 'owncloud',
  'mail_smtpmode' => 'smtp',
  'mail_domain' => 'trailpeople.net',
  'mail_smtphost' => 'smtp.gmail.com',
  'mail_smtpport' => '587',
  'loglevel' => 2,
  'mail_smtpsecure' => 'tls',
  'mail_smtpauthtype' => 'LOGIN',
  'mail_smtpauth' => 1,
  'mail_smtpname' => '[email protected]',
  'mail_smtppassword' => 'XXXXXXXX',
);
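With config.php in place, owncloud's occ tool can double-check things. Run it as the http user with the same php56 binary the cron entry uses -- a sketch, not something I have scripted:

cd /volume1/web/trailpeople/owncloud
sudo -u http /usr/local/bin/php56 occ status
sudo -u http /usr/local/bin/php56 occ config:list system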

Mediawiki

I installed it at /volume1/web/trailpeople/wiki by downloading and unpacking the tar ball from mediawiki.org to avoid the Apache package dependencies (and the outdated version) from Synology.

So it's accessible as https://diskstation.trailpeople.net/wiki/.

It keeps its data in MySQL; the username and the database name are both "mediawiki".

LDAP authentication

After installing the LDAP plugin I had to fix up the database:

cd wiki/maintenance
/usr/local/bin/php56 update.php

VPN

Configuring and testing the Synology VPN.

The Synology will be deployed in a remote office (HQ) and accessed from home offices (SAT). I need to set up a test network to try it out before deployment.

Config #1, remote LAN at satellite office

Config #1 uses the Synology VPN to route traffic between the server and a Mikrotik router (SAT ROUTER) to get packets to my home LAN. I need this setup to work through some generic router provided by Sonic.

SYN <--> HQ ROUTER <-->   ( INTERNET )   <--> MT ROUTER <--> LAN

I have full control on the configuration of the Mikrotik and LAN. I can get ports opened on the HQ router.

My normal configuration

LAN <--> Mikrotik LAN port - [MT ROUTER] - Mikrotik WAN port <-> Comcast router in bridge mode <-> (Internet)

Test setup: configure a dedicated port on the MT, and put a separate DHCP pool on it to simulate a second LAN. Set up L2TP on the Syno and the MT and route traffic between the Syno and the LAN.

LAN <--> Mikrotik LAN port - [MT ROUTER] - Mikrotik WAN port <-> Comcast router in bridge mode <-> (Internet)

SYNO <-> Mikrotik SYNO port -----|

I will put a spare ethernet switch into the SYNO port so that I can put a laptop over there to configure the Syno and make sure it is working. Then I will set up the L2TP/IPSEC on Syno and MT.
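On the MT side, the L2TP/IPsec client plus a route for the Syno's subnet is only a couple of lines of RouterOS. A sketch with placeholder addresses and credentials (angle brackets mark the values I don't know yet):

# RouterOS terminal on the MT
/interface l2tp-client add name=l2tp-syno connect-to=<HQ public IP> \
    user=<vpn user> password=<vpn password> use-ipsec=yes ipsec-secret=<psk> disabled=no
/ip route add dst-address=<Syno LAN subnet> gateway=l2tp-syno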

Config #2, remote workstation

This will be the more typical setup. There is no satellite router for a LAN, just a single computer running the L2TP client.

SYN <--> HQ ROUTER <--> (INTERNET) <--> SAT ROUTER <--> HOME COMPUTER

I won't know what HQ ROUTER and SAT ROUTER are, but I can get ports opened on SAT ROUTER.