Synology
To-do
Remove the SHR storage pool and create a RAID 1 pool. Create a volume. Use BTRFS if available.
- Let's Encrypt
- SSH
- backups for LDAP databases
- backups for configuration files
- PAM auth for shell login
- Samba setup
- VPN setup
- don't-panic guide
- new user guide
- Samba
- account management, change settings and password
Interesting stuff in the packages folder:
- various backups
- syncing
Go look in the folder.
User guide
Web access using SSL
The first time you connect you will get the security exception screen. Press on, it's okay really. We use a "self-signed" SSL certificate so that we don't have to pay for one. Really, it's okay.
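If you want to verify what you are accepting, you can look at the certificate from any machine with OpenSSL first. A minimal check, assuming you are hitting the main HTTPS site on port 443 (use :5001 instead if you are connecting to DSM itself):
# Show the certificate's subject and validity dates before accepting it
openssl s_client -connect diskstation.trailpeople.net:443 -servername diskstation.trailpeople.net </dev/null 2>/dev/null | openssl x509 -noout -subject -dates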
Administration guide
Creating a new user
- Do NOT use "Users" and "Groups" in the DSM control panel. We use "Directory Server" (LDAP). This is so we can have one username/password for all services including Samba (file sharing) and Owncloud (cloud file storage).
- Create in LDAP Directory Server
- Enable in Shared Files
Manage access to folders in the Control Panel "Shared files".
Avatar in LDAP, connect to owncloud
Access from Windows: \\diskstation\ (log in with username@trailpeople.net)
Access from Mac: afp://diskstation.local (same credentials, username@trailpeople.net)
Command line access via SSH - Use the "admin" account, then use "sudo -s" if you need root access. I put my keys in so I don't need the password to connect. The password is on the label on the server.
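If you also want passwordless logins, something like this from your own machine should do it (assuming you already have an SSH key pair, and that "diskstation" resolves to the box on your network):
# Copy your public key into admin's authorized_keys, then log in without the password
ssh-copy-id admin@diskstation
ssh admin@diskstation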
File sharing
In Control Panel, go to File Sharing -> File Services
File sharing is enabled by default; there you can disable it or change other settings. I enabled SMB 3 in Advanced Settings, which only affects Windows 10+.
SMB via Windows: \\diskstation
SMB via Mac: smb://diskstation
For Macs, AppleShare is enabled by default too and it is possible to set up Time Machine.
afp://diskstation.local
For Linux, it is possible to enable NFS.
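For what it's worth, a Linux box can also mount the SMB share directly. A rough sketch only; the share name "shared" and the mount point are placeholders, you need cifs-utils installed, and the username format may need tweaking:
# Mount a Synology SMB share from Linux (prompts for the password)
sudo mkdir -p /mnt/diskstation
sudo mount -t cifs //diskstation/shared /mnt/diskstation -o username=yourname@trailpeople.net,vers=3.0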
Cloud services
The whitepaper: https://global.download.synology.com/download/Document/WhitePaper/Synology_Cloud_Sync_White_Paper-Based_on_DSM_6.0.pdf
Cloud Station Server is Synology's version of synchronized file sharing.
I tried the Linux client for Ubuntu which seems to run fine on Debian Jessie, and when I installed it on Bellman from my Mac it even properly launched its GUI on the Mac! I decided not to use it since I don't need two way sync with that server. I am looking at running rsync to back the Syno up.
It installs into /opt/Synology/CloudStation and keeps its configuration files in ~/.CloudStation/. You can start the GUI with synology-cloud-station-drive which is in /usr/bin. It integrates with Nautilus.
Cloud Station Server ShareSync will sync files between two Synology Cloud Stations.
Cloud Sync lets you interact with services like Dropbox and OneDrive.
Maintenance: Backups
Options from Packages:
- Hyper Backup
- rsync
- Cloud Sync - could possibly do a one way sync to a server of my choice
Databases: LDAP, Mediawiki
Files:
I can use rsync over an SSH tunnel with a command similar to this:
rsync -avn -e "ssh -v -p 2222" trailpeople.synology.me:src dest
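The -n makes that a dry run. A real pull-style backup would look more like the sketch below; the local destination and log file path are placeholders, and /volume1/web is just one tree worth copying:
# Pull /volume1/web from the Synology into a local backup tree, mirroring deletions
rsync -av --delete --log-file=/root/syno-backup.log -e "ssh -p 2222" trailpeople.synology.me:/volume1/web/ /backups/syno/web/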
Maintenance: RAID scrubbing
You are supposed to do this at least once a month and the system will send you reminders about it. This process slows everything down so only do it when there is a weekend or something. Because it's resource intensive and slow they don't let you schedule it.
- Storage Manager -> Volume Manager Wizard
- Start Data Scrubbing
- Next
- RAID scrubbing
- start it
It gives a red warning message on the Storage Manager page while it's running. If you shut down while it's running it has to start all over again.
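If you are impatient you can also watch progress from the command line; this assumes the volume sits on Linux md devices, which is how Synology builds its RAID:
# Show resync/check progress for the md arrays backing the volume
cat /proc/mdstat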
Hardware upgrade: RAM
The Diskstation comes with 1GB of SO-DIMM RAM and I have read of people putting in up to 16GB. I got the idea from Charles Hooper's blog and I found a nice Youtube video on the same subject.
VPN
Goal: allow remote workers (like me) to connect to and use the Synology from a laptop, from anywhere.
VPN is nice of course but if your access is via HTTPS (owncloud and webdav and synology port 5001) then it's already encrypted. If your access is via ssh, it's already encrypted. So for the time being, I am not using VPN at all.
It is a matter of controlling port access in the firewall. If you have a sophisticated firewall router (even a Mikrotik which is what I use), use that. Otherwise you can use the Synology firewall.
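Purely as an illustration (this is generic Linux iptables, not the Mikrotik or Synology firewall UI, and the address/ports are from this setup), limiting outside access to just DSM HTTPS and SSH might look like:
# Allow only DSM HTTPS (5001) and SSH through to the Syno at 192.168.1.5, drop the rest.
# These rules must come before any broader accept rules on the router.
iptables -A FORWARD -p tcp -d 192.168.1.5 --dport 5001 -j ACCEPT
iptables -A FORWARD -p tcp -d 192.168.1.5 --dport 22 -j ACCEPT
iptables -A FORWARD -d 192.168.1.5 -j DROP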
If you are intent on using the Synology VPN then I have a page on setting it up. Synology VPN
Database engine of choice
I wanted to use PostgreSQL but couldn't get it to work with owncloud. I tried and tried and gave up. Owncloud does not appear to be sending the username to postgres. I dropped back to MariaDB.
Configuration files for PostgreSQL are in /etc/postgresql/. Use a HUP to reconfigure it:
killall -1 /usr/bin/postgres
Misc notes, fix this up someday
Enabled the SSH server (built in, see Control Panel). Used the TrailPeople Gmail account to enable email.
NGINX
When I first got the Syno, I touched the nginx configuration and ended up breaking the DSM app. I backed out my changes.
The file I created for Trailpeople is in /usr/local/etc/nginx/sites-enabled/trailpeople.conf and it looks like this:
server {
    listen 443 ssl;
    listen [::]:443 ssl;
    server_name diskstation.trailpeople.net;
    # ssl_certificate /etc/ssl/nginx/owncloud.crt;
    # ssl_certificate_key /etc/ssl/private/owncloud.key;
    root /volume1/web/trailpeople;

    # set max upload size
    client_max_body_size 10G;
    fastcgi_buffers 64 4K;

    # Disable gzip to avoid the removal of the ETag header
    gzip off;

    # Uncomment if your server is built with the ngx_pagespeed module
    # This module is currently not supported.
    #pagespeed off;

    rewrite ^/caldav(.*)$ /remote.php/caldav$1 redirect;
    rewrite ^/carddav(.*)$ /remote.php/carddav$1 redirect;
    rewrite ^/webdav(.*)$ /remote.php/webdav$1 redirect;

    index index.php;

    location ~ \.php {
        fastcgi_index index.php;
        fastcgi_pass unix:/run/php-fpm/php56-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_param PATH_INFO $fastcgi_script_name;
        include fastcgi_params;
    }

    location = /robots.txt {
        allow all;
        log_not_found off;
        access_log off;
    }

    location ~ ^/(?:\.htaccess|data|config|db_structure\.xml|README) {
        deny all;
    }

    location / {
        # The following 2 rules are only needed with webfinger
        rewrite ^/.well-known/host-meta /public.php?service=host-meta last;
        rewrite ^/.well-known/host-meta.json /public.php?service=host-meta-json last;
        rewrite ^/.well-known/carddav /remote.php/carddav/ redirect;
        rewrite ^/.well-known/caldav /remote.php/caldav/ redirect;
        rewrite ^(/core/doc/[^\/]+/)$ $1/index.html;
        try_files $uri $uri/ =404;
    }

    # Adding the cache control header for js and css files
    # Make sure it is BELOW the location ~ \.php(?:$|/) { block
    location ~* \.(?:css|js)$ {
        add_header Cache-Control "public, max-age=7200";
        # Add headers to serve security related headers
        add_header Strict-Transport-Security "max-age=15768000; includeSubDomains; preload;";
        add_header X-Content-Type-Options nosniff;
        add_header X-Frame-Options "SAMEORIGIN";
        add_header X-XSS-Protection "1; mode=block";
        add_header X-Robots-Tag none;
        # Optional: Don't log access to assets
        access_log off;
    }

    # Optional: Don't log access to other assets
    location ~* \.(?:jpg|jpeg|gif|bmp|ico|png|swf)$ {
        access_log off;
    }

    # ownCloud security tip
    add_header Strict-Transport-Security "max-age=15768000; includeSubdomains; ";
}
Packages
Do not install WebStation! It pulls in Apache. I don't want it hanging around. Likewise skip phpMyAdmin because it pulls in WebStation.
- Synology Directory Service
- Synology VPN
Enable Synocommunity, https://synocommunity.com/
for owncloud, install
- redis -- http://www.iholken.com/index.php/2016/03/16/install-redis-server-and-phpredis-extension-into-synology-nas-running-dsm-6-without-bootstrapping/
- MariaDB
- debian chroot
I download from owncloud.org because the version in packages is outdated.
wget https://download.owncloud.org/community/owncloud-9.1.2.tar.bz2
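Then unpack it into the web tree. The target below matches the nginx root used later (owncloud ends up in /volume1/web/trailpeople/owncloud); double check before unpacking over an existing install:
tar -xjf owncloud-9.1.2.tar.bz2 -C /volume1/web/trailpeople/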
Debian packages
sudo -s
sudo /var/packages/chroot/scripts/start_stop_status chroot
apt-get update
apt-get install locales
dpkg-reconfigure locales
dpkg-reconfigure tzdata
apt-get install php5-dev
apt-get install php5-redis
More packages: ipkg
sudo -s
curl http://ipkg.nslu2-linux.org/feeds/optware/syno-i686/cross/stable/syno-i686-bootstrap_1.2-7_i686.xsh > bootstrap.xsh; bash bootstrap.xsh
ipkg update
ipkg install emacs22 less file git
YAY EMACS FINALLY YAY YAY. This version of "file" will tell you the file information and THEN segfault. You get the information you need followed by an error message. I can live with this.
Docker
There is an official Synology package, just not for the DS416play.
See https://forum.synology.com/enu/viewtopic.php?t=120653, especially the note that says to go to the Synology site, download the Docker package from the DS415+ page, and then use the DSM Package Center manual install option.
Once it's installed it has a GUI under that "building blocks" icon in the upper left of the web page.
My immediate goal is to try to run owncloud 9.1 + redis + php7 + nginx.
The Docker GUI in Synology has a "registry" page and I tried selecting php + Download and got "Failed to query registry". Fine -- so the GUI does not work. Fine, fine. Next I tried this instead, from the ssh command line.
docker pull skiychan/nginx-php7
and it appears to download flawlessly. See https://hub.docker.com/r/skiychan/nginx-php7/ for more information on this container. I could click "launch" in the GUI but then I could not apply any command line settings. I think the way to do this is to create a Dockerfile, put it on the Synology, and use the "import" button in the GUI. The suggested command is:
docker run --name nginx -p 8080:80 -d skiychan/nginx-php7
Once it's launched from the command line, I can see how much CPU and RAM it is using in the GUI, which is pretty neat. I can stop it there too. I can bring up a page just for this container and do things like see what is running and connect to a terminal. All pretty cool.
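The same information is available from the ssh command line; these are stock Docker commands, nothing Synology-specific:
docker ps                        # list running containers
docker stats nginx               # live CPU/RAM usage for the nginx container
docker logs nginx                # everything it has printed so far
docker exec -it nginx /bin/bash  # shell inside the container (use /bin/sh if the image has no bash)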
PhpMyAdmin
Installed from tar ball to avoid dependencies on Apache.
Owncloud 9
Server set up
I put it in a subdirectory to simplify DNS. Everything runs under SSL at https://diskstation.trailpeople.net/. So owncloud is at https://diskstation.trailpeople.net/owncloud/.
From outside (in the firewall whitelist) right now it also is visible as https://bellman.wildsong.biz/owncloud/
I loosely followed some instructions I found here; it was a starting point anyway. He uses Apache and I use nginx. http://www.iholken.com/index.php/2016/03/15/guide-for-installing-owncloud-9-to-synology-nas-running-dsm-6/
Optimizations, fixed because owncloud told me to:
- Add /dev/urandom to open_basedir in /usr/local/etc/php56/conf.d/user-settings.ini
- Add "always_populate_raw_post_data = -1"
- Send a HUP to php-fpm
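Sending the HUP is just a matter of signalling the FPM master process. The process name on DSM may be php-fpm or something like php56-fpm, so check with ps first; this is a sketch, not gospel:
ps aux | grep "php.*fpm"
killall -HUP php-fpm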
cat /usr/local/etc/php56/fpm.d/env.conf
; bwilson added this for owncloud
;env[HOSTNAME] = $HOSTNAME
env[PATH] = /usr/local/bin:/usr/bin:/bin
;env[TMP] = /tmp
;env[TMPDIR] = /tmp
;env[TEMP] = /tmp
Crontab
Change the shell on http user from /bin/false to /bin/sh and add this to /etc/crontab:
0,15,30,45 * * * * root su -c "/usr/local/bin/php56 -f /volume1/web/trailpeople/owncloud/cron.php" http
There are odd, specific rules to add things to /etc/crontab, see http://jimmybonney.com/articles/manage_crontab_synology/
User authentication
LDAP Directory Service: Synology OpenLDAP
diskstation is set up as the Provider.
waldo is set up as the Consumer.
For initial set up
Clients:
- Owncloud
- Mediawiki
- Synology DSM
- Samba
- AppleTalk
Synology has a pretty good UI in DSM for LDAP, so I enabled their Directory Service package, then set up owncloud to use it. When owncloud is using LDAP, you create the account in LDAP, and the first time the user logs in to owncloud the account is created there.
After you enable the LDAP app, configure it to use "localhost" as the server; it should detect port 389. Leave the DN and password BLANK. Set the Base DN manually to 'dc=trailpeople,dc=net'. If you put a user/password in there and then change the password later, owncloud will suddenly stop working.
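A quick way to confirm that anonymous bind and that Base DN actually work before pointing owncloud at them (ldapsearch is not on the Syno by default; run it from the Debian chroot or any Linux box on the LAN, swapping localhost for the server's name; the objectClass filter is a guess at what Directory Server uses):
# Should list the LDAP accounts without asking for a password
ldapsearch -x -H ldap://localhost -b 'dc=trailpeople,dc=net' '(objectClass=inetOrgPerson)' uid cn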
Owncloud config.php
<?php
$CONFIG = array (
  'instanceid' => 'ocarb6oq5tsb',
  'passwordsalt' => 'WOO1qwVT6iOCp6ycWp4lZ8GlNVv9y4',
  'secret' => 'FtvmpxpedQGTqwrxy7u+b8Ye5HMgXUmXzBlSlxROfogExbs8',
  'trusted_domains' =>
  array (
    0 => 'diskstation',
    1 => 'diskstation.trailpeople.net',
    2 => '192.168.1.5',
  ),
  'datadirectory' => '/volume1/web/trailpeople/owncloud/data',
  'overwrite.cli.url' => 'https://diskstation.trailpeople.net/owncloud',
  'dbtype' => 'mysql',
  'version' => '9.1.2.5',
  'dbname' => 'owncloud',
  'dbhost' => 'localhost',
  'dbtableprefix' => 'oc_',
  'dbuser' => 'owncloud',
  'dbpassword' => 'XXXXXXXX',
  'logtimezone' => 'UTC',
  'installed' => true,
  'memcache.local' => '\\OC\\Memcache\\Redis',
  'redis' =>
  array (
    'host' => 'localhost',
    'port' => 6379,
  ),
  'ldapIgnoreNamingRules' => false,
  'mail_from_address' => 'owncloud',
  'mail_smtpmode' => 'smtp',
  'mail_domain' => 'trailpeople.net',
  'mail_smtphost' => 'smtp.gmail.com',
  'mail_smtpport' => '587',
  'loglevel' => 2,
  'mail_smtpsecure' => 'tls',
  'mail_smtpauthtype' => 'LOGIN',
  'mail_smtpauth' => 1,
  'mail_smtpname' => '[email protected]',
  'mail_smtppassword' => 'XXXXXXXX',
);
Mediawiki
I installed it at /volume1/web/trailpeople/wiki by downloading and unpacking the tar ball from mediawiki.org to avoid the Apache package dependencies (and the outdated version) from Synology.
So it's accessible as https://diskstation.trailpeople.net/wiki/.
It keeps its data in MySQL; the username and database are both "mediawiki".
LDAP authentication
After installing the LDAP plugin I had to fix up the database:
cd wiki/maintenance
/usr/local/bin/php56 update.php