360Deploy is a product that simplifies the tedious process of deploying changes from a development database to an existing production database. This can be done with the click of a button, eliminating human error and reducing the downtime traditionally associated with the process. All that is required is some simple configuration; then just click Deploy!
For a better understanding, consider the typical FileMaker development scenario of a production server and a development server. The production server continuously hosts solutions for clients. The development/staging file is the database where developers make alterations to enhance utility and better serve clients' needs. Overhauling the old file to reflect the new one typically involves shutting down production while administrators sift through tables, scripts, and fields to apply updates. We seek to bypass this arduous task by automating the entire series of events.
The entire process takes only as long as the data import, which depends on your database size; the databases are paused only during that time. This allows files to be updated without worrying about how often or when the changes occur. The setup consists of installing our plug-in in the FileMaker client running the deploy file, plus some simple script copying and pasting.
Why use 360Deploy?
Altering live databases has traditionally been a tedious, intrusive task, involving server downtime and technical labor at the expense of company resources. 360Deploy can do all of this in minutes by seamlessly automating the process. Our software creates a clone of the development database, transfers it to the production server, and then imports data from the production file into the clone so the data absorbs the new architecture. As a safeguard, a backup of the production file is also created on the machine before the imports occur, should clients wish to revert to the old structure.
- Ideal for frequent upgrades to databases of any size
- Minimizes file downtime, only pausing files temporarily during the data import
- Drastically reduces the complexity and time required to roll out new database architecture
- Includes import of scripts and calculations
- Backs up the production file in case changes need to be reverted
- 360Deploy requires FileMaker Pro or Advanced 16 to run the actual import process.
- Production databases must be hosted on FileMaker Server 14 or later
- For technical reasons, FileMaker Cloud is NOT compatible with 360Deploy
There are four types of licenses for 360Deploy. Pricing for each license type is available at http://360works.com/360Deploy by clicking the 'pricing' button.
- Demo License: A free license that can deploy to any number of servers. It has one important limitation: it can only be used to deploy databases named '360Deploy Demo Solution'
- Express Edition: This allows deployment of a single FileMaker solution (this can be a multi-file solution) to a single deployment server
- Enterprise License: This allows deployment of unlimited FileMaker solutions to a single deployment server. It is not legal to use this license for multiple different clients, for instance in a shared hosting server or vertical market hosted server.
- Solution Bundle License: This allows deployment of a single FileMaker solution (this can be a multi-file solution) to unlimited deployment servers. The deployed solution can have different names for different customers.
An Enterprise License is included with the 360Works Portfolio License (http://360works.com/portfolio) at no additional charge. Any current Portfolio License holders will be able to immediately start using 360Deploy.
An important note: 360Deploy captures the list of databases when you run your first deployment for a license (and after a license reset). This is assumed to be all the files in your solution. After that, you are allowed to deploy a subset of that list of files. If you have a large list of files in your solution and you don't want to deploy them all at once, reach out to email@example.com and we can hard-code the list of databases against your license, so you will be able to deploy a subset of that large list of databases without issue.
View the documentation here on how to get started: 360Deploy Instructions
Every time you run a deployment, 360Deploy puts a copy of the previous production database into a timestamped directory in the FileMaker Server documents directory. This way, if anything goes wrong (due to a problem with the deployment, a programming error in the new version, or any other reason), you can simply place this timestamped archive version back into the live database directory, effectively rolling back to both the structure and the data that existed prior to running the deployment. You can find the archive folder at:
Keep in mind that this archive directory is never automatically deleted, and can potentially consume a lot of disk space if your solution is large. It's a good idea to manually clean up this directory periodically.
Also keep in mind that if you are using containers with external storage, the container data is shared between the old and new database versions; no separate copy is placed into the archive folder. This speeds up the import and conserves disk space, but it also means that if you do need to roll back the database version, changes made to the external container data are not rolled back. Use FileMaker Server's built-in backup feature to guarantee a complete backup, including a separate copy of all container data.
Scheduling after-hour imports
The script "Upload Files and Start Migration ( $~configuration_id )" is server-side compatible and can be used in a server-side schedule, which is our recommended way of doing a scheduled deployment. To use it, we recommend creating a wrapper script that passes the id of the configuration to the "Upload Files and Start Migration" script.
This script makes use of the 360Deploy plugin, so you will need to install the plugin on the server if you intend to schedule deployments.
See the example script in the 360Deploy.fmp12 file called "PSOS : 360Store Deployment".
To get the id of your configuration, open the data viewer and evaluate the "Configuration::__id" field while you are on the main layout. This will be the value you pass in.
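The wrapper script described above can be sketched as the following FileMaker script steps. This is a minimal illustration, not the shipped example script; the configuration id shown is a placeholder, which you would replace with the value from Configuration::__id:

```
# Wrapper script: Scheduled Deployment (sketch)
# Assumes the 360Deploy plugin is installed on FileMaker Server.
Set Variable [ $~configuration_id ; Value: "YOUR-CONFIGURATION-ID" ]
Perform Script [ "Upload Files and Start Migration ( $~configuration_id )" ; Parameter: $~configuration_id ]
```

A server-side schedule pointed at this wrapper would then run the deployment at the time you choose, for example overnight.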
External Data Sources Are Different Between Dev and Prod
In a multi-file solution there will typically be External Data Sources. If your Dev environment has Dev External Data Sources named one way, and your Prod environment has Prod External Data Sources named a different way, this situation requires a little attention before it is 360Deploy-ready. If you did nothing before running a deployment, your Dev file would be deployed into the Prod environment but would still be looking for the Dev External Data Sources (which will either not be found or, perhaps worse, actually resolve correctly, leaving you using the wrong database).
The solution to this problem is to use Dynamic Data Sources. Dynamic Data Sources allow us to set an External Data Source to a Global Variable, then set that Global Variable during the Startup script. Whatever we set that variable to is what FileMaker will use to resolve the data source. View more documentation about Dynamic Data Sources here: Dynamic Data Sources
Dynamic Data Sources allow us to solve this problem of different data sources between Dev and Prod. First, change your data sources to Dynamic Data Sources. Use a Global Variable for each of your data sources. Then in your startup script, create an if/else branch that will determine whether we are running in the Prod or Dev environment, then set the Global Variable to the appropriate path for your data source. There are two common approaches here:
You can use a naming convention that allows you to programmatically determine the right external data source name.
So for example, let's say I name my main file "MySolution"; I would name the external data sources after it: "MySolution_Invoices" and "MySolution_Inventory". In my startup script, I can determine the proper external data source with: Set Variable [ $$invoicesExternalDataSource ; Get ( FileName ) & "_Invoices" ]
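Putting this naming-convention approach together, the relevant portion of a startup script might look like the following sketch (the file and variable names are illustrative, not required by 360Deploy):

```
# Startup script excerpt (sketch; names are illustrative)
# With a naming convention, the data source name derives from the
# current file name, so the same calculation resolves correctly in
# both the Dev and Prod environments.
Set Variable [ $$invoicesExternalDataSource ; Value: Get ( FileName ) & "_Invoices" ]
Set Variable [ $$inventoryExternalDataSource ; Value: Get ( FileName ) & "_Inventory" ]
```

Because the names are computed from Get ( FileName ), no if/else branch is needed for this variant.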
Alternatively, though a little less clear, you can take advantage of the fact that 360Deploy preserves the data from the Prod file. You could store the name of the external data source in a field in the Prod file and have your startup script reference that field when setting up the dynamic data source. This approach is more flexible, because different files could follow different naming conventions for their external data sources, but it is a little harder to follow.
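A sketch of this second approach follows; Settings::InvoicesDataSource is a hypothetical field you would add to your solution, not something 360Deploy provides:

```
# Startup script excerpt (sketch; Settings::InvoicesDataSource is hypothetical)
# Because 360Deploy preserves Prod data, a field in the Prod file can
# store the environment-specific data source name, and the startup
# script simply reads it.
Set Variable [ $$invoicesExternalDataSource ; Value: Settings::InvoicesDataSource ]
```

Each environment's file would hold its own value in that field, so the same startup script works in Dev and Prod without a naming convention.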
How are User Accounts handled during a deployment?
This is a funny question, because you would think the Accounts come from the Dev file like the rest of the schema (layouts, scripts, tables, etc.). However, Accounts actually come from the Prod file during a deployment and are imported into the Dev file. This is really the preferred behavior: users have been logging into the Prod file, and we still want their accounts to work. So if you have a new user, add their account to the Prod file; the next time you run a deployment, that account will be merged into the Dev file before it is hosted. All users who could log into the Prod file before a deployment will still be able to log into the Prod file after the deployment.
Deleting / adding / renaming fields
Renaming or adding fields in the development database will work fine during the import. If you add fields in your development copy, be aware that these fields will be empty after you migrate the changes to the production server, since there is no data on the production server to populate them. Deleting fields also works as expected; the fields will be removed from the production copy.
360Deploy uses the FMDataMigration tool, which matches up tables by id and name first; if there is no match, then by name alone, then by id alone. Renaming tables will work fine as long as a new table is not created using the renamed table's previous name.
If there is an error with a deployment, a record will be created in the @AppLog table. You will need to open this layout manually and scroll to the last record; look in the "errorDescription" field for more information about the most recent error.