Comments

  • When starting MSP360 Connect Freeware, it shows Disconnected and will not Reconnect
    There was a DNS issue on one of our servers that the hosting provider is correcting. From what I was told by engineering, the problem has already been addressed by the hosting provider, but the DNS entries still need to propagate across the internet. That should be complete by tomorrow. We apologize for the issue with Connect.
  • User Connection Question
    I think you're asking about something like Remote Desktop Services / Terminal Services from Microsoft, where you could have many users connecting under separate accounts running Windows virtually. That's not what the Connect solution is. Connect is for connecting to a remote system securely for help desk or administration. While you can have more than one administrator connect to the same session remotely for training, or when an administrator needs help from another administrator to solve a remote issue, I don't get the sense that's what you need. Connect is not designed to run multiple sessions at the same time; that would require something like Windows Remote Desktop Services.
  • Minor annoyance, or something more?
    If you're absolutely positive that those are the same endpoints, then I would ask that you open a case with the support team. I can't find that issue referenced in the system anywhere, and if it really is a problem beyond that one endpoint, it needs to be addressed. There are going to be some maintenance updates on the servers in a couple of days (Thursday, September 22, 4:00 AM - 7:00 AM PT), so you could wait until after that; possibly that issue is being addressed. But please double-check to make sure that they are the same endpoint.
  • Viewing Folder Size on MSP360 Explorer for Mac
    Something strange for you to test: can you try pressing the SPACEBAR on the keyboard while the folder is selected and tell me what happens? Might be nothing...
  • Viewing Folder Size on MSP360 Explorer for Mac
    Did you mean to write above that it's easy to see the size in Explorer for Windows, but you can't see that in Mac Explorer? If so, I'm guessing it's just a feature limitation.
  • Minor annoyance, or something more?
    You can post screenshots here. You'll need to upload them using the upload option in the toolbar above the text entry.
  • Testing in my PC
    Yes, that's the current release. Assuming there's not a regression issue in the release, is it possible you do not have the correct password for the user registered in the agent? If you want to fix this from the management console, follow these steps:

    • In Computers - Remote Management, find the endpoint in question
    • Click on the gear icon on the right and select Edit - Edit Account
    • Select the correct user account
  • Testing in my PC
    Moved this post to the Managed Backup section.

    There was an authentication problem in a recent release that some customers were running into; it has since been addressed. My recommendation is that you test the latest release, which should not have those issues.

    Having said that, it's much easier to assign the account from the management console, as it does not require you to enter a password since you're already authenticated into the management console.
  • Legacy and new format backup
    Yes. That's how you would seed as if you were using something like S3 Snowball. Again, I would do a simple test: a few files in an archive saved locally, copied to the cloud, and synchronized.
  • Request for backup plan
    I think you can try what I originally posted above in my initial reply to satisfy most of the requirements in your original post. That is, combine "Keep Backup For" retention with GFS retention for the long term. If you're performing monthly Full backups, then you cannot use the Weeks GFS option, as it requires at least a weekly full backup.

    If you need more detailed help, I'd recommend reaching out to Support as they may be able to better address this question.
  • Legacy and new format backup
    Lifecycle Policies, as I recall, work on the object creation date in the cloud and not the dates on the original files. So any lifecycle policy on a new-backup-format archive would move the archive according to the date on the archive. You'd effectively be seeding the backups in the cloud, as some customers do on initial backups when the source data is large and the upstream bandwidth is inadequate.
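
    As a minimal sketch of that behavior (assuming an AWS S3 bucket and the boto3 SDK; the bucket name, prefix, and 30-day window are placeholders), a lifecycle rule like the following counts days from each object's creation date in the bucket, not from any timestamp on the source files:

      import boto3

      s3 = boto3.client("s3")

      # Transition archives 30 days after they were CREATED in the bucket;
      # the dates on the original source files play no part here.
      s3.put_bucket_lifecycle_configuration(
          Bucket="my-backup-bucket",  # hypothetical bucket name
          LifecycleConfiguration={
              "Rules": [{
                  "ID": "archives-to-glacier",
                  "Status": "Enabled",
                  "Filter": {"Prefix": "archives/"},
                  "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
              }]
          },
      )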
  • Request for backup plan
    I don't think that would satisfy what you wrote in the original post, which is keeping the full plus differential backups from the most recent six months for one year.
  • Request for backup plan
    I didn't split the plan. My description was for one plan. When using GFS, if you want to keep weekly backups, then you need to schedule a full at least once a week. If you don't do that, there's no full backup to keep on the GFS side. It will let you continue, but those weeklies will not be kept according to the settings.
  • Request for backup plan
    Since GFS settings are only for Full backups, I think what you'll have to do is set the Keep For retention option to 6 Months. This will keep all Full+Incremental backups for 6 months, satisfying (I think) the first part of your request above.

    Then use the GFS Settings and keep 12 Monthly backups to satisfy the next 6 months of retention, but at the monthly level, and set GFS Yearly backups to 2 with a start month of December. That last part will not trigger GFS for the Yearly until December, and it will then keep the following December as well.

    There is no way to schedule a full to run on the last day of a month. You could set it to the last Saturday or Sunday if you wanted, or use any of the other scheduling options for the Full. Schedule Incremental backups to run as needed - once a week, daily, etc.

    But I would also consider reaching out to Support for guidance on best settings. What I wrote above is just what I think based on my understanding of the product and what you wrote above.
  • Legacy and new format backup
    Object count will normally be much, much lower. We use a dynamic algorithm to group files into archives, based partly on file processing speed and time, with a maximum archive size. I do not recall offhand what the max size is, but it's at least 1 GB. So if you're backing up many small objects, they should fit nicely into the new archives. This will also reduce API calls, since there will be far fewer objects to upload and move around through lifecycle policies (and restores), and you're not going to be below the minimum object size for S3-IA and Glacier, which can add greatly to storage costs. You can always run a local test off-hours on a subset of data to see what the archives look like.
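
    As a rough illustration only - this is not MSP360's actual algorithm, and the 1 GB cap is just the recollection above - the grouping idea amounts to packing files into archives until a size cap is reached:

      import os

      MAX_ARCHIVE_BYTES = 1 * 1024**3  # assumed ~1 GB cap

      def plan_archives(paths):
          """Greedily pack file paths into archive groups under a size cap."""
          archives, current, current_size = [], [], 0
          for path in paths:
              size = os.path.getsize(path)
              if current and current_size + size > MAX_ARCHIVE_BYTES:
                  archives.append(current)  # close the full archive group
                  current, current_size = [], 0
              current.append(path)
              current_size += size
          if current:
              archives.append(current)
          return archives

      # Thousands of small files collapse into a handful of archives,
      # which is why object counts and API calls drop so sharply.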
  • Can Frame.io be added to the list to connect to?
    If their cloud is S3-compatible, then it's possible with our S3-compatible connector. But you would have to check with them to see if that's how their cloud is designed. If not, you may be able to use one of the many integrations that are available to connect it to Amazon S3 storage. https://zapier.com/apps/frameio/integrations
  • Not connecting to Glacier Vault, needs S3 bucket that does not exist
    You do not create a bucket for a storage class. You only set objects to that storage class when they are created, and you can transition the objects automatically to less expensive classes using a Lifecycle Policy, if desired. You back up to S3 Glacier using the Storage Class option in the backup wizard, as mentioned above.
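
    In S3 terms, the storage class is simply a property of each uploaded object. Here is a minimal boto3 sketch (bucket and key names are hypothetical; the backup agent does the equivalent for you when you pick a Storage Class in the wizard):

      import boto3

      s3 = boto3.client("s3")

      # The bucket is an ordinary S3 bucket; the object lands in Glacier
      # because the storage class is set at creation time.
      with open("backup-001.dat", "rb") as body:
          s3.put_object(
              Bucket="my-backup-bucket",      # hypothetical bucket name
              Key="archives/backup-001.dat",  # hypothetical key
              Body=body,
              StorageClass="GLACIER",
          )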
  • Legacy and new format backup
    Synthetic Full backups are supported on Backblaze B2 with the new backup format for all backup types. https://mspbackups.com/AP/Help/backup/about/backup-format/synthetic-full-backup
  • Immutable Backups
    It's more difficult to protect local data (compared to cloud data) if there's malware running on the network. If the data is exposed as a network share, then there's enough access to some of the backup data to put it at risk.

    If you have a data center at your MSP that is going underutilized, then you could look at using Minio. Minio exposes local disk as an S3-compatible cloud and is accessed through the S3 APIs (as opposed to CIFS), which means any access has to go through those APIs. You can run it on Linux or Windows.
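
    For example, any standard S3 client can talk to a local Minio deployment just by pointing at its endpoint. The port and credentials below are Minio's out-of-the-box defaults and are assumptions - change them in any real deployment:

      import boto3

      # Point the standard S3 SDK at a local Minio server instead of AWS.
      s3 = boto3.client(
          "s3",
          endpoint_url="http://localhost:9000",  # Minio's default port
          aws_access_key_id="minioadmin",        # default credentials -
          aws_secret_access_key="minioadmin",    # change these in practice
      )
      s3.create_bucket(Bucket="backups")
      print([b["Name"] for b in s3.list_buckets()["Buckets"]])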

    If you lock down the Managed Backup agents (recommended) by unchecking Enable Backup Agent and Allow Data Deletion in Backup Agent under Settings - Global Agent Options, you can help prevent someone or some malware from deleting backups. You could also uncheck Allow Edit of Backup Plans and Allow Edit of Restore Plans in Options to ensure no changes are made to plans. You can also make these changes at the Company level in the management console.

    You can also assign a Master Password to the agents (from Remote Deploy, or by editing an endpoint directly in Remote Management - Edit - Edit Options) if you need to keep the agents available, or so that a password is required should you temporarily enable an agent.

    Saving locally is fine, but we always recommend using the public cloud (or Minio at the MSP) as a secondary target for backups.

    Immutability is available with the new backup format and is tied to GFS retention settings. Dial in how many backups of each period (weekly, monthly, yearly) you need, and they will be locked down with Object Lock, provided that feature was enabled when the bucket was created and is enabled for the backup plan. Object Lock prevents deletion of the data before the GFS period expires.

    The key here is not to keep more backup sets than you need to satisfy your customers, so depending on the customer, you may need to adjust GFS settings accordingly. Obviously, the more backup sets you keep, the more storage is needed. But if your customer needs monthly backups for 12 months and yearly backups for 3 years, then that's what they need, and you can have that conversation up front to ensure there are no surprises on storage costs as time goes by and storage grows.
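
    On the bucket side, the prerequisite is that Object Lock is turned on when the bucket is created; it cannot be added to an existing bucket later. A minimal boto3 sketch, with a hypothetical bucket name, region, and retention window (in practice the backup platform manages the lock periods from the GFS settings):

      import boto3

      s3 = boto3.client("s3")

      # Object Lock must be enabled at bucket creation time.
      s3.create_bucket(
          Bucket="immutable-backups",  # hypothetical name
          ObjectLockEnabledForBucket=True,
          CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
      )

      # Optional bucket-level default; COMPLIANCE mode means no one,
      # including the root account, can delete locked objects early.
      s3.put_object_lock_configuration(
          Bucket="immutable-backups",
          ObjectLockConfiguration={
              "ObjectLockEnabled": "Enabled",
              "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
          },
      )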
  • Not connecting to Glacier Vault, needs S3 bucket that does not exist
    We support S3 Glacier; legacy Glacier is no longer officially supported. From my understanding, the only real difference is that vendors like MSP360 can use the S3 API, rather than the legacy Glacier API, to access data in Glacier, and you can access S3 Glacier the same way as other S3 storage classes. To target S3 Glacier for a backup, select the desired Storage Class on the Compression & Encryption Options page in the backup wizard. You can currently select Glacier, Glacier Instant Retrieval, and Glacier Deep Archive as the three Glacier storage class options, in addition to other S3 storage classes. More information from Amazon here: https://aws.amazon.com/s3/storage-classes/glacier/ as well as some information from us here: https://kb.msp360.com/cloud-vendors/amazon-aws/s3-glacier-legacy-glacier-difference