Clean data is the starting point for making solid, research-driven decisions for your association. In this roundtable, Russ Webb of the Bay Area Apartment Association talks with us about the importance of good data and how to make sure your data stays clean.

(Written recap below)

So what do you mean by "clean data"?

Clean data is data that is regularly updated and processed to remove inaccuracies and incomplete information.

The saying "garbage in, garbage out" applies directly to data: if you start with bad information, expect the reports you run and the communication lists you use to be bad as well.

How do you prioritize data clean-up?

Start with the data you generate revenue from, such as:

  • Data you use to process/run your annual dues: custom fields, billing addresses and emails, etc.

  • Data used to sell products and events

Once you know the types of data you'll be cleaning up, Russ recommends segmenting it even further by member type or status so you can regularly do data clean-up in bite-size pieces.

How often do you do clean-up?

This may vary greatly by association or chamber, but setting a little time aside weekly will help make sure this doesn't become overwhelming. More on this below.

Tools You Can Use in Novi to Keep Your Data Clean

DataMaid
One of the most robust tools, DataMaid scans and analyzes your QuickBooks Online data for duplicates, incorrect formatting, etc.

DataMaid is free to all Novi customers and can be run at any time. Here are the step-by-step instructions. You may want to consider running this quarterly.

AE Tip: Because this is such a robust scan on your data, it can take several minutes to run. Consider running this before heading to lunch!

Address Clean-up Tool
Found under Settings > Tools, the address clean-up tool does exactly what its name suggests. Russ recommends using the "filter" option to divide up the work. For example, if you are about to send out dues, filter down to the records missing a billing address.
>> Learn more!

QuickBooks Analyzer
The QuickBooks Analyzer tool runs right within Novi and scans your data for things like duplicates and ambiguous records. Because this tool is a filtered-down version of DataMaid, it is quicker to run, and you may want to consider running it every week or two.
>> Learn more!

Custom Report
In Novi, you can build a variety of reports to help you identify missing or incomplete data. A few reports mentioned in the call include:

  • Members Who Haven't Logged-In Recently

  • Individuals Missing a Parent Company (for trade/hybrid associations)

  • Companies missing a Primary and/or Billing Contact
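As a rough sketch of what the last report above checks, suppose you've exported company records to a spreadsheet (the column names here are hypothetical, not Novi's actual export format). A few lines of Python can flag companies missing a primary and/or billing contact:

```python
# Hypothetical exported company records; column names are illustrative only
rows = [
    {"Company": "Acme Apartments", "PrimaryContact": "Ann Lee", "BillingContact": ""},
    {"Company": "Bayview Rentals", "PrimaryContact": "", "BillingContact": ""},
    {"Company": "Cedar Homes", "PrimaryContact": "Cam Reyes", "BillingContact": "Cam Reyes"},
]

# Flag any company missing a primary and/or billing contact
missing = [r["Company"] for r in rows if not r["PrimaryContact"] or not r["BillingContact"]]
print(missing)  # ['Acme Apartments', 'Bayview Rentals']
```

In practice, Novi's custom reports give you this list without leaving the system; the snippet just makes the underlying check concrete.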

Member Engagement Report
Filter this report down to current members who haven't engaged, and you have a list of folks who may not be engaging with you because they aren't receiving your communications. Non-engaged members are also less likely to have updated their data recently.

Other Tools

LinkedIn
Did one of your connections just post an update that they've switched companies? Give them a quick shout to let them know you can update their company and email (thanks Meg, TAA).

Email
Email your primary/billing contacts before running your dues. Any "bounces" mean bad data (thanks Cheryl, MISBO).

Export your Novi data and perform a mail merge in Office 365 or Gmail, asking people to update their Member Compass or submit changes via a simple Google Form (Russ).

Avoiding Duplicate Data

While the above tools can help you clean up data already in your system, it's important to try to catch bad data before it makes its way into the system.

By processing your recent sign-ups queue and checking for duplicates and missing info before clearing it out, you will help ensure your new data is in top shape from the start.

AE Tip: When searching for a duplicate in the recent sign-ups queue, search by last name and email address instead of just the first name. This way, you will find people who may have registered with a shortened version of their name, such as Jenn vs. Jennifer (thanks Jake, AAMD).
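To see why that tip works, here is a small sketch (with made-up records) of matching on last name plus email rather than first name, so "Jenn" and "Jennifer" still match:

```python
# Existing member records and a new sign-up (all data is illustrative)
existing = [
    {"first": "Jennifer", "last": "Smith", "email": "JSmith@example.com"},
    {"first": "Bob", "last": "Jones", "email": "bob@example.com"},
]
signup = {"first": "Jenn", "last": "Smith", "email": "jsmith@example.com"}

def match_key(record):
    # Compare last name and email case-insensitively; first names are
    # deliberately ignored, since nicknames make them unreliable
    return (record["last"].strip().lower(), record["email"].strip().lower())

duplicates = [r for r in existing if match_key(r) == match_key(signup)]
print(len(duplicates))  # 1 -- Jenn matches the existing Jennifer Smith
```

A first-name search would have missed this match entirely, which is exactly the trap the tip warns about.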

>> How to Merge Duplicate Data
