National Business Incubation Association
 

Measuring Impact

This incubator at least annually collects quantifiable data and information to ensure the incubation program is meeting its mission.

This incubator at least annually collects impact data (revenue, employment, investment, etc.) from its current clients.

This incubator collects impact data (revenue, employment, investment, etc.) from graduates on an annual basis for a minimum of five years.

 

This incubator at least annually collects quantifiable data and information to ensure the incubation program is meeting its mission.

Do you know the magnitude of the impact your incubation program has on the local economy? Not just in terms of clients served, but the real meat of the matter: jobs created, salaries paid, revenues earned, and other economic gains?

If you don’t have these figures handy, you might be missing opportunities to convince potential funders, champions, and the general public of your program’s importance.

Additionally, if you aren’t tracking this information, you might be unwittingly feeding doubts about the industry’s effectiveness.

Those of you who know your program’s impact understand the many good reasons for tracking outcomes. First, and of major importance to most incubator managers, impact data is essential ammunition for fundraising. Second, this data is critical for proving your program’s contribution to the local economy. Finally, individual programs’ impacts help build industry credibility. Unfortunately, many business incubators do not track outcome data about their client and graduate firms, and the problem is not unique to the incubation industry: a 2004 survey by the International City/County Management Association found that 66 percent of 726 responding local governments did not use performance measures to assess the effectiveness of their economic development efforts. (The same survey found that 38 percent of responding local governments operated business incubation programs.)

Previous research conducted by NBIA provided persuasive evidence for the impact of incubation on companies and local economies. NBIA’s Impact of Incubator Investments Study – the results of which were published in 1997 – found that among incubation programs responding, 87 percent of graduate firms were still in business. However, that same research also uncovered a wide variance in evaluation capacities and efforts of individual programs. That variance still exists. Some programs conduct fairly sophisticated process and outcome data collection; others use rudimentary systems. Many use nothing at all. One thing is certain: if the incubation industry is to effectively demonstrate the value of its services, individual programs must come to terms with the practice of program evaluation.

Excerpted from Erlewine, Meredith, Measuring Your Business Incubator’s Economic Impact – A Toolkit, NBIA Publications, 2007. This 26-page toolkit covers all aspects of tracking impacts (why to do it, what to collect, how to collect it, and how to analyze and report the results) and is available free to both members and nonmembers. We developed this product with support from Southern California Edison and offer it at no cost because we believe collecting and reporting impact data is essential to best-practices business incubation, yet it is seldom done well, if done at all. To read the document in its entirety, see the link at the bottom of NBIA’s home page, or go to www.nbia.org/impact. Use this valuable free toolkit today.

Overall, two-thirds of top-performing incubators (66.7%) collect outcome data. More than half collect this information for two or more years, while over 40% collect data for five or more years. Collected data include client and graduate firm revenues and employment, firm graduation and survival rates, and information on the success of specific program activities and services.

Analysis provides sound empirical evidence that the lengths of time an incubation program has collected graduate firm outcome data, resident client employment data, and graduate firm sales data are all positively and significantly correlated with measures of client firm success. This finding could mean that programs with the capacity to collect data also have the resources to implement the management practices and services that lead to client firm success. It is equally plausible that outcome data demonstrating a positive return on investment assures funders that business incubation is a viable part of a sound economic development strategy and that continued investment will produce the anticipated outcomes. Of course, success breeds success: program stability enhances an incubator’s capacity to meet its stated goals. Notably, having a written policy requiring clients to provide outcome data is also positively correlated with success at a statistically significant level, which suggests that data-collection capacity is not the only way to ensure data collection; building the requirement into entry criteria can reduce the administrative burden as well.

Adapted from Lewis, David A., Elsie Harper-Anderson, and Lawrence A. Molnar, Incubating Success: Incubation Best Practices That Lead to Successful New Ventures, University of Michigan, 2011, pp. 7, 54.

Economic impact is not the only measure of an incubation program’s effectiveness; evaluating internal operations is vital as well. Questions to ask include:

  • Does the program conform to its mission?
  • Does the program have the right staff to meet clients’ needs?
  • Is the program operating within its budget?
  • Does the program have the right mix of board members?
  • Have staff become complacent, or do they constantly try to improve?
  • Has the program achieved its performance goals?
  • Are performance goals aligned so the program can meet clients’ and stakeholders’ expectations?
  • Where is the program strong? Where is it weak?

Answering these questions may require some outside information to benchmark the program effectively. In some cases, the incubator can compare itself to the entire incubation industry; NBIA’s State of the Business Incubation Industry reports give baseline data about overall industry averages for comparison. NBIA’s online benchmarking tool, of which this Resource Library is a part, can show how the program’s practices compare with those of others.

Other areas require information specific to a program’s particular type and mission, so that apples are compared to apples. For example, a technology incubator associated with a university might collect data on the number of university technologies successfully commercialized through new company formation. But this same measure would not be relevant to a technology incubator not affiliated with a university. NBIA may have some of that information, as may other associations, such as the Association of University Technology Managers or the State Science and Technology Institute.

Excerpted from Colbert, Corinne, Dinah Adkins, Chuck Wolfe and Karl LaPan, Best Practices in Action: Guidelines for Implementing First-Class Business Incubation Programs, Revised 2nd Edition, NBIA Publications, 2010, p. 56. Also see in this book discussions of program evaluation that offer guidance to incubator managers and developers, including “Collection of Statistics on Program Parameters,” “Meeting Industry Standards,” “Incubator Benchmarking Survey,” “Collecting and Reporting Economic Impacts,” and “Economic Impact Data Reports,” as well as articles and examples on evaluating incubator services, such as “Client Satisfaction Survey and Gap Analysis.” (Available from the NBIA Bookstore.)

One approach to incubator evaluations comes from the client’s perspective. An incubator manager must regularly gather feedback from clients about the usefulness and effectiveness of the incubator’s programs and services. Based on the feedback and keeping the incubator’s mission in mind, the manager then can eliminate or adjust ineffective services or add new programs that reflect client requests and changing needs.

Client surveys are a useful way to gather not only outcome data, but also client satisfaction data. Questions gauging client satisfaction might cover:

  • Training and mentor programs
  • Space and facility services
  • Networking and social events
  • Efforts to assist firms in obtaining financing
  • Service provider programs
  • Incubator staff performance

The idea is to evaluate clients’ use of services and whether those services are making a difference in their businesses.

Clients will appreciate the opportunity to rate incubator programs and services as well as answer open-ended questions. They will also appreciate surveys that are easy to understand and do not take an unreasonable amount of time to complete.
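
As a simple illustration of how such satisfaction ratings might be tabulated (a hypothetical sketch, not a tool from the excerpted source; the 1-to-5 scale and sample responses are assumptions), the following averages ratings by service area:

```python
# Hypothetical tally of client satisfaction ratings by service area.
# The categories mirror the list above; the 1 (poor) to 5 (excellent)
# scale and the sample responses are illustrative assumptions.

from statistics import mean

SERVICE_AREAS = [
    "Training and mentor programs",
    "Space and facility services",
    "Networking and social events",
    "Financing assistance",
    "Service provider programs",
    "Incubator staff performance",
]

# Each response maps the service areas a client used to a rating.
responses = [
    {"Training and mentor programs": 4, "Space and facility services": 5},
    {"Training and mentor programs": 3, "Incubator staff performance": 5},
    {"Networking and social events": 4, "Space and facility services": 4},
]

def average_ratings(responses):
    """Average each service area's ratings across all responses."""
    return {
        area: round(mean(r[area] for r in responses if area in r), 2)
        for area in SERVICE_AREAS
        if any(area in r for r in responses)
    }

print(average_ratings(responses))
# {'Training and mentor programs': 3.5, 'Space and facility services': 4.5,
#  'Networking and social events': 4, 'Incubator staff performance': 5}
```

A summary like this makes it easy to spot which services clients value and which may need adjustment or elimination, in line with the evaluation approach described above.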

Another effective way to gather input is to organize focus groups. Used in combination with surveys, focus groups offer additional insights and can bring out more spontaneous reactions from participants. Some managers ask graduate companies to participate in focus group sessions because graduates often have a greater appreciation for what they gained from the incubation program as clients.

To gauge the satisfaction of its kitchen incubator clients, the Jubilee Business Incubator in Sneedville, Tennessee, recently conducted two surveys, according to Steve Hodges, executive director of the Jubilee Project/Jubilee Business Incubator. The first was a general client satisfaction survey that asked open-ended questions including:
  • Has this program been helpful to you? If so, how?
  • How do you think the incubation program could be improved?
  • Have any staff been particularly helpful or unhelpful? If so, how?

The second survey was specifically related to the use of the kitchen facility for food production.

Excerpted from Cammarata, Kathleen, “Evaluating Incubator Performance and Measuring Impact,” A Comprehensive Guide to Business Incubation, Completely Revised Second Edition, NBIA Publications, 2004, pp. 193-195. (Available from the NBIA Bookstore.)

For further information on measuring incubation program performance, see:

  • Cammarata, Kathleen, Self-Evaluation Workbook for Business Incubators, NBIA Publications, 2003. This spiral-bound workbook contains all the tools you need to conduct annual reviews of operations, including some 300 qualitative metrics and advice on why evaluating operational issues (mission, governance, staffing, selecting clients, serving them, facilities management, graduation and more) is important; it also contains space for you to track your program’s progress, time frames, responsibilities, and more. (Available from the NBIA Bookstore.)



This incubator at least annually collects impact data (revenue, employment, investment, etc.) from its current clients.

The case for collecting impact data, the supporting research, and NBIA’s free Measuring Your Business Incubator’s Economic Impact toolkit are discussed under the first practice statement above. The example below shows what annual client data collection looks like in practice.

Tim Strege, executive director of the William M. Factory Small Business Incubator in Tacoma, Washington, understands the value of collecting data that illustrate how the incubator is contributing to the local and state economies. The incubator compiles information on client revenues, average wages, jobs created, earnings, taxes, purchases of supplies, and other data annually.

William M. Factory focuses on specialty construction trades, business services, and applied technology companies (e.g., ICT, RFID). Located in the low-income area of east Tacoma, the incubator targets businesses that provide high-quality jobs for minority and low-income people, though its client firms span a wide range of races and ethnicities.

Two connected incubator buildings provide 42,000 square feet of client, administrative, and shared space.

Data compiled by the incubator for 2009 represent twenty-seven specialty construction trade companies, seven information technology companies, and one business service company that were clients of the incubator in that year. (The incubator opened a second, 20,000-square-foot facility in late 2009; at capacity, the incubator will house fifty to sixty firms.)

Strege provided the following data in his 2009 report:

  • Aggregate client commercial revenues: $35,227,010
  • Full-time employment by client companies: 343
  • Average hourly wages of client employees: $23.25
  • Total wages paid by client companies: $16,589,697
  • Total client company supply purchases: $9,612,113
  • Total client company overhead and reinvested profits: $2,726,161
  • Total business taxes: $6,299,039, comprising:
      • Federal employment taxes: $1,389,353
      • Federal business taxes: $544,355
      • State employment taxes: $490,360
      • State and local business taxes: $3,874,971
  • Employee Social Security and income taxes: $3,259,875
  • Total business and employee taxes: $9,558,914

The federal employment tax rate comprises the required FICA (Social Security and Medicare) contribution of 7.65 percent plus an estimated 0.85 percent of payroll for federal unemployment insurance, based upon statutory rates of 0.8 to 6.2 percent applied to the first $7,000 of wages, with a maximum credit of 5.4 percent for state unemployment insurance taxes paid. “This is about $400 per full-time equivalent employee,” Strege says.
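
Strege’s per-employee figure can be sanity-checked from the data above. The sketch below applies the stated rates to aggregate wages; treating total wages paid as the tax base (rather than per-employee wage caps) is our simplifying assumption, not the report’s method:

```python
# Back-of-the-envelope check of the 2009 federal employment tax figures,
# assuming the stated rates apply to aggregate wages paid (a simplification).

total_wages = 16_589_697  # total wages paid by client companies, 2009
fte = 343                 # full-time employment by client companies

fica = 0.0765 * total_wages          # FICA (Social Security and Medicare), 7.65%
unemployment = 0.0085 * total_wages  # estimated federal unemployment share, 0.85%

print(f"FICA contribution:    ${fica:,.0f}")                # ~$1,269,112
print(f"Unemployment share:   ${unemployment:,.0f}")        # ~$141,012
print(f"Unemployment per FTE: ${unemployment / fte:,.0f}")  # ~$411 -- 'about $400'
print(f"Combined:             ${fica + unemployment:,.0f}") # ~$1,410,124, close to
                                                            # the reported $1,389,353
```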

The employee tax rate assumes a married person with four allowances (either dependents or other income tax deductions) and is based upon published IRS withholding tables. State and local taxes include business and occupation taxes and combined sales tax (collected by the state from companies and distributed on a formula basis to government entities).

“All calculations are direct economic impacts,” Strege emphasizes, noting that he also uses Minnesota IMPLAN (Impact Analysis for Planning) economic modeling software multipliers for Washington to calculate indirect and induced economic benefits.

The data are collected from each company quarterly and aggregated to obtain annual totals. The incubator sends the figures to all of its funding agencies that want employment, revenue, and wage income data.

“We provide information on tax ratios to illustrate the cost-benefit ratio. For example, about $4.4 million of state and local taxes were produced by incubator client companies,” Strege says. “The incubator received about $216,000 of government operating support in 2009, but generated $20 in state and local taxes to offset each $1 of operating support.”
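
That cost-benefit ratio can be reproduced directly from the figures above. In the sketch below, pairing the two state-level line items to get “about $4.4 million” is our reading of the report:

```python
# Reproducing the cost-benefit ratio Strege cites from the 2009 figures above.

state_employment_taxes = 490_360
state_local_business_taxes = 3_874_971
state_local_total = state_employment_taxes + state_local_business_taxes

operating_support = 216_000  # approximate government operating support, 2009

print(f"State and local taxes: ${state_local_total:,}")  # $4,365,331 -- 'about $4.4 million'
print(f"Per $1 of support:     ${state_local_total / operating_support:.0f}")  # ~$20
```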

Tracking such data is important, but reporting them is even more so. When Strege presented the 2009 report to the Tacoma Community Redevelopment Authority, “Members were impressed that the incubator helps companies grow favorable wage employment,” Strege says.

Strege provides a concrete example of client success: “One soon-to-be graduate company, Advanced Government Services, which specializes in traffic control at construction sites, has invested over $1 million in equipment purchases and for property acquisition and renovation of its new corporate headquarters.” The woman-owned company grew from a two-person start-up to forty-two employees during its time in the incubator.

Excerpted from Colbert, Corinne, Dinah Adkins, Chuck Wolfe and Karl LaPan, Best Practices in Action: Guidelines for Implementing First-Class Business Incubation Programs, Revised 2nd Edition, NBIA Publications, 2010, pp. 58-59. (Available from the NBIA Bookstore.)




This incubator collects impact data (revenue, employment, investment, etc.) from graduates on an annual basis for a minimum of five years.

When collecting statistics, you must include incubator graduates. Most stakeholders want to know not just the number of companies the program has nurtured, but also how those graduates perform in the long run. NBIA recommends that all incubation programs track their graduates for at least five years to demonstrate long-term economic impact. (To ensure that graduate companies will provide this information, make it an admission requirement and clearly outline its purpose in the client contract or rental or service agreement.)

Excerpted from Colbert, Corinne, Dinah Adkins, Chuck Wolfe and Karl LaPan, Best Practices in Action: Guidelines for Implementing First-Class Business Incubation Programs, Revised 2nd Edition, NBIA Publications, 2010, p. 58. (Available from the NBIA Bookstore.)
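
To make the tracking requirement concrete, here is a minimal sketch of a graduate outcome record with a five-year check; the field names and structure are illustrative assumptions, not an NBIA-defined schema:

```python
# Minimal sketch of graduate outcome tracking, following the recommendation
# to collect annual data for at least five years. Field names are assumptions.

from dataclasses import dataclass, field

@dataclass
class GraduateOutcome:
    year: int          # calendar year of the observation
    revenue: float     # annual revenue reported by the graduate
    employees: int     # full-time employment
    investment: float  # outside investment raised during the year

@dataclass
class Graduate:
    name: str
    graduation_year: int
    outcomes: list[GraduateOutcome] = field(default_factory=list)

    def years_tracked(self) -> int:
        """Number of distinct years for which outcome data exist."""
        return len({o.year for o in self.outcomes})

    def meets_five_year_minimum(self) -> bool:
        """True once annual data have been collected for five years."""
        return self.years_tracked() >= 5
```

Flagging graduates whose records fall short of the five-year minimum gives staff a simple annual follow-up list.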

As noted under the first practice statement above, Lewis et al. (2011) found that the length of time a program collects graduate firm outcome data is positively and significantly correlated with measures of client firm success.

Some incubator managers don’t track graduate data because they believe they can’t take credit for the accomplishments (or mistakes) of firms after they’re out on their own. How is an incubator manager to know whether the decisions that led to a company’s ultimate success or failure owe anything to its former association with the incubator? How can an incubator say it had a hand in creating jobs that come into existence after a company leaves the program?

Skip Farrar, business development manager at Southern California Edison, a public utility in California, says it’s important for incubators to continue to track and report this data. “It’s a but/for argument,” he says. “But for the incubator, this business might never have existed, and the persistency wouldn’t be there.” In Farrar’s opinion, the role the incubator plays in getting a business off the ground is extremely important. “Therefore, the incubator gets to claim results over time,” he says.

But what about those who say the business might have started on its own? Or that its success might have nothing to do with services provided by the incubator long ago? “Think about it this way,” Farrar says. “But for the contributions of Steve Wozniak, Apple Computer never would have existed. So does he get to keep claiming responsibility for Apple Computer’s success over time? Absolutely.” Farrar notes that this line of thinking is prevalent in the academic realm as well, with universities taking credit for the wage-earning capabilities of graduates.

For the broader case for tracking outcomes, and for NBIA’s free toolkit on measuring economic impact, see the first practice statement above.
