The Public Lab Blog


stories from the Public Lab community




Interview: Chris Nidel on environmental evidence in court

by warren | over 2 years ago | 0 | 5

A few months ago, as our first interview for the Environmental Evidence Project blog series (#evidence-project), we caught up with Chris Nidel, an attorney with Nidel Law, PLLC, based in the DC area. Lead image: satellite images of waste at a Maryland Perdue chicken farm from a case Chris fought in 2012.


As Chris writes in his bio, he's been involved in environmental law for a long time, starting shortly after getting a Master's in Chemical Engineering at MIT:

After graduate school, I went to work for a major pharmaceutical company doing drug process development. After a few years, I became disillusioned with the approach that the pharmaceutical industry, including the company I was working for, took toward putting profits over saving lives.

...

The combined realization that this company seemed more concerned about profits than creating cost-effective lifesaving medicines and that they seemed to care less about the illness and death that they were responsible for in their own right, forced me to leave. I went directly to law school at the University of Virginia. After graduating, I relocated to Dallas, Texas so I could work on major environmental and toxic tort litigation with the late Fred Baron at his firm Baron and Budd.

In the years since, Chris has worked at different firms and eventually started his own, and has been part of cases involving "cancer clusters, unsafe landfills, and air and groundwater contamination", and "air pollutants from TVA's coal fired power plants in Tennessee, Kentucky, and Alabama."

With all he's seen and done in environmental law, we wanted to ask some general questions about how evidence is used in court, from a pragmatic, real-world perspective.

I get this question all the time, where somebody says, "how do I take samples that'll be admissible in court?" -- and unfortunately, there's no right answer. ... I've had defendants in cases challenge EPA's own historical sampling. So you can have EPA scientists take samples and send them to whatever lab they feel is the appropriate lab to do the analysis, with whatever certification they have, and then the big company still comes back and says, "well, we don't believe those sample results." So while there is no silver bullet, the key is the reliability and ultimate credibility of the results. If the sampling and analytical methods are defensible... the data should be admissible.

Chris is realistic about the shortcomings of the current system, and emphasizes that you can never be 100% sure that evidence will be admitted, or that it will be effective:

Any time there's a legal argument in a court, there's always a chance that you'll lose it. If somebody said to me, "I want to make sure I'm guaranteed that these samples -- or this evidence, or data -- is going to get admitted into court," I would be a fool to say, "sure, here's what you need to do to guarantee it," because there's no guarantee. If you file a case against Monsanto, and the judge happens to play golf with the vice president of Monsanto, you may not be able to get those samples in. These issues may be out of your control and are independent of who did the analysis, what qualifications they have, and what protocols they followed.

Data without advanced degrees

Chris has, however, worked on a case with Waterkeeper Alliance where samples were collected, stored, and managed by someone without a formal scientific degree. In this case, the person collecting samples was a member of the local Riverkeeper organization and an environmental advocate, but did not have extensive scientific training. Rather, she had a web-based certification for water sampling. According to Chris,

...the more important thing was that she actually followed a protocol. She got sterilized sample bottles from the lab, she wore sterilized gloves when she took the samples ... she filled out the chain of custody, she took it to the lab within the specified amount of time, she kept the cooler on ice, whatever was needed. And that was it. She took it to a lab, and they followed certified protocols. We didn't hire an engineer, or a chemist or a biologist to go out there and we didn't need to.

Due to this sampling protocol and detailed chain of custody, and since the laboratory followed an established analytical protocol, the samples were admitted as evidence in court without issue. In fact, the validity of the samples themselves wasn't even questioned in this case. Demonstrating the use of well-documented and rigorous methods can sometimes be sufficient for samples and data collected by non-accredited persons to be accepted as court-admissible evidence.
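As a rough illustration of why that chain of custody matters, here is a minimal sketch, in Python, of the kind of information such a record tracks. The field names are our own suggestions for illustration, not a legal form from the case:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CustodyTransfer:
    released_by: str
    received_by: str
    time: datetime

@dataclass
class SampleRecord:
    sample_id: str
    collected_by: str
    collected_at: datetime
    location: str
    container: str   # e.g. "sterilized bottle supplied by the lab"
    storage: str     # e.g. "kept on ice in a cooler"
    transfers: list = field(default_factory=list)

    def hand_off(self, released_by: str, received_by: str, time: datetime):
        # Log every hand-off so the chain of custody is unbroken end to end.
        self.transfers.append(CustodyTransfer(released_by, received_by, time))
```

Every undocumented gap in that transfer list is exactly the kind of opening a defendant can use to challenge a sample.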


Clearly, there are nuances to this, and even without a cookie-cutter approach, there are pathways for people without a scientific degree to produce knowledge in ways that can be legally recognized. But that's what can be so frustrating about this topic -- it's not a clear set of rules you can just follow, and even if you follow what rules there are to the letter, you're still not guaranteed a result.

Expert witnesses

The example above specifically deals with samples that were analyzed later in a lab. What about other pathways? Many, it turns out, involve some kind of expert witness to authenticate, or essentially vouch for, the evidence. The judge or jury, who may not have relevant formal backgrounds to evaluate the evidence, won't be, as Chris puts it, "sitting there trying to speculate as to why they should believe one over the other" -- they'll rely on experts for that sort of thing. But who chooses the experts?

The problem with those experts is that they are going to be paid by both sides. Clearly, they're going to have an opinion that's in your favor, and the other side is going to come in with an opinion in their favor, and... if you thought the DNC was bad, that's the sausage being made.

Not to mention -- what qualifies someone to be an expert in court? How much do expert witnesses tend to cost, and what happens if one side or the other can't afford an expert witness? We hope to circle back to this question in a later post.



Chris spoke at our June 6 OpenHour online panel on the topic of "Concepts of Proof" -- where he discussed his use of aerial photography in pollution cases. Watch the full video here.

Photographic evidence

Looking for a way through the thicket, we were really interested in photographic evidence, and how it might be different -- does it require expert authentication too? Chris said that it depends:

What conclusions or facts are you trying to establish? In our PCB case we used a lot of historic aerial photos... and EPA had an aerial photo lab assess the pictures -- here are barrels, etc, here are contours.

So interpretation by experts can enter into it. But what matters most is how a piece of evidence fits in with, and relates to, other supporting evidence. One key question appears to be whether understanding a given piece of evidence is common knowledge, or whether it takes specific training and experience to grasp its implications. Chris talked through a scenario we mentioned in #OpenHour about a photograph taken by a crane operator, who documented a plume of suspected pollution in a nearby waterway:

If you just put the guy in the crane on the stand, and you say, "You were up in the crane, you took these pictures. How did you take them? On what day? What were you looking at? Explain to us what direction you were facing when you took the pictures." [Because] the guy who sat in the crane who doesn't have any background, you probably don't get anywhere with it. You get the fact that there are pictures; you might even get them admitted. You might get the jury to be able to draw their own conclusions:

"I took the picture, I was on top of my crane, it's about 200 feet up, I was looking at the South Bay and this is what I saw."

The jury can look at that picture and they can understand that. Everything is good up to that point. Now, "I think this looks like a bunch of chromium was thrown into the bay. I'm just a crane operator, [but] my conclusion is that a bunch of chromium was going on in the bay that day."

That probably doesn't fly because neither the guy in the crane nor the jury could conclude that it's necessarily chromium. On the other hand, if you can get that in front of the jury and the judge says, "I don't know that Johnny in the crane can tell us it's chromium, but I think it's admissible," the jury could make their own deduction that there was chromium because you have other evidence. Let's say that you have evidence of what was in the barge from which the plume was coming. Now we have test data that shows that the barge was full of chromium, and now you have a photograph from Johnny in the crane that looks like a bunch of stuff is coming out of the barge, ergo, the jury decides chromium came from the barge.

You can also hire an expert, and then the question would be: does that expert have something that will help the jury that the jury doesn't have without the expert? The expert comes in and says, "I've seen lots and lots of chromium discharges from power plants and I know what chromium looks like when it gets discharged, and this is clearly a chromium discharge." Then you can get at it that way as well, and that's probably the typical way to do it, albeit the expensive way to do it. You hand off those pieces of evidence to someone else to draw the conclusion that you want to draw for the jury.

Given the complexity and uncertainty of the law as it plays out in court, we found it very helpful to hear about specific cases and examples. It seems that admissible and influential evidence is in the eye of, well, many beholders. While sometimes it may depend on internal, potentially invisible relationships between a judge and the parties in a case, other times it's about walking a jury through enough solid testimony for them to draw conclusions about the value of the evidence themselves.

There are a few things we have learned that can help strengthen our cases as potential curators of evidence. We can make sure we're following an existing protocol, as seen in the Riverkeeper example. We can also secure supporting evidence for our claim, as in the crane operator example, where supporting evidence could come from a lab or an expert witness.

Our hope is that in the posts to come, more of these pathways to success, and the limitations around evidence, will be teased out. In the meantime, let's continue these discussions here -- chime in with your questions and ideas. Thanks to Chris for making time to talk with us!


Questions

Questions we've addressed here:

  • How is environmental evidence admitted into court-based legal processes?
  • What can strengthen the case for evidence to be admitted in court?
  • Who can vouch for, or interpret, evidence in court, and how is it weighed?
  • Is photographic evidence treated differently from other environmental evidence in court?
  • ...

Questions we'd like to address in upcoming posts:

  • What collection, storage, and analysis protocols can strengthen environmental evidence in court?

We're moving these questions into the new Questions system -- so feel free to repost one here:

Ask a question about evidence, or help answer future questions on this topic.


  • evidence blog openhour with:gretchengehrke


    A message from the Public Lab staff

    by warren | almost 3 years ago | 0 | 4

    Public Lab resists and rejects: racism, sexism, ableism, ageism, homophobia, transphobia, xenophobia, body shaming, religion shaming, education bias, and bullying.

    We do not tolerate hatred towards women, people of color, or LGBTQ people, or hatred based on religious belief. We stand by our community and partners, and are committed to continuing our work on environmental and health issues affecting people.


  • blog


    Thoughts on Method 9 and its utility

    by gretchengehrke | almost 3 years ago | 3 | 5

    Several people in our Public Lab community are concerned about various kinds of airborne emissions, and what we as the public can do about them. One of the most accessible methods for assessing emissions is to estimate the opacity of emissions (I’ll explain a bit about this below) using EPA’s Method 9. Some community members have gone through Method 9 training and have found it very useful; others have found that it hasn’t been useful for their situations. I recently went through the training and became certified for Method 9, and I want to share some of the things I learned in that process, and my thoughts about the potential utility of Method 9 for various situations and concerns. Please comment and share your thoughts too!

    What is opacity and what is Method 9?

    Opacity is the extent to which light is blocked, or the extent to which you can’t see through an emissions plume. Opacity is caused by small particles and gases that absorb, reflect, or refract light. Particles that are similar in size to visible light wavelengths (390-700 nm) can scatter visible wavelengths effectively, muting light rather than preferentially reflecting a given color; carbon particles (like soot) effectively absorb light, largely contributing to plume opacity.

    Method 9 is a method to standardize the direction and distance from which you observe an emissions plume, with regard to the plume direction (related to wind direction), the sun direction, and the stack (or pile) height. The basics include that the sun needs to be at your back, you need to be looking perpendicular to the plume, and you should be between “3 stack heights” (that is, three times the height of wherever the emission is coming out or off of the source) and a quarter mile from the source. These guidelines can be very tough to follow, however, given potentially limited access to unobstructed views of the emissions source. Method 9 recommends that you monitor emissions in the morning or afternoon -- not midday -- and move if the wind suddenly changes direction. Again, this is easier said than done, especially if you are not directly on the property where emissions are occurring. The training for Method 9 includes a lecture component and field training where you practice estimating the opacity of plumes, training your eyes to discern smoke opacities to ~5% resolution.
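    To make those siting rules concrete, here's a minimal sketch in Python using the thresholds described above. The function and its inputs are our own illustration, not part of the official method:

    ```python
    QUARTER_MILE_FT = 1320  # a quarter mile, in feet

    def method9_siting_ok(distance_ft: float, stack_height_ft: float,
                          sun_at_back: bool, view_perpendicular: bool) -> bool:
        """Rough check of the Method 9 siting guidelines described above:
        between 3 stack heights and a quarter mile from the source,
        sun at your back, looking perpendicular to the plume."""
        return (3 * stack_height_ft <= distance_ft <= QUARTER_MILE_FT
                and sun_at_back
                and view_perpendicular)

    # Example: a 50 ft stack viewed from 400 ft away, sun behind you,
    # looking across the plume -- within the rough guidelines.
    print(method9_siting_ok(400, 50, True, True))  # True
    ```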

    Starting as early as 1859 (in the City of New Orleans vs. Lambert case), smoke opacity has been used to regulate air pollution. Today, states regulate plume opacity for point-source emissions (like those from smokestacks), and most regulate the opacity of fugitive emissions. A common opacity limit is 20%, which means that 20% of light is blocked: you can only see through the plume to 80% of the background behind it. In practice, 20% opacity is visible, but it can be hard to differentiate the bounds of the plume -- it’s really not thick smoke. Method 9 is used to measure opacity and enforce state emissions opacity regulations; if you are certified in Method 9 (which anyone can do), you can report violations and prompt an enforcement response.

    What are some of the limitations of the method?

    Method 9 can be very useful, but also has many limitations.

    1. First and foremost, Method 9 only allows you to assess visible emissions -- it provides no ability to ascertain the chemical composition of what is being emitted, and is not useful for most vapor emissions.

    2. Steam plumes are a significant complicating factor too, as steam is not subject to opacity rules, and it is often difficult to distinguish whether a plume contains steam.

    3. The physical restrictions of conducting Method 9 also limit its utility since it is often not possible to view plumes with the specific siting requirements mentioned above.

    4. A significant limitation of Method 9 is that there is no residual evidence of the visible emissions observed, which can limit an agency’s ability to enforce violations. It is recommended that people conducting Method 9 also take photographs of the site and the emissions to document what was observed. There is also a digital camera alternative to Method 9, which has its own limitations; it is discussed below.

    5. Another limitation is that observers need to be re-certified every 6 months, with each certification training/test fee running about $200. In some places, this fee is waived for citizens and covered by permit fees for industry, but in most places each person is responsible for paying their own certification fee, which can be exclusionary for people who can’t afford it.

    What about fugitive emissions?

    In some states, like Wisconsin, opacity limits apply to fugitive emissions too. Fugitive emissions are any emissions from a process that are not through a specified emissions point -- they are construction dust plumes, dust kicked up from unpaved roads, wind-blown dust coming off of sand piles, plumes emanating from blasting, etc. Assessing the opacity of fugitive emissions can be complicated since there often isn’t a distinct plume with a distinct direction, but as long as you are looking through the narrowest/shortest dimension of the emissions, and are following the proper siting requirements (i.e. sun at your back, appropriate distance from emissions point), Method 9 assessments are valid for fugitive emissions. Since fugitive emissions are more sporadic and variable than smokestack emissions, it is recommended that you become familiar with the characteristics of those fugitive emissions before starting your monitoring. It is useful to check out your state’s regulations before starting monitoring too, since some states, like Colorado, unfortunately have exemptions from opacity rules for fugitive emissions.

    Where can I learn about emissions opacity regulations in my state?

    Visible emissions opacity limits are included in each state’s air pollution regulations. Usually these regulations are searchable online in each state’s administrative codes on the state legislature website. Opacity standards are also included in the “State Implementation Plan” (SIP) which the state develops to detail how the state will achieve the National Ambient Air Quality Standards (NAAQS). Legislative websites and SIPs can both be somewhat onerous to navigate, so it may be most efficient to search for “opacity” on your state’s environmental agency (usually a DEP, DEQ, or DNR) website.

    What are some similar methods?

    There are several other methods recognized by EPA that are similar to Method 9.

    1. Alternative Method 82, also known as ASTM D7520, is the “Digital Camera Opacity Technique” (DCOT), which can be used in place of Method 9 when approved. Note that Alternative Method 82 is only approved to demonstrate compliance (or lack thereof) with federal opacity limits, not with opacity limits set by a state or municipality. Alternative Method 82 does have the advantage of producing a data record of visible emissions; however, it can be very difficult to actually conduct. First, the DCOT system, which includes a digital camera, a photo analysis software platform, and a results assessment and reporting component, needs to be certified, and this certification process is more arduous than that of Method 9. The DCOT operator has to complete a manufacturer-specified training course and follow all of the Method 9 siting requirements (plus a couple of additional limitations), and then the images also generally have to be sent to a third party for analysis. Also, as of today, there is still only one DCOT system that is commercially available and certified to conduct Alternative Method 82, and the software licenses can cost thousands of dollars per year. In addition to the cost of the DCOT system and analysis, it also takes longer, and cannot immediately identify opacity violations (it requires processing time). Therefore, while there are some definite advantages to Alternative Method 82 (notably the data record with photographs), there are currently considerable drawbacks. The company whose training I took, AeroMet, recommends that the EPA address these drawbacks to make the method more accessible and feasible for people to actually use.

    2. Methods 203a, 203b, and 203c are alternatives to Method 9 for slightly different types of opacity limitations: time-averaged, time-exception, and instantaneous limitations, respectively. Each follows the same general procedure, but in 203a the total time assessed can be 2-6 minutes (whereas Method 9 requires 6 minutes), 203b averages the amount of time that emissions are above the opacity limit, and 203c takes observations every 5 seconds for 1 minute (whereas Method 9 takes observations every 15 seconds for 6 minutes; see the sketch after this list). For Methods 203a-c to be acceptable for assessing compliance with air pollution regulations, that must be specified in the state implementation plan.

    3. Method 22 is somewhat similar to Method 9, but is used to assess the frequency of visible emissions, not their opacity. Method 22 is mostly used for fugitive emissions and gas flares. In Method 22, the observer uses two stopwatches, one to measure total time elapsed and one to measure the time when visible emissions are present, to determine the frequency and percentage of time that a source is visibly emitting (see the sketch below). For Method 22, the observer can be indoors or outdoors, and there are fewer siting requirements overall. If industries are subject to compliance assessed by Method 22, it will be stated in the SIP.
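    To make the timing differences concrete, here is a sketch of the arithmetic behind these methods, in Python. The reading counts follow the intervals described above; the function names and sample numbers are our own illustration:

    ```python
    def average_opacity(readings_pct):
        """Average a set of opacity readings, each in percent."""
        return sum(readings_pct) / len(readings_pct)

    def time_above_limit(readings_pct, limit_pct, interval_s):
        """Total seconds that readings exceeded the limit (203b-style)."""
        return sum(interval_s for r in readings_pct if r > limit_pct)

    def visible_fraction(total_s, visible_s):
        """Method 22: percent of observation time emissions were visible,
        from the two stopwatch readings."""
        return 100.0 * visible_s / total_s

    # Method 9: a reading every 15 s for 6 minutes -> 24 observations.
    method9_readings = [20, 25, 20, 15] * 6
    # Method 203c: a reading every 5 s for 1 minute -> 12 observations.
    method203c_readings = [20, 25, 20, 15] * 3

    print(average_opacity(method9_readings))           # 20.0 (% opacity)
    print(time_above_limit(method9_readings, 20, 15))  # 90 (s above a 20% limit)
    print(visible_fraction(1800, 90))                  # 5.0 (% of a 30 min watch)
    ```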



  • plume air-quality blog pm


    Enforcing Stormwater Permits with Google Street View along the Mystic River

    by mathew | almost 3 years ago | 1 | 4

    Compressed autos at Mystic River scrap yard, Everett, Massachusetts, 1974. Spencer Grant/Getty Images. CC-NC-SA

    In 2015 the New America Foundation asked @Shannon and me to write a chapter for their Drone Primer on the politics of mapping and surveillance. I worked in an example of positive citizen surveillance by the Conservation Law Foundation (CLF) that I’d heard about in a session at the 2015 Public Interest Environmental Law Conference. I’ve excerpted and adapted my writeup of the CLF case as a part of our ongoing Evidence Project series. If you know of similar cases please get in touch!

    Geo-tagged aerial and street-level imagery on the web can be a boon to both environmental lawyers and the small teams of regulators tasked by US states with enforcing the Clean Water Act. Flyovers and street patrols through industrial and residential districts can be conducted rapidly and virtually, looking for clues to where the runoff in rivers is coming from. By combining aerial and street-level photographs with searchable public permitting data, the 1972 Clean Water Act’s stormwater regulations can now be enforced more effectively than ever before (Alsentzer et al., 2015).

    State and federal environmental agencies often do not have the time or resources to adequately enforce permits under the National Pollutant Discharge Elimination System (NPDES), which regulates construction and industrial stormwater runoff, and roughly half of facilities violate their stormwater permits every year (Russell and Duhigg, 2009). Enforcement can be picked up by third parties, however, because NPDES permits are public. Plaintiff groups and legal teams conduct third-party enforcement through warnings and lawsuit filings. Legal settlements recoup the plaintiffs’ legal costs, and can also include fines whose funds are directed towards community-controlled Supplemental Environmental Projects that help improve environmental conditions in the violator’s watershed. The Conservation Law Foundation (CLF), a Boston-based policy and legal non-profit, operates in precisely this manner, recouping its costs through lawsuits and directing funds to Supplemental Environmental Projects in the Mystic River watershed.

    In 2010 a neighborhood group approached the CLF about a scrap metal facility on the Mystic River. Observable runoff demonstrated the facility had never built a stormwater system, and a quick US Environmental Protection Agency (EPA) NPDES permit search revealed that they had never applied for or received a permit. The facility was flying under the EPA’s enforcement radar, and so were four of the facility’s neighbors.

    Between 2010 and 2015, CLF’s environmental lawyers initiated 45 noncompliance cases by looking for industrial facilities along waterfronts in Google Street View, and then searching the EPA’s stormwater permit database for each facility’s address. Most complaints are resolved through negotiated settlement agreements, in which the facility owner or operator funds Supplemental Environmental Projects for river restoration, public education, and water quality monitoring that can catch other violators. Together, CLF and a coalition of partners, such as the Mystic River Watershed Association, are creating a steady stream of revenue for restoration, education, and engagement in the environmental health of one of America’s earliest industrial waterways.
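    The workflow itself -- spot a waterfront facility in Street View, then check whether it appears in the permit database -- is simple enough to sketch in code. The snippet below is illustrative only: it assumes a JSON facility-search endpoint in the spirit of EPA ECHO's web services, and the URL and parameter names are our guesses rather than a verified API, so check epa.gov for the real service before relying on it:

    ```python
    import requests

    # Assumed ECHO-style facility search endpoint -- verify before use.
    SEARCH_URL = "https://echodata.epa.gov/echo/echo_rest_services.get_facilities"

    def appears_in_permit_database(facility_name: str, state: str) -> bool:
        """Return True if a facility search turns up any permitted match."""
        params = {
            "output": "JSON",
            "p_fn": facility_name,  # facility name (assumed parameter)
            "p_st": state,          # two-letter state code (assumed parameter)
        }
        resp = requests.get(SEARCH_URL, params=params, timeout=30)
        resp.raise_for_status()
        results = resp.json().get("Results", {})
        return int(results.get("QueryRows", 0)) > 0

    # A facility visible on the waterfront in Street View, but absent from
    # the permit database, is a candidate for a closer look.
    print(appears_in_permit_database("Example Scrap Metal Co", "MA"))
    ```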

    Regardless of their effect, legal threats are stressful, often expensive, and can take years to resolve. Even when threatened polluters are acting in good faith to clean up their systems, the process of identifying and persuading companies to comply with environmental regulations can strain relationships in communities. Non-compliant small businesses on the Mystic River that have been in operation since before the Clean Water Act was passed in 1972 may never have been alerted to their obligations under the law. Their absence from the EPA database reflects mutual ignorance: bureaucrats unaware of the businesses, and businesses unaware of the bureaucracy. However, businesses bear the direct costs of installed equipment, staff time, and facility downtime; indirect costs to their professional reputation from delayed operations or identification as a polluter; and transactional costs of paying for legal assistance or court fees. Indirect and transactional costs are hidden punishments that can accrue regardless of guilt or readiness to comply.

    To combat the negative perceptions that can accrue from the use of legal threats, CLF proactively works to fit itself into a community-centered watershed management strategy. CLF and its partners run public education and outreach campaigns, and start by issuing warnings that aren’t court-filed (Alsentzer et al., 2015). Identifying and working with businesses operating in good faith is a tenet of community-based restoration efforts. By using courts as a last resort and participating in public processes where citizens can express the complexity of their landscape relationships, CLF and its partners are increasing participation in environmental decision-making and establishing the legitimacy of restoration and enforcement decisions.

    Regulations and permit databases can often be tough to put to work, but the CLF’s case was fairly straightforward: they simply searched for companies’ addresses in a publicly available database. We would love to hear of more groups using this approach or other simple modes of regulatory engagement.

    Excerpted and adapted from Mathew Lippincott with Shannon Dosemagen, “The Political Geography of Aerial Imaging,” pages 19-27 in Drones and Aerial Observation, New America Foundation, 2015.

    CC-NC-SA

    Sources and Further Reading:

    Alsentzer, Guy, Zak Griefen, and Jack Tuholske. 2015. “CWA Permitting & Impaired Waterways.” Panel session at the Public Interest Environmental Law Conference, University of Oregon.

    Conservation Law Foundation. 2014. “Coming Clean” newsletter, Winter 2014.

    Denison, D.C. 2012. “Conservation Law Foundation suing alleged polluters.” Boston Globe, May 10, 2012.

    Russell, Karl, and Charles Duhigg. 2009. “Clean Water Act Violations Are Neglected, at a Cost of Suffering.” The New York Times, September 12, 2009. Part of the Toxic Waters series.


  • evidence epa blog water


    What goes into choosing a topic name?

    by liz | almost 3 years ago | 2 | 3

    above: sketch of figuring out how to organize "air" into a research area, and which methods are part of the research area, and which activities would go on what grid...Photo by @nshapiro

    We've been having some fun discussions over the past couple months with people on each of the topical lists about what to name the new "top-level" pages where we're organizing. That means -- when posting activities, do they end up on /wiki/balloon-mapping or /wiki/aerial-photography? Do we use the older /wiki/spectrometer page, or the new one at /wiki/spectrometry? But we're hoping for even MOAR discussion!

    Let's think about:

    1. where and how these new pages will show up -- most likely on a dropdown menu and maybe eventually on the front page of publiclab.org,
    2. and, the timing -- we're prioritizing the creation of these "origin" pages amidst all the creation of activities and activity grids we've been working on and will continue to work on through Barnraising.

    So far we've created drafts of:

    Up next:

    When naming new pages, some things to consider are that names should be:

    Looking ahead, we have more naming to do! There are some mismatched names:

    • "dssk" vs. "desktop-spectrometry-kit-3-0"
    • "infragram" vs. "infrared" vs "multispectral-imaging"
    • "timelapse" vs. the broader" photo-monitoring"

    We'd really like to hear from a wide selection of voices about naming! Please pile on in the comments! Thank you!


  • blog with:warren with:cfastie with:nshapiro


    What makes a good activity?

    by warren | almost 3 years ago | 13 | 3

    In our continuing shift towards using the new Q&A feature and the new Activity grids as a framework for collaboration on PublicLab.org, we're encouraging people to post their work more in the spirit of Instructables.com -- "showing each other how to do something" rather than just telling people about something you've done. This shifts the emphasis from solely documenting what you've done, to helping others do it too. (image above from a Lego Technics kit)

    There are several reasons we like this. A how-to guide (what we're calling Activities) must have extremely thorough and easy-to-follow steps (and may need to be revised if people get stuck). Perhaps even more importantly, its success (we hope) can be measured by how many people are able to follow the steps successfully, which exercises and fuels the power of broad communities and open science.

    What's needed?

    While there are various types of activities for various purposes, all of them ought to set out some basic information to help people get started:

    • a description of the purpose of the activity
    • a list of materials needed
    • a clear description of your conditions (e.g. lighting, temperature, or other relevant factors)
    • a detailed sequence of steps to follow
    • a description of how to confirm you've followed the steps correctly
    • a hypothesis or expected outcome
    • a discussion of your results
    • a list of questions to explore next (unknowns, or followup activities)
    • a request for input (there's always room for improvement!)

    Speaking of room for improvement, can folks suggest other important parts of an activity? With an eye toward making it easy for anyone to write and post activities, and for others to replicate them, what's the minimum necessary?


    (IKEA Stonehenge. Justin Pollard, John Lloyd, and Stevyn Colgan designed an IKEA manual for Stonehenge, publishing it under the title HËNJ in the QI 'H' Annual)

    Drafts welcome

    We'd also like to suggest that people post things early -- to share ideas, solicit input, and acknowledge that most posted activities will go through some (if not many) revisions as people try them out and offer feedback. Could we even have a separate "Publish Draft" button so they're clearly marked as such, and people know they're encouraged to share early and often?

    Break it up!

    One important way to increase the chances that people will complete a replication of your activity, we think, is simply to write shorter activities -- perhaps breaking up a longer set of steps into several related modules. Instead of posting a long and complex activity, a few shorter ones -- each with a simple way to verify that the steps so far were correctly completed -- are much more accessible, and will tend to separate distinct possible causes of failure for easier troubleshooting.

    Distinct modular activities can be linked and referenced to create a larger activity that might span, for example, building and verifying a tool functions properly, tool calibration, and lab or field tests of various materials using the tool. Even if the final activity cannot be completed without the previous activities first, breaking them out into distinct activities that build on each other will help the onboarding process.

    (above, @cfastie shows how to swap a lens in a #mobius camera)

    Supporting activity authors

    Finally, beyond this overview, what more can we do to make it easy to write good activities? Some have suggested a kind of "assistance group" who could provide helpful tips and constructive critique to people posting on Public Lab. This sounds like a great idea, and potentially extra helpful to folks who are hesitant or unsure of what makes a good and thorough post.

    Would "activity templates" be useful, to the extent that they can be generalized?

    We're also, of course, posting some example Activities, such as this spectrometer calibration activity, which we hope will help set some conventions.

    Next steps

    We're also interested in how people could be introduced to other activities on a topic once they complete the current one -- maybe there's a "sequence" of activities that grow in complexity? Or we could display a mini activity grid of "related activities" at the bottom of each one?

    Finally, we're trying to figure out how people can request an activity for something they want to learn to do, but for which there is not yet an activity posted. This'll be especially important as we're starting out, since we have very few complete activities posted -- but it'll also be a great starting place for people hoping to share their knowledge and expertise. Our initial stab at this is to list "limitations and goals" for a given kit, clearly explaining the problem we'd like to solve. This is actually a list of questions using our new questions system -- and we imagine people might post an activity, then link to it as a proposed answer.

    We need your input!

    This is all quite new, and we'd love to hear other ideas for how this could work. And of course, if you're interested in giving it a try and writing an Activity, please do! Activity grids are going up on many wiki pages across the site, so if you have questions about where and how to post, please leave them in the comments below. Thanks!


  • collaboration community leaffest blog