Tuesday, August 1, 2023

What is Commercially-Available Information (CAI)?

I'm working on some thoughts about the ODNI report on Federal Agencies purchasing Commercially-Available Information (CAI for short), and I thought that it would be worth fleshing out some of the basics about what this information actually is.

So then, what is CAI?

The ODNI report itself says this about CAI:

As the acronym indicates, and as we use it in this report, “CAI” is information that is available commercially, through a commercial transaction with another party. The acquisition may occur on a one-time or subscription basis, and may involve the IC directly ingesting the CAI or obtaining a license agreement that affords a continuing right of access. CAI typically is acquired for a fee, but as we use the term it also includes information offered at no cost if it is the type of information that is normally offered for sale – e.g., a free trial offering of CAI.

As we use the term in this report, CAI does not include information that is stolen or otherwise misappropriated and then acquired from a black market or otherwise via traditional HUMINT acquisition methods (e.g., espionage). Nor does it include information obtained through traditional SIGINT acquisition methods (e.g., wiretapping) that does not involve a commercial transaction at all. As such, it does not necessarily include all information acquired from commercial entities, such as information acquired via lawful process (e.g., a search warrant or subpoena) served on a communications service provider or financial institution.

Cutting through all the government jargon, "CAI" is information that is:

  • obtained through legal means
  • typically acquired for a fee from another party
  • available commercially
  • not obtained via "traditional" human intelligence (HUMINT) collection methods (such as espionage or interrogation)
  • not obtained via "traditional" signals intelligence (SIGINT) collection methods (such as wiretapping)

Generally, the idea is that if anyone can legally buy or license the information on the open market, it counts as Commercially-Available Information.

Next up will be my thoughts on the Government using CAI...

Tuesday, April 26, 2022

Staying Relevant in Cyber and Information Security: PODCASTS!

    Staying up to date is absolutely essential to remaining relevant in your field.  Here, I've curated most of the podcasts I listen to, arranged in a few different categories.



Daily - Morning Listens (every weekday morning):

As They Are Published (mostly time-sensitive content):
  • Paul's Security Weekly
    • ~3 hours
    • Phenomenally insightful and funny cyber security show "for security professionals, by security professionals"
    • I always get something useful from this show
    • http://securityweekly.com/
  • Risky Business
    • ~60 minutes
    • Weekly cyber security news and analysis
    • Very insightful, regularly voted one of the top industry podcasts
    • https://risky.biz/
  • Security Weekly News
    • Another podcast in the Paul's Security Weekly network (several of these are worth the listen if you've got the time and interest)
    • ~30 min, twice weekly
    • News and insight, then a recap of the rest of the network's offerings [i.e., Paul's Security Weekly, Enterprise Security Weekly, Application Security Weekly, etc.]
    • http://hacknaked.tv/
  • Talkin' About Infosec News
  • Defensive Security Podcast

As They Become Available (not necessarily time-sensitive content):
  • Darknet Diaries
  • Malicious Life
    • ~30-60 min
    • Similar to Darknet Diaries (but with a specific underwriter), with a lot more episodes
    • https://malicious.life/
  • Cyber Security Interviews
  • Command Line Heroes
  • The Idealcast with Gene Kim
    • ~90-120 min
    • DevOps-focused podcast from the author of "The Unicorn Project", with insights that can contribute to the improvement of most projects
    • https://itrevolution.com/
  • Caveat
  • Down the Security Rabbithole
  • Click Here
    • ~20-30 min
    • Former NPR journalist focuses specifically on cyber and intelligence, providing industry insight with geopolitical context
    • From Recorded Future
    • https://therecord.media/podcast/
  • Programming Throwdown
    • ~60-90 min
    • Two computer scientists describe and discuss programming languages and technologies
    • Surprisingly interesting, and a good primer on languages and technologies you may not be familiar with
    • http://www.programmingthrowdown.com/


Other Podcasts with Topical Interest:


    It's a long list.  But there's always something to learn.

    I don't necessarily listen to every single episode of each of these (except the daily ones), and I generally listen at accelerated speed (about 1.2-1.5x).  I listen to several podcasts that inevitably cover the same thing, because there's usually some decent insight to be gained by listening to several different opinions.

    It has been a tremendous source of pride to be able to provide current, relevant information to my leadership, along with insight into what the geopolitical effects might be.

    Happy Listening!!

Thursday, April 14, 2022

Staying Relevant in Cyber & Information Security

    Hi folks!  I know it's been a while since I posted, but there's been a lot going on in the world and my life.  I am hoping to post more frequently in the future, but some words from you can help motivate me!

    Anyway, here is today's post...


    Staying up to date in a field like cyber & information security can be as difficult as it is important.  Being effective in a security professional role requires that you maintain some level of awareness of the landscape in which you operate.  With all the different sources available, choosing some that are worthwhile can be a daunting task.

    This is a presentation I gave to several colleagues last year.  Some of the newer folks were struggling with strategies to keep up with the ever-evolving digital security landscape, and I put this together as an example.  

    I hope it may help anyone who is struggling with the problem of staying relevant.

    As always, comment below with your thoughts and additional ideas to share!

Thursday, April 1, 2021

Creating a Scheduled SystemD Service

This post was generated from a document I wrote for another team to assist them in creating a service that could be scheduled and logged by SystemD.

SystemD Unit Files

You'll put these files in "/usr/lib/systemd/system/".
The .service file will look like this:
[Unit]
Description=Execute My Script

[Service]
User=admin
Group=admin
ExecStart=/srv/scripts/myScript.sh

[Install]
WantedBy=default.target
The name of the .service file is arbitrary. If you call it "myService.service", then you can invoke it like this:
$ sudo systemctl start myService
Make sure the "ExecStart=" value points to your script in the location where you actually put it.
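The script itself can be anything the "User=" account is allowed to execute; just make sure it's executable (e.g., "chmod +x /srv/scripts/myScript.sh"). The content below is purely a hypothetical placeholder for testing the plumbing, not the script from my original write-up:
#!/bin/bash
# Hypothetical placeholder - replace with your actual task
echo "myScript.sh ran at $(date -u)"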
The .timer file looks like this:
[Unit] 
Description=Execute myService Daily at 1215 UTC

[Timer]
OnCalendar=*-*-* 12:15:00
Unit=myService.service

[Install]
WantedBy=default.target
Again, the name is arbitrary. I named mine "myService.timer", which made it simple to pair with my .service file.
The time listed is 1215 UTC because my server's system time was set to UTC. Make sure you list the name of your .service file in the "Unit=" statement.
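If your systemd is reasonably current (the "calendar" verb was added around version 235, if memory serves), you can sanity-check an "OnCalendar=" expression and see when it will next fire:
$ systemd-analyze calendar "*-*-* 12:15:00"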
You would then enable the timer like this:
$ sudo systemctl enable myService.timer
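Depending on how you created the files, you may also need to tell SystemD to re-read its unit files, and starting the timer (in addition to enabling it) makes it active immediately instead of only after the next boot. Either way, it's worth confirming the timer is actually scheduled and that the script's output is landing in the journal; something like this should do it:
$ sudo systemctl daemon-reload
$ sudo systemctl start myService.timer
$ systemctl list-timers myService.timer
$ journalctl -u myService.service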

SELinux

Chances are that the stuff above will all be thwarted by SELinux, for better or worse. It's going to take a few commands to get that squared away.
In my example, the scripts are located in "/srv/scripts", and the SystemD unit files are named "myService.service" and "myService.timer". Adjust these to match your own system.
For the scripts:
# semanage fcontext -a -t bin_t "/srv/scripts(/.*)?"

# restorecon -R -v /srv/scripts
For the SystemD Unit Files:
# semanage fcontext -a -t systemd_unit_file_t /usr/lib/systemd/system/myService.*

# restorecon -R -v /usr/lib/systemd/system
Obviously, these are being run as the root user. The "semanage" command adds a rule mapping each set of files to the appropriate SELinux type, and the "restorecon" command applies those labels to the files on disk.
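If you want to confirm the new labels actually took effect, list the security contexts and look for "bin_t" and "systemd_unit_file_t" respectively. And if something is still being blocked, the audit log is the place to look (assuming auditd is installed and running):
# ls -Z /srv/scripts
# ls -Z /usr/lib/systemd/system/myService.*
# ausearch -m avc -ts recent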
 
 
Edited:  10 April 2021

Tuesday, January 26, 2021

InfoSec Professionals vs The World

As InfoSec Professionals, we fight for Freedom and Justice!

We beat back the Forces of Evil that would do our Organizations harm!  And we fight for the End User!

We are Superheroes!


I've got news for you, folks...

The End User couldn't care less.  They just want to do their jobs without interruption from you and your "Cyber crap".

This is one of the most sobering things a newcomer to the industry must face:  the lack of recognition and gratitude from the people we help protect can be off-putting, especially for those who aren't used to it (or who, like some of us, thrive on it).

Indulge me in a bit of a backtrack in my own career... 


Tuesday, October 1, 2019

College vs Experience

I wanted to take the opportunity to address a topic I've wrestled with often over the years, and it's well worth a post.

I've posted about this before (not on this particular blog), and I've said (circa 2010):
I fully believe that college degrees just mean that the person knew how to take the test at that time, and doesn't mean they know anything now.  Experience, I believed, counts for far more than a mere degree.
I stand by that claim.

Now, before I'm crucified by the masses for that bit of self-importance, allow me to add a caveat.  I would revise that statement to add this: 
"If you want a job or career that requires some sort of specialized or expert knowledge, you should get a college degree."
The difference in the last nine or so years is that I am quickly closing in on retirement and the prospect of redirecting my career (though only slightly).  It will require a different mentality, to be sure.  It will also require a résumé.
"Ay, there's the rub..."
I want a job that values my vast and varied experience, but I also want a job I can actually be interviewed for, which means my résumé has to make it past the initial selection.  So, it seems that the initial cut is determined by how educated I look on paper.

How I look on paper...

My postulation is this:  Until a person is known independently by name and/or reputation, their "paper face" is just as important as their true qualifications and potential.

To that end, a degree is important to prove multiple things...at least, on paper.

A degree shows that you have the aptitude to pass a program of study at the college level.  A degree shows that you have the drive to follow through with a course of action.  A degree shows a minimum level of proficiency.  Having a degree is a definitive discriminator.

A degree doesn't have to mean you know more than another person in the same field, but when you are starting out, it is extremely important to show that you have more "extra" than the next person.  If you don't have the experience in the field, or a name that respected people may recognize as someone worth a damn, then you need some other way to set yourself apart.

Get a degree.  If nothing else, it will open up more doors than it closes.

Sunday, January 27, 2019

Splunk SPL Query to Track Login Locations




I love programming and programming logic, so the Splunk Search Processing Language (shortened to SPL) was a reasonable jump when I began learning it in the course of my daily work as a Threat Hunter.  Recently, I started tackling several problems regarding how to tease some user behavioral analytic data from the varied logs and other data ingested by Splunk.

This particular query should show login times, IP addresses, and login location information (based on geolocation of the source IP address) for a particular user.  Presumably, a login would be suspect if one of the following criteria were met:
  • A login from the same user occurs from a different IP address/location simultaneously
  • A login from the same user occurs from a different IP address/location shortly after a previous login, at an improbable distance away
  • A login from the same user occurs from a different IP address/location and there is external evidence that it was not that user (i.e., they had logged off for the evening, they did not travel to the location indicated, etc.)

I share this with you all, in the hopes it helps in finding the Bad in your enterprise.

There are a couple of requirements in order to make this query work in its entirety.  Number one is (somewhat obviously) data that encompasses some sort of user login information, such as Windows logs, Syslog data, Active Directory logs, IPA or RADIUS logs, web authentication data, etc.  Perhaps you have all of that, which can help make this work even better by ensuring all avenues of authentication have been covered.

Next, you'll need the Splunk Common Information Model (CIM) app, which can be found here:  https://splunkbase.splunk.com/app/1621/.  This app ensures that, no matter what sort of data you have, the same kind of field can be addressed the same way (i.e., "src_ip" instead of "srcip" or "source_ip").  Essentially, it lets you treat all of your like data as something akin to a relational database, so that no matter where the data came from or what sourcetype it is, it can be addressed in one way instead of through a long list of OR statements.



Here's the query: 


| from datamodel:"Authentication"
| search user="*USERNAME*"
| `get_asset(src)`
| iplocation src
| sort 0 + _time
| eval session_lat=if(isnull(src_lat), lat, src_lat)
| eval session_lon=if(isnull(src_long), lon, src_long)
| eval session_city=if(isnull(src_city), City, src_city)
| where isnotnull(session_lat) and isnotnull(session_lon)
| sort 0 + _time
| streamstats current=t window=2 earliest(session_lat) as prev_lat, earliest(session_lon) as prev_lon, earliest(session_city) as prev_city, earliest(_time) as prev_time, earliest(src) as prev_src, latest(user_bunit) as user_bunit by user
| where (src!=prev_src)
| `globedistance(session_lat,session_lon,prev_lat,prev_lon,kilometers)`
| table _time src distance session_city session_lat session_lon



Below, the query is broken out further: each statement is followed by a description of what it does.



  • | from datamodel:"Authentication"
    • Pull data from the "Authentication" Data Model (from Splunk_SA_CIM)
  • | search user="*USERNAME*"
    • Match the username (replace USERNAME with your username)
  • | `get_asset(src)`
    • Invoke the get_asset macro to pull the source IP from the data, regardless of sourcetype
  • | iplocation src
    • Geolocate the source IP
  • | sort 0 + _time
    • Sort events by time in ascending order
  • | eval session_lat=if(isnull(src_lat), lat, src_lat)
    • Get the Latitude from source events
  • | eval session_lon=if(isnull(src_long), lon, src_long)
    • Get the Longitude from source events
  • | eval session_city=if(isnull(src_city), City, src_city)
    • Get the City from source events (OPTIONAL)
  • | where isnotnull(session_lat) and isnotnull(session_lon)
    • Use only events that have a Lat/Long
  • | sort 0 + _time
    • Sort events by time in ascending order
  • | streamstats current=t window=2 earliest(session_lat) as prev_lat, earliest(session_lon) as prev_lon, earliest(session_city) as prev_city, earliest(_time) as prev_time, earliest(src) as prev_src, latest(user_bunit) as user_bunit by user
    • Pull out event data from two adjacent events for comparison
  • | where (src!=prev_src)
    • Make sure adjacent events don't have the same IP
  • | `globedistance(session_lat,session_lon,prev_lat,prev_lon,kilometers)`
    • Invoke the globedistance macro to get the distance between the geolocated IPs
  • | table _time src distance session_city session_lat session_lon
    • Display a table with Time, IP, Distance from the last different IP, IP City, Lat, and Long
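If you wanted to take this one step further and surface only the improbable-travel cases, you could compute an implied travel speed from the distance and time between adjacent logins.  The lines below are only a rough sketch of one way to do it, inserted just before the final "table" statement; the "hours_between" and "speed_kmh" field names and the 900 km/h threshold are my own arbitrary choices, not part of the original query:

  • | eval hours_between=(_time - prev_time) / 3600
    • Hours elapsed between this login and the previous (different-IP) login, using the prev_time field created by streamstats
  • | eval speed_kmh=if(hours_between > 0, distance / hours_between, null())
    • Implied travel speed in km/h, using the distance produced by the globedistance macro (null if the logins are simultaneous)
  • | where isnull(speed_kmh) OR speed_kmh > 900
    • Keep simultaneous logins from different IPs, plus any pair that would require faster-than-airliner travel (tune the threshold to your environment)

You would probably also want to add speed_kmh to the final table statement so the computed speed shows up in the results.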


That query took a lot of time to research, figure out, and get right, so I'm pretty proud of it.  I'd like to thank the good folks over at GoSplunk (https://www.gosplunk.com) for their service as a repository for useful Splunk queries, and (of course) Google for...well...allowing me to google all the obscure InfoSec items I do.


If you see something wrong, or something that could be executed better, please drop me a line and let me know.

References:

https://www.splunk.com
https://docs.splunk.com/Splexicon
https://splunkbase.splunk.com/app/1621/
https://www.gosplunk.com
https://www.google.com
