What You Will Achieve
By the end of this playbook you will have:
- A curl one-liner to inspect your plan’s batch limits (`GET /bulk/check`)
- A `POST /bulk/check` workflow with JSON `entities` and optional `enrichment`
- A Python chunking script that respects the max entities per request and merges results
- Clear error handling when an individual entity fails validation
Prerequisites
| Requirement | Details |
| -------------------- | -------------------------------------------------------------------------- |
| API key + secret | Dashboard → Account → API; header X-API-KEY = Base64(apiKey:apiSecret) |
| Plan with API access | Bulk max batch size and monthly quotas depend on your subscription |
| Input list | Domains, IPv4/IPv6, or http(s):// URLs (one per line or CSV column) |
Official reference: API documentation and Bulk API overview.
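The `X-API-KEY` header from the table above is simply the Base64 encoding of `apiKey:apiSecret`. A minimal Python sketch (the function name and sample credentials are illustrative):

```python
import base64


def x_api_key(api_key: str, api_secret: str) -> str:
    """Build the X-API-KEY header value: Base64(apiKey:apiSecret)."""
    return base64.b64encode(f"{api_key}:{api_secret}".encode()).decode()


print(x_api_key("k", "s"))  # → azpz
```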
Step 1: Read your batch limits
Always query limits before building automation so you chunk correctly.
```shell
export B64=$(printf '%s' "${API_KEY}:${API_SECRET}" | base64)

curl -sS "https://api.ismalicious.com/bulk/check" \
  -H "X-API-KEY: ${B64}"
```
The JSON includes `limits.maxEntitiesPerRequest` for your authenticated plan (plus plan metadata under `limits`). Treat that number as the hard cap when chunking POST bodies.
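Parse the cap defensively so automation still works if the field is missing (the helper name and fallback value are illustrative; the `limits.maxEntitiesPerRequest` path follows the response described above):

```python
import json


def batch_cap(limits_body: str, fallback: int = 100) -> int:
    """Pull maxEntitiesPerRequest out of a GET /bulk/check response body,
    falling back to a conservative default if the field is absent."""
    data = json.loads(limits_body)
    cap = data.get("limits", {}).get("maxEntitiesPerRequest")
    return int(cap) if cap else fallback


print(batch_cap('{"limits": {"maxEntitiesPerRequest": 50}}'))  # → 50
print(batch_cap('{}'))                                         # → 100
```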
Step 2: Run a single batch POST
Send mixed entity types in one body. For bulk, the API accepts an `enrichment` of `basic` or `standard` (default `standard`).
```shell
curl -sS -X POST "https://api.ismalicious.com/bulk/check" \
  -H "X-API-KEY: ${B64}" \
  -H "Content-Type: application/json" \
  -d '{
    "entities": ["example.com", "8.8.8.8", "https://example.com/path"],
    "enrichment": "standard"
  }'
```
Each result includes `entity`, `type` (`domain` | `ip` | `url`), `isMalicious`, `confidence`, `sources`, `categories`, and an optional `error`.
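A small triage helper over those result fields can separate hits, clean entities, and per-entity failures (a sketch; the field names follow the response shape above, and the sample data is invented):

```python
def triage(results: list[dict]):
    """Partition bulk results into malicious, clean, and errored buckets."""
    malicious, clean, errored = [], [], []
    for r in results:
        if r.get("error"):
            errored.append(r)
        elif r.get("isMalicious"):
            malicious.append(r)
        else:
            clean.append(r)
    return malicious, clean, errored


sample = [
    {"entity": "bad.example", "isMalicious": True},
    {"entity": "example.com", "isMalicious": False},
    {"entity": "not-a-host", "error": "invalid entity"},
]
bad, ok, failed = triage(sample)
print(len(bad), len(ok), len(failed))  # → 1 1 1
```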
Step 3: Python — chunk a file and write CSV
Save IOCs one per line in `iocs.txt`, then:
```python
import base64
import csv
import json
import os
import time
import urllib.error
import urllib.request

API_KEY = os.environ["ISMALICIOUS_API_KEY"]
API_SECRET = os.environ["ISMALICIOUS_API_SECRET"]
BATCH_CAP = int(os.environ.get("ISMALICIOUS_BULK_CAP", "100"))  # upper bound; real cap from API


def auth_header() -> str:
    raw = f"{API_KEY}:{API_SECRET}".encode()
    return base64.b64encode(raw).decode()


def get_limits():
    req = urllib.request.Request(
        "https://api.ismalicious.com/bulk/check",
        headers={"X-API-KEY": auth_header()},
        method="GET",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode())


def bulk_post(entities: list[str], enrichment: str = "standard"):
    body = json.dumps({"entities": entities, "enrichment": enrichment}).encode()
    req = urllib.request.Request(
        "https://api.ismalicious.com/bulk/check",
        data=body,
        headers={
            "X-API-KEY": auth_header(),
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read().decode())


def main():
    limits = get_limits()
    cap = limits.get("limits", {}).get("maxEntitiesPerRequest", BATCH_CAP)
    max_batch = min(BATCH_CAP, int(cap))

    with open("iocs.txt") as f:
        entities = [line.strip() for line in f if line.strip()]

    rows = []
    for i in range(0, len(entities), max_batch):
        chunk = entities[i : i + max_batch]
        try:
            data = bulk_post(chunk)
        except urllib.error.HTTPError as e:
            print(e.read().decode())
            raise
        for r in data.get("results", []):
            rows.append(
                {
                    "entity": r.get("entity"),
                    "type": r.get("type"),
                    "isMalicious": r.get("isMalicious"),
                    "confidence": r.get("confidence"),
                    "sources": r.get("sources"),
                    "categories": ";".join(r.get("categories") or []),
                    "error": r.get("error", ""),
                }
            )
        time.sleep(0.2)  # polite spacing between chunks

    with open("bulk_results.csv", "w", newline="") as out:
        w = csv.DictWriter(out, fieldnames=list(rows[0].keys()) if rows else [])
        if rows:
            w.writeheader()
            w.writerows(rows)


if __name__ == "__main__":
    main()
```
Set `ISMALICIOUS_BULK_CAP` from Step 1 so you never exceed the server-enforced maximum.
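The chunking pattern in the script can be factored into a reusable generator if you prefer (a sketch, not part of the API):

```python
def chunks(seq, size):
    """Yield consecutive slices of at most `size` items from seq."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]


print([list(c) for c in chunks(["a", "b", "c", "d", "e"], 2)])
# → [['a', 'b'], ['c', 'd'], ['e']]
```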
Step 4: Operational tips
- 429 responses: Back off exponentially; monthly and burst limits apply across API usage.
- Pre-filter: Dedupe and strip whitespace before POST to save quota.
- Full enrichment: For deep context (risk score, MITRE, timelines), use single-entity `GET /check` on the highest-priority rows after bulk triage.
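For the 429 tip above, a backoff wrapper around any request callable keeps the retry logic in one place (a sketch; the retry count and base delay are illustrative, not API-mandated):

```python
import time
import urllib.error


def with_backoff(fn, retries: int = 5, base: float = 1.0):
    """Call fn(), retrying on HTTP 429 with exponential backoff.

    Non-429 errors, and a 429 on the final attempt, are re-raised.
    """
    for attempt in range(retries):
        try:
            return fn()
        except urllib.error.HTTPError as e:
            if e.code != 429 or attempt == retries - 1:
                raise
            time.sleep(base * (2 ** attempt))
```

In the Step 3 script this would wrap the POST, e.g. `with_backoff(lambda: bulk_post(chunk))`.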
Related reading
- API Playground — experiment without writing scripts.
- Streaming API (overview) — progressive enrichment patterns for UIs and long-running checks.